LMS. - LM-Studio Commands


The LMS.* commands provide integration with LM Studio, allowing you to run local Large Language Models (LLMs) on your own hardware without relying on cloud APIs.

• Run local LLMs (Llama, Mistral, Phi, etc.) without an internet connection
• Full control over model parameters (temperature, tokens, etc.)
• Support for system prompts and conversation context
• JSON mode for structured output
• Model management (load, unload, list available models)
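LM Studio serves local models through an OpenAI-compatible REST API, so the JSON-mode feature listed above corresponds to the "response_format" field of a chat-completions request. The sketch below shows such a request payload in Python; the model name is a placeholder and the field names follow the OpenAI chat API, which LM Studio's local server mirrors — this is an illustration, not part of the LMS.* command set itself.

```python
import json

# Hypothetical JSON-mode request payload for LM Studio's OpenAI-compatible
# /v1/chat/completions endpoint. "response_format" asks the server to
# constrain the model's reply to valid JSON. The model name is a placeholder.
payload = {
    "model": "llama-3.1-8b",
    "messages": [
        {"role": "system", "content": "Reply only with JSON."},
        {"role": "user", "content": "List three local LLMs as a JSON array."},
    ],
    "response_format": {"type": "json_object"},
}

body = json.dumps(payload)  # this body would be POSTed to the local server
```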

Available Commands

Command                      Description
LMS.ask                      See full documentation
LMS.Ask-MCP                  See full documentation
LMS.AskEx                    See full documentation
LMS.ExtractBetween           See full documentation
LMS.ExtractCleanAnswer       See full documentation
LMS.ExtractCodeBlock         See full documentation
LMS.ExtractXmlTag            See full documentation
LMS.generateuniquefilename   See full documentation

Quick Start Example

; Check if LM Studio server is running
LMS.Ping
JNO.ServerNotRunning

; Load a model
LMS.SelectModel|llama-3.1-8b

; Set generation parameters
LMS.SetTemp|0.7
LMS.SetMaxTokens|1024

; Send a prompt
LMS.Ask|Explain quantum computing in simple terms|response
MSGBOX response
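For comparison, the same flow can be sketched against LM Studio's OpenAI-compatible REST server directly. This Python example only builds the request that corresponds to the script above; the default port 1234, the model name, and the helper function are assumptions for illustration, and actually sending the request requires the LM Studio server to be running.

```python
import json

# Default address of LM Studio's local OpenAI-compatible server (an
# assumption; adjust to your LM Studio settings).
LMS_URL = "http://localhost:1234/v1/chat/completions"

def build_chat_request(prompt, model="llama-3.1-8b",
                       temperature=0.7, max_tokens=1024):
    """Build a chat-completions payload mirroring the script:
    LMS.SelectModel, LMS.SetTemp, LMS.SetMaxTokens, LMS.Ask."""
    return {
        "model": model,              # LMS.SelectModel|llama-3.1-8b
        "temperature": temperature,  # LMS.SetTemp|0.7
        "max_tokens": max_tokens,    # LMS.SetMaxTokens|1024
        "messages": [{"role": "user", "content": prompt}],  # LMS.Ask|...
    }

payload = build_chat_request("Explain quantum computing in simple terms")
body = json.dumps(payload)
# POSTing `body` to LMS_URL (e.g. with urllib.request) would return the
# model's reply, which LMS.Ask delivers in the `response` variable.
```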

See Also

AI - Artificial Intelligence Commands > AIG. - Google AI