Class: AskModel

Defined in: index.d.ts:229

A free-form chat-completions LLM (optionally vision-capable): the JS analogue of the `ask` primitive.
Given a prompt, optional accessibility/structural text, and zero or more images, `AskModel.ask` returns the model's response as a plain string. The wire format is OpenAI-compatible chat completions, so any provider config that points at such an endpoint works.
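Typical usage follows directly from the description above: obtain a model and call `ask`. The sketch below is illustrative only; the real `AskModel` would be imported from the package (import path not documented here), so a minimal stub stands in to keep the snippet self-contained.

```typescript
// Hypothetical usage sketch. In a real project you would import AskModel
// from the package's entry point; a stub stands in here so this runs alone.
class AskModel {
  static default(): AskModel {
    // Real implementation: first LLM model from the first LLM-capable
    // provider whose credentials are available.
    return new AskModel();
  }
  get name(): string {
    // Real implementation: the wire-level model identifier sent in requests.
    return "stub-model";
  }
  ask(prompt: string, text?: string | null): string {
    // Real implementation: an OpenAI-compatible chat-completions round trip.
    return `stub answer to: ${prompt}`;
  }
}

const model = AskModel.default();
console.log(model.name); // the wire-level model identifier
const answer = model.ask("Summarize the visible window in one sentence.");
console.log(answer);
```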
Constructors
Constructor
new AskModel(): AskModel
Returns
AskModel
Accessors
name
Get Signature
get name(): string
Defined in: index.d.ts:250
The wire-level model identifier sent in the request body.
Returns
string
Methods
ask()
ask(prompt, text?, images?): string

Defined in: index.d.ts:264

Ask the model a question, optionally grounded in accessibility text and/or images.
- `prompt`: the question or task to answer.
- `text`: optional accessibility-tree (or any) text included as structural context. Pass `null` or omit to skip.
- `images`: zero or more images to attach. Each is encoded as a base64 data URL and sent as an `image_url` chat content part. Pass `null` or omit for none.
Parameters
prompt
string
text?
string | null
images?
Image[] | null
Returns
string
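The three call shapes described above (prompt only, grounded in text, with images) can be sketched as follows. The stub `Image` and `AskModel` classes below are stand-ins so the snippet runs on its own; real `Image` values would come from wherever the library produces them (e.g. a screenshot).

```typescript
// Self-contained sketch: stubs stand in for the real Image and AskModel.
class Image {} // real Image values come from the library, not this stub

class AskModel {
  ask(prompt: string, text?: string | null, images?: Image[] | null): string {
    // Real implementation encodes each image as a base64 data URL and
    // sends it as an `image_url` chat content part.
    const parts = images?.length ?? 0;
    return `stub answer (${parts} image part(s)) to: ${prompt}`;
  }
}

const model = new AskModel();

// Prompt only:
const a1 = model.ask("What is 2 + 2?");

// Grounded in accessibility text, no images:
const a2 = model.ask("Which button submits the form?", "button 'Submit' (enabled)", null);

// Vision-capable call with one attached image:
const a3 = model.ask("Describe this screenshot.", null, [new Image()]);
console.log(a1, a2, a3);
```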
availableAliases()
static availableAliases(): string[]

Defined in: index.d.ts:248

Every LLM model alias accepted by `byAlias` on this machine, deduplicated and sorted alphabetically. Use to discover what aliases the loaded config (bundled defaults plus any user provider files) advertises.
Returns
string[]
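A short discovery sketch, using a stub in place of the real class; the alias strings below are placeholder data, with `"openrouter_gpt_4o_mini"` taken from the `byAlias` example elsewhere on this page.

```typescript
// Self-contained sketch: the real static method returns the deduplicated,
// alphabetically sorted aliases advertised by the loaded config.
class AskModel {
  static availableAliases(): string[] {
    return ["anthropic_example_alias", "openrouter_gpt_4o_mini"]; // stub data
  }
}

// Discover what the loaded config advertises before resolving an alias.
const aliases = AskModel.availableAliases();
for (const alias of aliases) {
  console.log(alias);
}
```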
byAlias()
static byAlias(alias): AskModel

Defined in: index.d.ts:241

Resolve a model alias against the loaded config (e.g. "openrouter_gpt_4o_mini" from the bundled openrouter provider, or any alias declared by a user provider). Throws if the alias is unknown.
Parameters
alias
string
Returns
AskModel
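Because unknown aliases are reported by throwing, speculative lookups belong in a `try`/`catch`. A stub-based sketch (the error message is an assumption; only the throwing behavior is documented):

```typescript
// Self-contained sketch: byAlias resolves a known alias, throws otherwise.
class AskModel {
  static byAlias(alias: string): AskModel {
    if (alias !== "openrouter_gpt_4o_mini") {
      throw new Error(`unknown model alias: ${alias}`); // stub error message
    }
    return new AskModel();
  }
}

// Known alias resolves to a model:
const model = AskModel.byAlias("openrouter_gpt_4o_mini");

// Unknown alias throws, so guard speculative lookups:
let resolved = true;
try {
  AskModel.byAlias("no_such_alias");
} catch {
  resolved = false;
}
console.log(resolved); // false
```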
default()
static default(): AskModel

Defined in: index.d.ts:235

First LLM model advertised by the first LLM-capable provider in the loaded configuration whose credentials are currently available. Throws if no provider in the loaded config advertises an LLM service.
Returns
AskModel
