TTMSFNCCloudAI
Usage
A component that provides access to several LLM AI services. The context can be sent to the service with an optional system and/or assistant role, and the service returns its response. Currently, OpenAI, Grok, Gemini, Claude, Perplexity, Mistral, DeepSeek and Ollama are supported.
Authorization information
The LLM AI service API keys are set via TTMSFNCCloudAI.APIKeys.
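For example, a key can be assigned before the first request. Note that the per-service sub-property names used below (`OpenAI`, `Claude`) are assumptions for illustration; check the TTMSFNCCloudAIAPIKeys class declaration for the exact names.

```delphi
// Sketch: assigning API keys at runtime. The sub-property names are
// illustrative - verify them against the TTMSFNCCloudAIAPIKeys class.
TMSFNCCloudAI1.APIKeys.OpenAI := 'sk-...';
TMSFNCCloudAI1.APIKeys.Claude := 'sk-ant-...';
```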
Published Properties
Property name | Description |
---|---|
APIKeys | Class holding the API keys for the different supported AI services. |
Busy | A boolean property that is true while the REST API request to the service is running. |
Context | A TStringList holding the multiline context (prompt) to which the LLM should respond. |
Logging | When true, logging of all REST API calls to the cloud LLM service is enabled. |
LogFileName | Sets the name of the file to which the REST API calls to the cloud LLM service are logged. |
Service | Selects the LLM AI service to use: aiOpenAI, aiGemini, aiGrok, aiClaude, aiPerplexity, aiOllama, aiMistral or aiDeepSeek. |
Settings | Class holding additional settings for the different AI services. This includes the model name & temperature for each service and, for the Ollama service, the host & port of the Ollama server to use. |
SystemRole | Sets the system role prompt part as TStringList. |
Tools | Collection holding all function calling tools provided by the class. |
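Using the published properties above, a minimal request might be configured as follows (the system-role text and log file name are examples):

```delphi
// Minimal configuration sketch: select a service, provide an optional
// system role and the context, then execute the request.
TMSFNCCloudAI1.Service := aiOpenAI;
TMSFNCCloudAI1.SystemRole.Text := 'You are a concise assistant.';
TMSFNCCloudAI1.Context.Text := 'Summarize the benefits of Delphi.';
TMSFNCCloudAI1.Logging := True;
TMSFNCCloudAI1.LogFileName := 'cloudai.log';
TMSFNCCloudAI1.Execute;
```

The response is delivered asynchronously via the OnExecuted event (see the Sample section).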
TTMSFNCCloudAISettings
The TTMSFNCCloudAISettings class provides configuration settings for interacting with various cloud-based AI services. It includes options for selecting models, adjusting generation parameters, and integrating with services such as OpenAI, Grok, Gemini, Claude, Ollama, and others.
Published Properties
Property | Type | Default | Description |
---|---|---|---|
GeminiModel | string | | Specifies the model used for Gemini API requests. |
OpenAIModel | string | | Specifies the OpenAI model to be used. |
OpenAIServers | TTMSFNCCloudOpenAIServers | oaDefault | Defines the OpenAI server configuration. |
GrokModel | string | | Specifies the model used for Grok (xAI) requests. |
ClaudeModel | string | | Specifies the model used for Claude API requests. |
OllamaModel | string | | Specifies the Ollama model to use for local inference. |
DeepSeekModel | string | | Specifies the DeepSeek model to be used. |
PerplexityModel | string | | Specifies the model used for Perplexity API calls. |
OllamaHost | string | | Hostname or IP address of the Ollama server. |
OllamaPath | string | | Custom endpoint path for Ollama integration. |
OllamaPort | integer | 11434 | Port number for the Ollama server connection. |
MistralModel | string | | Specifies the model used for Mistral AI requests. |
Temperature | Double | | Controls randomness in responses (higher = more random). |
MaxTokens | integer | 0 | Maximum number of tokens for the AI response. |
CustomOptions | string | | JSON or key-value formatted options for advanced settings. |
WebSearch | boolean | false | Enables or disables web search integration. |
TTMSFNCCloudAITool
The TTMSFNCCloudAITool class represents a tool or function that can be registered and executed by the AI engine. It includes metadata, parameter definitions, and an execution event handler. TTMSFNCCloudAITool is the collection item class of the Tools collection, accessible as a published property of the TTMSFNCCloudAI class.
Published Properties
Property | Type | Default | Description |
---|---|---|---|
Enabled | boolean | true | Determines whether the tool is active and available for use. |
Type | TTMSFNCCloudAIToolType | ttFunction | Specifies the kind of tool (e.g., function or another supported type). |
Name | string | | The unique name of the tool. This is typically used to invoke the tool via AI calls. |
Description | string | | A short description of what the tool does, for AI model understanding. |
Parameters | TTMSFNCCloudAIParameters | | Defines the input parameters the tool accepts. |
Tag | NativeInt | 0 | A user-defined value for tagging or identifying tools in code. |
OnExecute | TTMSFNCCloudAIToolExecuteEvent | | Event handler that is triggered when the tool is executed by the AI engine. |
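A tool could be registered as sketched below. `Tools.Add` returning the typed collection item is an assumption, and `GetWeatherExecute` is a hypothetical handler matching the TTMSFNCCloudAIToolExecuteEvent signature:

```delphi
var
  Tool: TTMSFNCCloudAITool;
begin
  // Sketch: registering a function-calling tool. Type defaults to
  // ttFunction, so it is not set explicitly here.
  Tool := TMSFNCCloudAI1.Tools.Add;
  Tool.Name := 'get_weather';
  Tool.Description := 'Returns the current weather for a given city.';
  Tool.OnExecute := GetWeatherExecute; // hypothetical event handler
end;
```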
TTMSFNCCloudAIParameter
The TTMSFNCCloudAIParameter class represents a single parameter definition used in AI tool declarations. It supports complex structures such as objects, arrays and enumerations, and various data types with metadata and validation flags. TTMSFNCCloudAIParameter is the collection item class of TTMSFNCCloudAIParameters, the class that represents all parameters of a function call.
Published Properties
Property | Type | Default | Description |
---|---|---|---|
ArrayType | TTMSFNCCloudAIParameterType | ptString | Specifies the data type of the elements if the parameter is an array. |
ArrayProperties | TTMSFNCCloudAIParameters | | Defines nested properties for array elements when they are complex objects. |
Enum | TStrings | | List of valid values for this parameter (used for enum-like behavior). |
Format | TTMSFNCCloudAIFormat | fmtNone | Specifies the expected format of the value (e.g., date, URI). |
Name | string | | The name of the parameter. This should match the name expected in JSON payloads. |
Type | TTMSFNCCloudAIParameterType | ptObject | Defines the basic type of the parameter (e.g., string, object, array). |
Required | boolean | true | Indicates whether the parameter must be provided. |
Description | string | | Describes the parameter for documentation and AI understanding. |
Properties | TTMSFNCCloudAIParameters | | Defines nested properties if the parameter is an object. |
Tag | NativeInt | 0 | Optional user-defined tag for internal tracking or classification. |
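Declaring a parameter for a tool could look like the sketch below, where `Tool` is a previously created TTMSFNCCloudAITool. Because `Type` is a reserved word in Delphi, the escaped form `&Type` is assumed here; verify the actual property name in the class declaration.

```delphi
var
  Param: TTMSFNCCloudAIParameter;
begin
  // Sketch: a required string parameter for a hypothetical
  // get_weather tool. Parameters.Add returning the typed item
  // is an assumption.
  Param := Tool.Parameters.Add;
  Param.Name := 'city';
  Param.&Type := ptString;
  Param.Required := True;
  Param.Description := 'Name of the city to get the weather for.';
end;
```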
TTMSFNCCloudAI Methods
TTMSFNCCloudAI – Public Methods
Method | Description |
---|---|
Assign(Source: TPersistent) | Copies the content of another instance of a compatible object into this one. Used for cloning or state transfer. |
Execute(id: string = ''): boolean | Executes the AI operation using an optional service or model ID. Returns true if successful. |
GetModels(id: string = ''): boolean | Retrieves a list of available models, optionally filtered by ID. Returns true on success. |
GetServices(UseFunctionCalling: boolean = false; UseFiles: boolean = false): TStringList | Returns a list of available AI services depending on capabilities such as function calling or file support. |
ClearFiles | Clears the current list of uploaded or attached files. |
GetFiles | Fetches the list of uploaded files associated with the current AI session or configuration. |
GetAssistants(const AComplete: TTMSFNCCloudAIRunEvent = nil) | Retrieves the list of AI assistants and optionally invokes a completion callback when done. |
UploadFile(const AFileName: string; AType: TTMSFNCCloudAIFileType; const ACreated: TTMSFNCCloudAICreatedEvent = nil) | Uploads a file for use with AI services, triggering a callback when the upload completes. |
AddFile(const AFileName: string; AType: TTMSFNCCloudAIFileType) | Adds a file to the session without uploading it to a remote server. |
AddText(const AText: string; AType: TTMSFNCCloudAIFileType) | Adds plain text as a virtual file, useful for document-based interactions. |
AddURL(const AURL: string; AType: TTMSFNCCloudAIFileType) | Adds a URL as a data source to be processed or fetched by the AI engine. |
CreateAssistant(AName, AInstruction: string; ATool: TTMSFNCCloudAIAssistantTools = [aitFileSearch]; const ACreated: TTMSFNCCloudAICreatedEvent = nil) | Creates a new AI assistant with specified tools, instructions, and an optional creation callback. |
CreateThread(const ACreated: TTMSFNCCloudAICreatedEvent = nil) | Initializes a new thread (conversation context) for assistant interaction. |
CreateMessage(const AThreadID, ARole, AContent: string; AFiles: TStrings; ATool: TTMSFNCCloudAIAssistantTool; const ACreated: TTMSFNCCloudAICreatedEvent = nil; const AFailed: TTMSFNCCloudAIRunEvent = nil) | Adds a message to a thread, associating files and tools, and handling success/failure via callbacks. |
RunThread(const AThreadID, AAssistantID: string; const ARun: TTMSFNCCloudAICreatedEvent = nil) | Executes a thread using a specific assistant, with a callback for completion. |
RunThreadAndWait(const AThreadID, AAssistantID: string; const AComplete: TTMSFNCCloudAIRunEvent = nil) | Runs a thread synchronously and waits for the assistant to finish responding. |
CheckStatus(const AComplete: TTMSFNCCloudAIRunEvent = nil) | Checks the status of a running thread or assistant operation and invokes a callback when done. |
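A short sketch of querying capabilities with the methods above. Whether the caller owns (and must free) the TStringList returned by GetServices is an assumption; check the implementation.

```delphi
var
  Services: TStringList;
begin
  // List the services that support function calling.
  Services := TMSFNCCloudAI1.GetServices(True);
  try
    Memo1.Lines.Assign(Services);
  finally
    Services.Free; // assumes the caller owns the returned list
  end;
  // Fetch the available models for the selected service; the result
  // arrives asynchronously via the OnGetModels event.
  TMSFNCCloudAI1.GetModels;
end;
```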
TTMSFNCCloudAI – Published Events
These events allow developers to handle and respond to key operations within the TTMSFNCCloudAI class, such as execution, model listing, file handling, and assistant retrieval.
Event | Type | Description |
---|---|---|
OnExecuted | TTMSFNCCloudAIResultEvent | Triggered after the AI execution has completed. Typically used to process or display the result returned by the AI model. |
OnGetAssistants | TTMSFNCCloudAIResultEvent | Fired when the list of assistants has been successfully retrieved. Allows you to update the UI or internal state based on the available assistants. |
OnGetModels | TTMSFNCCloudAIResultEvent | Occurs after the available AI models have been fetched from the server. Useful for populating model selection UIs. |
OnGetFiles | TTMSFNCCloudAIResultEvent | Called after the file list has been retrieved. Can be used to display or process uploaded files. |
OnFileDeleted | TTMSFNCCloudAIFileEvent | Fired when a file has been successfully deleted. Enables UI updates or logging. |
OnFileUpload | TTMSFNCCloudAIFileEvent | Triggered upon completion of a file upload. Useful for monitoring progress or enabling subsequent actions. |
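An OnGetModels handler could be sketched as below. This assumes TTMSFNCCloudAIResultEvent shares the signature of the OnExecuted handler shown in the Sample section; verify against the actual type declaration.

```delphi
procedure TForm1.CloudAIGetModels(Sender: TObject;
  AResponse: TTMSFNCCloudAIResponse; AHttpStatusCode: Integer;
  AHttpResult: string);
begin
  // Sketch: populate a model picker from the fetched model list.
  if Assigned(AResponse) then
    Memo1.Lines.Assign(AResponse.Content);
end;
```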
The response object TTMSFNCCloudAIResponse
Property name | Description |
---|---|
Id: string | The Id that was used as parameter for the Execute() call. |
CompletionTokens | The number of tokens that the response contains. |
Content: TStrings | The LLM response as text. |
PromptTokens | The number of tokens taken by the LLM prompt (i.e. the Context). |
ServiceId: string | Contains the unique response Id returned by the service itself. |
ServiceModel: string | Contains the exact model name the LLM service used. |
TotalTokens | The total number of tokens used by the LLM request. |
Sample
The request:
```delphi
TMSFNCCloudAI.Context.Text := 'What do you know about Delphi?';
TMSFNCCloudAI.OnExecuted := Executed;
TMSFNCCloudAI.Execute;
```

The response handler (note that `label` is a reserved word in Delphi, so named controls such as `Memo1` and `Label1` are used here):

```delphi
procedure TForm1.Executed(Sender: TObject; AResponse: TTMSFNCCloudAIResponse;
  AHttpStatusCode: Integer; AHttpResult: string);
begin
  if Assigned(AResponse) then
  begin
    Memo1.Lines.Text := AResponse.Content.Text;
    Label1.Caption := 'Prompt tokens: ' + AResponse.PromptTokens.ToString +
      ' / Completion tokens: ' + AResponse.CompletionTokens.ToString +
      ' / Total tokens: ' + AResponse.TotalTokens.ToString;
  end
  else
    ShowMessage('Error with HTTP status code: ' + AHttpStatusCode.ToString);
end;
```