AI Prompt (Multi-LLM)
This action executes LLM prompts, with MCP support, against multiple AI providers, including on-premise ones based on Ollama or vLLM. See the full LLMs list.
How to configure and use the action
Drag & Drop the Action from the actions palette on the left to the stage:

Once on the stage, you can configure your favorite LLM prompt using the action detail panel.

To use the action properly, you need to configure your LLM models in the Settings > Integrations section, providing the corresponding LLM API key.
E.g. for Google Gemini:

Using Ollama with Action PromptAI
As with other LLMs, to use Ollama with the PromptAI action you need to configure the Ollama integration in Tiledesk by going to Settings → Integrations and entering:
The URL of the machine where Ollama is running
(Optional) Your favorite models to use for a faster action configuration
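Before saving, it can be useful to check that the Ollama machine is reachable and see which models it actually serves. A minimal sketch, assuming Ollama's default `/api/tags` endpoint and port; the base URL, helper names, and sample payload below are illustrative, not part of the Tiledesk configuration:

```python
import json
from urllib.request import urlopen

# Assumption: replace with the URL of the machine where Ollama is running
OLLAMA_URL = "http://localhost:11434"

def model_names(tags_payload: dict) -> list[str]:
    """Extract the model names from Ollama's /api/tags response payload."""
    return [m["name"] for m in tags_payload.get("models", [])]

def list_ollama_models(base_url: str = OLLAMA_URL) -> list[str]:
    """Query a running Ollama instance for its installed models."""
    with urlopen(f"{base_url}/api/tags") as resp:
        return model_names(json.load(resp))

# Shape of the payload /api/tags returns (sample data):
sample = {"models": [{"name": "llama3.1:8b"}, {"name": "mistral:7b"}]}
print(model_names(sample))  # → ['llama3.1:8b', 'mistral:7b']
```

The names printed this way are the exact strings to type into the Favorites List.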

To add a model to your Favorites List, type the exact model name and press Enter. Finally, save the settings by clicking the Save button.
In the PromptAI action, select Ollama as the LLM and choose a model from the predefined favorite models.

MCP support
MCP tools are fully supported in your AI Prompt action.

You can add as many tools as your currently selected model supports using the "+ Add MCP tools" button in the AI Prompt detail panel, just under the prompt section.
Once you press the button, a popup appears where you can add the tools. Simply choose a name for your tool and fill in the corresponding MCP endpoint URL. Note that currently only MCP servers of the HTTP streamable type are supported.

Add your own tools by pressing the "Add MCP Server" button at the bottom of the MCP popup, then fill out the MCP server form to set up your tool.

We hope you enjoy our new action, which lets you use your favorite LLM provider and models!
If you have questions about the AI Prompt action or other Tiledesk features, feel free to email us at [email protected] or leave us feedback.