Open-source UML and SysML toolkit

TTool

AI-assisted modeling, simulation, and formal verification for complex embedded systems.


  • Configuration
  • References

TTool-AI

TTool-AI is an extension of TTool that lets you use AI engines (such as ChatGPT, MistralAI, or LMStudio) as modeling assistants. The assistant is part of TTool (nothing else needs to be installed), but you need to configure TTool for the AI engine of your choice.


Configuration of TTool-AI



Once TTool is installed, you must configure it to use ChatGPT or MistralAI. The same configuration should work for any AI engine exposing the same JSON interface.

Open the configuration file of TTool (default file: config.xml), and add the following information:

Setting up the API Key

You need an API key provided by your AI provider.
<OPENAIKey data="put your key here" />
or
<MistralAIKey data="put your key here" />

GPT model

Optionally, you can also configure the AI models you intend to use. For instance:
<OPENAIModel data="gpt-3.5-turbo" />
You can also provide a list of possible models that you can select in the AI window. For instance:
<OPENAIModel data="gpt-4-0125-preview gpt-3.5-turbo" />

MistralAI model

Just like for ChatGPT, you can configure the MistralAI model you would like to use:
<MistralAIModel data="pixtral-12b-2409" />
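
Putting the pieces together, the AI-related part of config.xml might look like the following sketch (keys and model names are placeholders; keep only the lines for the provider you actually use):

```xml
<!-- OpenAI / ChatGPT: API key plus a list of selectable models -->
<OPENAIKey data="put your key here" />
<OPENAIModel data="gpt-4-0125-preview gpt-3.5-turbo" />

<!-- MistralAI: API key plus the model to use -->
<MistralAIKey data="put your key here" />
<MistralAIModel data="pixtral-12b-2409" />
```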

Self-hosted AI (e.g., LMStudio)

This procedure should work for any self-hosted AI that conforms to the OpenAI JSON interface. Adapt it to your particular setup.
<CustomAIHost data="http://localhost:1234/v1/chat/completions" />
<CustomAIModel data="openai/gpt-oss-20b" />
(This configuration applies only if your self-hosted AI is compatible with the OpenAI-style REST API.)


You may also configure an authentication token:
<CustomAIToken data="..." />

Then, make sure the self-hosted AI is reachable at the configured host. You can test that the AI responds as follows (adapt to your configuration):
$ curl http://localhost:1234/v1/models
(You can also try with a browser).
If the AI runs on a remote server, you can, for instance, set up an SSH tunnel to reach it (adapt the jump host and destination host to your setup):
$ ssh -N -L 1234:localhost:1234 -J login@ssh.lovely.server login@ai.server
The curl command above should then work. Do not try TTool until you are sure that access to the AI works.
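
Beyond listing models, you can also check that chat completions actually work, with a POST request against the completion endpoint (a sketch assuming an OpenAI-compatible server; host, port, and model name are placeholders to adapt):

```shell
# Send a minimal chat completion request to the self-hosted AI.
# Adapt host, port, and model name to your configuration.
curl http://localhost:1234/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
        "model": "openai/gpt-oss-20b",
        "messages": [{"role": "user", "content": "Say hello"}]
      }'
```

If this returns a JSON answer with a `choices` array, the endpoint is ready for TTool.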

Using openai/codex in TTool

Codex, by OpenAI, can be used in TTool as an AI provider. The installation steps are as follows:
  • Install the command-line interface of codex.
  • Check where codex is installed. For instance:
    $ which codex
    /usr/local/bin/codex
    
  • Run codex from the command line and log into your ChatGPT/OpenAI account.
  • Edit the configuration file of TTool and add the following configuration lines (adapt to your case):
    <CustomAIModel data="codex ..." />
    <CodexPath data="/usr/local/bin/codex" />
    
  • In TTool, select "codex" as the AI model you wish to use.
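
The installation steps above might look like this in a terminal (assuming the npm distribution of the Codex CLI; adapt if you installed it through another package manager):

```shell
# Install the Codex command-line interface (assumption: npm package).
npm install -g @openai/codex

# Locate the binary; this path goes into the <CodexPath ... /> line of config.xml.
which codex

# Log into your ChatGPT/OpenAI account interactively.
codex login
```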

TTool as a MCP Server

If you wish to use TTool as an MCP server, configure TTool as follows. This configuration has been tested with LMStudio; adapt it to your own setup.
<MCPServer data="ttool" />
<MCPServerCompletionPath data="http://localhost:1234/api/v1/chat" />
You also need to specify how TTool should copy model files to the MCP server host. For instance:
<MCPCopyFile data="scp /tmp/mcpFile mylogin@machineonwhichmcpserverwillrun:/tmp/mcpFile" />
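
If the MCP server runs on the same machine as TTool, a plain local copy should presumably work instead of scp (an assumption: the tag holds the copy command TTool executes, so any valid command fits):

```xml
<MCPCopyFile data="cp /tmp/mcpFile /tmp/mcpFileForServer" />
```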

LMStudio configuration

In LMStudio, you need to give the path to TTool and declare TTool as an accepted MCP server. Look for the `mcp.json` configuration file located in ./lmstudio (create it if necessary), and configure it as follows (adapt to your configuration):
{
  "mcpServers": {
    "ttool": {
      "command": "java",
      "args": [
        "-jar",
        "/home/mylogin/TTool/bin/ttool-cli.jar",
        "-mcp"
      ],
      "env": {
        "JAVA_HOME": "/usr/lib/jvm/default-java"
      },
      "cwd": "/home/mylogin/TTool"
    }
  }
}

Codex configuration

You need to configure Codex so that it can use TTool as an MCP tool. To do so, edit ~/.codex/config.toml and add the following entries (adapt to your installation):
[mcp_servers.ttool]
command = "java"
args = ["-jar", "/Users/mylogin/TTool/bin/ttool-cli.jar", "-mcpcodex"]

[mcp_servers.ttool_suggestion]
command = "java"
args = ["-jar", "/Users/mylogin/TTool/bin/ttool-cli.jar", "-mcpcodex-suggestion"]

[mcp_servers.ttool_mutation]
command = "java"
args = ["-jar", "/Users/mylogin/TTool/bin/ttool-cli.jar", "-mcpcodex-mutation"]



References


Design generation (architecture, behavior)

  • Ludovic Apvrille, "Automated System Engineering with Artificial Intelligence", Keynote at ESM'2023 (The 37th annual European Simulation and Modelling Conference), Toulouse, France, Oct. 24-26, 2023.
  • L. Apvrille, B. Sultan, "System Architects are not alone Anymore: Automatic System Modeling with AI", Proceedings of the 12th international conference on Model-Based Software and Systems Engineering (Modelsward'2024), Rome, Italy, Feb. 21-23, 2024. Best paper award!
  • B. Sultan, L. Apvrille, "Continuous AI Assistance for Model-Driven Engineering", 14th international conference on Model-Based Software and Systems Engineering (Modelsward'2026), Marbella, Spain, March 2026.
  • B. Sultan, L. Apvrille, P. de Saqui-Sannes, "Automated Derivation of Formal Properties from Requirements", 20th Annual IEEE International Systems Conference, Halifax, Canada, April 2026.

Attack tree generation

  • A. Birchler De Allende, B. Sultan and L. Apvrille, "Automated Attack Tree Generation Using Artificial Intelligence & Natural Language Processing", proceedings of the 19th International Conference on Risks and Security of Internet and Systems (CRiSIS'2024), 26-28 Nov. 2024, Aix-en-Provence, France.

Coherency between models

AI and formal verification

  • B. Sultan, L. Apvrille, "Towards Safe LLM-Based Model Driven Engineering: when Syntax Checking and Safety Formal Verification Join the Loop", 13th European Congress of Embedded Real Time Systems (ERTS), Feb 2026, Toulouse, France.