Configuration #

EndpointElf supports configuration through CLI flags or a configuration file.

Example #

baseDocument: <string> # document used as the base for generation
outputPath: <string>   # path the generated document is written to
license:
  key: <string>      # license key provided inline
  filepath: <string> # path to a file containing the license key
ai:
  engine: <string> # e.g. ollama, openai, anthropic
  model: <string>  # e.g. phi3, gpt-4o, claude-3-5-sonnet-latest
  url: <string>    # e.g. http://localhost:11434/ - optional
ast:
  engine: <string> # e.g. gorilla
useAI: <boolean>  # enable AI-based extraction
useAST: <boolean> # enable AST-based extraction

Basic Config #

A minimal setup only needs a base document, an output path, and an extraction method. The values below are illustrative:
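
baseDocument: "./base.yaml"
outputPath: "./docs/generated.yaml"
useAST: true
ast:
  engine: "gorilla"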

AI Models #

EndpointElf supports multiple AI engines for extracting routes and model descriptions. The snippets below show the ai configuration for OpenAI, Anthropic, and Ollama in turn.

ai:
  engine: "openai"
  model: "gpt-4o" # Optional
  url: <string>   # Optional

Refer to the OpenAI documentation for their current models.
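
Assuming the top-level useAI flag from the schema above enables AI-based extraction, a complete OpenAI setup might look like the following sketch (the model value is illustrative):

useAI: true
ai:
  engine: "openai"
  model: "gpt-4o"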

ai:
  engine: "anthropic"
  model: "claude-3-5-sonnet-latest" # Optional
  url: <string>                     # Optional

Refer to the Anthropic documentation for their current models.

ai:
  engine: "ollama"
  model: "qwen2.5"
  url: <string> # Optional

Refer to the Ollama documentation for their current models.
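
For a fully local setup, point url at your Ollama server; the default port is 11434, as in the schema example above. Again assuming useAI toggles AI-based extraction:

useAI: true
ai:
  engine: "ollama"
  model: "qwen2.5"
  url: "http://localhost:11434/"

Because the model runs locally, no code leaves your machine, which is relevant to the warning below.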

Danger: Uploading code to third-party AI agents may not be allowed by your company or by the license covering your project. Make sure you're authorized to upload code to the AI agent you're using, or consider bringing your own model.