# Configuration
NEX can be configured by editing the file located at:
This file uses a standard env-style format and accepts key-value pairs in the form:
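A minimal sketch of that syntax (the variable names here are placeholders, not real NEX settings):

```
KEY=value
ANOTHER_KEY="a value with spaces"
```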
Changes are applied automatically upon restart.
## Configure License
To activate your NEX license, add the following line to your config file:
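The exact license line appears to be missing here; the variable name below is an assumption, so check your NEX release documentation for the actual key:

```
# hypothetical variable name -- confirm against your NEX documentation
NEX_LICENSE_KEY=your-license-key-here
```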
At the next startup, the license will be automatically detected and NEX will use the default AI API gateway.
## Configure a Custom LLM Service
NEX allows you to connect your own Large Language Model (LLM) provider instead of using the default gateway.
All providers share these common fields:
| Variable | Description |
|---|---|
| `LLM_PROVIDER` | The backend provider name (e.g. `openai`, `google`, `bedrock`, `ollama`) |
| `LLM_API_KEY` | Your API key for the provider (use `any` for local models) |
| `LLM_MODEL` | The model identifier or name |
| `LLM_DISPLAY_NAME` | A friendly name shown in the NEX UI |
| `LLM_BASE_URL` | (Optional) Custom API base URL (mainly for local/self-hosted providers) |
| `LLM_PROVIDER_REGION` | (Optional) Cloud region for providers like AWS Bedrock |
> **Info**
> Only one provider can be active at a time. Make sure you define a single `LLM_PROVIDER` block in your config.
### Example Configurations
Below are ready-to-use examples for the most common providers.
#### Google (Gemini)

```
LLM_PROVIDER=google
LLM_API_KEY=your-api-key-here
LLM_MODEL=gemini-3-pro-preview
LLM_DISPLAY_NAME="Gemini 3 Pro"
```
#### OpenAI
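The OpenAI example appears to be missing here; the sketch below follows the same pattern as the other providers, with the model name being an assumption rather than a NEX-verified value:

```
LLM_PROVIDER=openai
LLM_API_KEY=your-api-key-here
# model name is an assumption -- substitute the OpenAI model you want to use
LLM_MODEL=gpt-4o
LLM_DISPLAY_NAME="GPT-4o"
```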
#### AWS Bedrock (Anthropic Claude)
> **Warning**
> Make sure your AWS credentials and Bedrock permissions are properly configured in your environment before starting NEX.
```
LLM_PROVIDER=bedrock
LLM_API_KEY=your-api-key-here
LLM_MODEL=arn:aws:bedrock:us-east-1:456510618647:inference-profile/global.anthropic.claude-sonnet-4-5-20250929-v1:0
LLM_PROVIDER_REGION=us-east-1
LLM_DISPLAY_NAME="Claude Sonnet 4.5"
```
#### Ollama and Other Local LLM Providers
> **Tip**
> Ensure your local LLM server is running before starting NEX.
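The example for this section appears to be missing; below is a minimal sketch assuming Ollama's default local endpoint (port 11434) and an illustrative model name:

```
LLM_PROVIDER=ollama
# local models don't need a real key (see the table above)
LLM_API_KEY=any
# model name is an assumption -- use any model you have pulled locally
LLM_MODEL=llama3
LLM_BASE_URL=http://localhost:11434
LLM_DISPLAY_NAME="Llama 3 (local)"
```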