CLI Authentication
Use the LiteLLM CLI to authenticate to the LiteLLM Gateway. This is useful when you want to give a large number of developers self-serve access to the gateway.
Usage
Prerequisites - Start LiteLLM Proxy with Beta Flag
CLI SSO Authentication is currently in beta. You must set this environment variable when starting up your LiteLLM Proxy:
export EXPERIMENTAL_UI_LOGIN="True"
litellm --config config.yaml
Or add it to your proxy startup command:
EXPERIMENTAL_UI_LOGIN="True" litellm --config config.yaml
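The config.yaml referenced here is a standard LiteLLM Proxy config; the beta flag does not change its format. A minimal sketch is shown below, assuming an OpenAI model behind the gateway - the model name and OPENAI_API_KEY variable are placeholders for whatever you actually serve.
# Write a minimal config.yaml (placeholders, not requirements of CLI login)
cat > config.yaml <<'EOF'
model_list:
  - model_name: gpt-4o
    litellm_params:
      model: openai/gpt-4o
      api_key: os.environ/OPENAI_API_KEY
EOF

# Start the proxy with the CLI-login beta flag enabled
EXPERIMENTAL_UI_LOGIN="True" litellm --config config.yaml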
Configuration
JWT Token Expiration
By default, CLI authentication tokens expire after 24 hours. You can customize this expiration time by setting the LITELLM_CLI_JWT_EXPIRATION_HOURS environment variable when starting your LiteLLM Proxy:
# Set CLI JWT tokens to expire after 48 hours
export LITELLM_CLI_JWT_EXPIRATION_HOURS=48
export EXPERIMENTAL_UI_LOGIN="True"
litellm --config config.yaml
Or in a single command:
LITELLM_CLI_JWT_EXPIRATION_HOURS=48 EXPERIMENTAL_UI_LOGIN="True" litellm --config config.yaml
Examples:
LITELLM_CLI_JWT_EXPIRATION_HOURS=12 - Tokens expire after 12 hours
LITELLM_CLI_JWT_EXPIRATION_HOURS=168 - Tokens expire after 7 days (168 hours)
LITELLM_CLI_JWT_EXPIRATION_HOURS=720 - Tokens expire after 30 days (720 hours)
You can check your current token's age and expiration status using:
litellm-proxy whoami
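For example, to run the proxy with a 7-day token lifetime and then confirm the expiration from a client machine, you could do something like the sketch below. Only the commands already shown on this page are used; the exact whoami output depends on your proxy version.
# Server: start the proxy with 7-day CLI tokens (beta flag still required)
LITELLM_CLI_JWT_EXPIRATION_HOURS=168 EXPERIMENTAL_UI_LOGIN="True" litellm --config config.yaml

# Client: log in again so the new lifetime applies, then inspect the token
litellm-proxy login
litellm-proxy whoami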
Steps
1. Install the CLI

If you have uv installed, you can try this:

uv tool install 'litellm[proxy]'

If that works, you'll see something like this:

...
Installed 2 executables: litellm, litellm-proxy

and now you can use the tool by just typing litellm-proxy in your terminal:

litellm-proxy
2. Set up environment variables

On your local machine, set the proxy URL:

export LITELLM_PROXY_URL=http://localhost:4000

(Replace with your actual proxy URL.)
3. Login

litellm-proxy login

This will open a browser window to authenticate. If you have connected LiteLLM Proxy to your SSO provider, you should be able to log in with your SSO credentials. Once logged in, you can use the CLI to make requests to the LiteLLM Gateway.
4. Make a test request to view models

litellm-proxy models list

This will list all the models available to you.
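Putting the steps together, a typical first-time session looks roughly like this. It assumes the proxy was started with the beta flag and is reachable at http://localhost:4000; adjust the URL for your deployment.
# Install the CLI (requires uv)
uv tool install 'litellm[proxy]'

# Point the CLI at your gateway
export LITELLM_PROXY_URL=http://localhost:4000

# Authenticate through the browser / SSO flow
litellm-proxy login

# Check your session, then list the models available to you
litellm-proxy whoami
litellm-proxy models list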