Perplexity AI (pplx-api)

https://www.perplexity.ai

API Key

# env variable
os.environ['PERPLEXITYAI_API_KEY']

Sample Usage

from litellm import completion
import os

os.environ['PERPLEXITYAI_API_KEY'] = ""

# Define the conversation to send to the model
messages = [{"role": "user", "content": "How many stars are in the Milky Way?"}]

response = completion(
    model="perplexity/sonar-pro",
    messages=messages
)
print(response)
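
The response is an OpenAI-compatible response object, so the generated text and token counts can be read from it directly. A minimal sketch, continuing from the snippet above:

# The generated text lives on the first choice of the response
print(response.choices[0].message.content)
# Token usage is reported in the OpenAI-style usage block
print(response.usage)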

Sample Usage - Streaming

from litellm import completion
import os

os.environ['PERPLEXITYAI_API_KEY'] = ""

# Define the conversation to send to the model
messages = [{"role": "user", "content": "How many stars are in the Milky Way?"}]

response = completion(
    model="perplexity/sonar-pro",
    messages=messages,
    stream=True
)

for chunk in response:
    print(chunk)
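
Each streamed chunk follows the OpenAI delta format, so instead of printing raw chunks you can rebuild the full reply by concatenating the delta contents. A minimal sketch (note that some chunks may carry an empty delta):

# Accumulate the streamed text into a single string
full_reply = ""
for chunk in response:
    delta = chunk.choices[0].delta.content
    if delta:
        full_reply += delta
print(full_reply)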

Supported Models

All models listed at https://docs.perplexity.ai/docs/model-cards are supported. Just set model=perplexity/<model-name>.

| Model Name | Function Call |
| --- | --- |
| sonar-deep-research | completion(model="perplexity/sonar-deep-research", messages) |
| sonar-reasoning-pro | completion(model="perplexity/sonar-reasoning-pro", messages) |
| sonar-reasoning | completion(model="perplexity/sonar-reasoning", messages) |
| sonar-pro | completion(model="perplexity/sonar-pro", messages) |
| sonar | completion(model="perplexity/sonar", messages) |
| r1-1776 | completion(model="perplexity/r1-1776", messages) |
info

For more information about passing provider-specific parameters, go here.
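
As a rough sketch of that pattern: LiteLLM forwards extra keyword arguments on completion() to the underlying provider, so a Perplexity-specific field can be passed next to the standard arguments. web_search_options below is an assumed example of such a field, not an exhaustive reference.

from litellm import completion
import os

os.environ['PERPLEXITYAI_API_KEY'] = ""

# Extra kwargs are passed through to the Perplexity API alongside the
# standard OpenAI-compatible parameters.
# web_search_options is assumed here as an example provider-specific field.
response = completion(
    model="perplexity/sonar-pro",
    messages=[{"role": "user", "content": "Summarize today's AI news."}],
    web_search_options={"search_context_size": "medium"},
)
print(response)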