Petals

Petals: https://github.com/bigscience-workshop/petals


Pre-Requisites

Ensure you have petals installed:

pip install git+https://github.com/bigscience-workshop/petals

Usage

Ensure you add petals/ as a prefix for all Petals LLMs. This sets the custom_llm_provider to petals.

from litellm import completion

response = completion(
    model="petals/petals-team/StableBeluga2",
    messages=[{"content": "Hello, how are you?", "role": "user"}]
)

print(response)
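
The returned object follows the OpenAI chat-completion schema, so the generated text can be read from the first choice. Continuing the example above (a minimal sketch; attribute access is shown, and LiteLLM also accepts index-style access):

# The generated text sits in the first choice of the OpenAI-style response
print(response.choices[0].message.content)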

Usage with Streaming

from litellm import completion

response = completion(
    model="petals/petals-team/StableBeluga2",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    stream=True
)

for chunk in response:
    print(chunk)
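
Each chunk follows the OpenAI streaming format, with incremental text in a delta field. A minimal sketch for reassembling the full reply from the stream (the None check assumes the final chunk carries no content, only a finish reason):

from litellm import completion

response = completion(
    model="petals/petals-team/StableBeluga2",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    stream=True
)

# Reassemble the full reply from the incremental deltas
full_reply = ""
for chunk in response:
    delta_content = chunk.choices[0].delta.content  # may be None on the final chunk
    if delta_content:
        full_reply += delta_content

print(full_reply)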

Model Details

| Model Name | Function Call |
|------------|---------------|
| petals-team/StableBeluga2 | completion('petals/petals-team/StableBeluga2', messages) |
| huggyllama/llama-65b | completion('petals/huggyllama/llama-65b', messages) |