Models
For Completions, we currently support the following models: the model name cygnet will always point to our latest model.
Messages
To interact with our model, you will need to format your prompt as a series of messages. Each message has a role and content. Different roles are treated differently by the model. The available roles are:
system (optional): used to provide instructions, guidelines, and context to the model regarding the ensuing conversation between user and model
user: used for messages sent by the user to the model
data: see the guide on Data elements for an explanation of this role
assistant: used for messages sent by our model to the user
In our Python client, messages are represented as follows:
messages=[
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What day is after Monday?"},
    {"role": "assistant", "content": "The day after Monday is Tuesday."},
]
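To carry on a multi-turn conversation, append the model's reply and the user's follow-up to the list before the next request, keeping the original order. A minimal sketch (the helper name extend_conversation is ours, not part of the client):

```python
def extend_conversation(messages, assistant_reply, next_user_message):
    """Return a new message list with the model's reply and the user's
    follow-up appended, preserving the conversation order."""
    return messages + [
        {"role": "assistant", "content": assistant_reply},
        {"role": "user", "content": next_user_message},
    ]

history = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What day is after Monday?"},
]
history = extend_conversation(
    history, "The day after Monday is Tuesday.", "And the day after that?"
)
```

The extended history can then be passed as the messages argument of the next Completions call.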
Completions
Below are examples of using our Completions endpoint: non-streaming, streaming, async without streaming, and async with streaming.
Non-streaming
import os
from gray_swan import GraySwan

GRAYSWAN_API_KEY = os.environ.get("GRAYSWAN_API_KEY")

client = GraySwan(
    api_key=GRAYSWAN_API_KEY,
)

completion_create_response = client.chat.completion.create(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is a large language model?"},
    ],
    model="cygnet",
)
print(completion_create_response.choices[0].message.content)
Streaming
completion_create_response = client.chat.completion.create(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is a large language model?"},
    ],
    model="cygnet",
    stream=True,
)
for r in completion_create_response:
    delta_content = r.choices[0].delta.content
    if delta_content:  # the delta content may be None, e.g. on the final chunk
        print(delta_content, end="")
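If you need the full response text as well as incremental output, you can accumulate the deltas as they arrive. The helper below is a sketch that works on any iterable of chunks shaped like the streamed responses above (chunk.choices[0].delta.content); the name collect_stream is ours, not part of the client:

```python
def collect_stream(chunks):
    """Join the delta contents of a streamed completion into one string,
    skipping chunks whose delta content is None (e.g. the final chunk)."""
    parts = []
    for chunk in chunks:
        delta_content = chunk.choices[0].delta.content
        if delta_content:
            parts.append(delta_content)
    return "".join(parts)
```

Pass the streaming response object directly to collect_stream to get the assembled text after the stream is exhausted.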
Async without streaming
import os
from gray_swan import AsyncGraySwan

GRAYSWAN_API_KEY = os.environ.get("GRAYSWAN_API_KEY")

client = AsyncGraySwan(
    api_key=GRAYSWAN_API_KEY,
)

completion_create_response = await client.chat.completion.create(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is a large language model?"},
    ],
    model="cygnet",
)
print(completion_create_response.choices[0].message.content)
Async with streaming
import os
from gray_swan import AsyncGraySwan

GRAYSWAN_API_KEY = os.environ.get("GRAYSWAN_API_KEY")

client = AsyncGraySwan(
    api_key=GRAYSWAN_API_KEY,
)

completion_create_response = await client.chat.completion.create(
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is a large language model?"},
    ],
    model="cygnet",
    stream=True,
)
async for r in completion_create_response:
    delta_content = r.choices[0].delta.content
    if delta_content:  # the delta content may be None, e.g. on the final chunk
        print(delta_content, end="")
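The async snippets above assume they run inside a coroutine (await is only valid there). To run one from a plain script, wrap the calls in an async function and pass it to asyncio.run. The consumer below is a sketch that works on any async iterator of chunks in the shape shown above; collect_async_stream is our name, not part of the client:

```python
import asyncio

async def collect_async_stream(stream):
    """Accumulate delta contents from an async streamed completion,
    skipping chunks whose delta content is None."""
    parts = []
    async for chunk in stream:
        delta_content = chunk.choices[0].delta.content
        if delta_content:
            parts.append(delta_content)
    return "".join(parts)

# In a real script (sketch):
# async def main():
#     response = await client.chat.completion.create(..., stream=True)
#     print(await collect_async_stream(response))
#
# asyncio.run(main())
```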