Functions
chat
fn (model-id: Str, message: Str): Str | AwsError
fn (model-id: Str, message: Str, system: Str): Str | AwsError
fn (model-id: Str, message: Str, system: Str, max-tokens: Int): Str | AwsError
Simple chat with a Bedrock model - send a single message and get the response back as a string. The optional system parameter sets a system prompt, and max-tokens caps the length of the response.
Supported Models
anthropic.claude-3-5-sonnet-20241022-v2:0 - Claude 3.5 Sonnet
anthropic.claude-3-haiku-20240307-v1:0 - Claude 3 Haiku (faster, cheaper)
amazon.titan-text-express-v1 - Amazon Titan
meta.llama3-8b-instruct-v1:0 - Meta Llama 3
Example
response = chat("anthropic.claude-3-5-sonnet-20241022-v2:0", "What is the capital of France?")
// Returns: "The capital of France is Paris."
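A binding like this presumably wraps the AWS Bedrock Converse API. As a rough sketch of what chat might do under the hood, here is a minimal Python version using the boto3 converse() request shape; the helper names and the injectable client are illustrative, not part of this library:

```python
def build_converse_request(model_id, message, system=None, max_tokens=None):
    """Build the keyword arguments for a bedrock-runtime converse() call."""
    kwargs = {
        "modelId": model_id,
        # Converse API messages: a list of role/content turns
        "messages": [{"role": "user", "content": [{"text": message}]}],
    }
    if system is not None:
        kwargs["system"] = [{"text": system}]
    if max_tokens is not None:
        kwargs["inferenceConfig"] = {"maxTokens": max_tokens}
    return kwargs

def chat(client, model_id, message, system=None, max_tokens=None):
    """Send one user message and return the model's reply text."""
    response = client.converse(**build_converse_request(model_id, message, system, max_tokens))
    # The Converse API nests the reply under output.message.content
    return response["output"]["message"]["content"][0]["text"]
```

With real credentials the client would be boto3.client("bedrock-runtime"); passing the client in makes the request-building logic easy to exercise without AWS access.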
chat-with-image
fn (model-id: Str, message: Str, image-format: Str, image-base64: Str): Str | AwsError
fn (model-id: Str, message: Str, image-format: Str, image-base64: Str, system: Str): Str | AwsError
Chat with an image attachment. The image must be base64-encoded; image-format names its format (e.g. "png").
Example
// Read and encode image
image-data = ::hot::base64/encode(::hot::file/read-bytes("photo.png"))
response = chat-with-image("anthropic.claude-3-5-sonnet-20241022-v2:0", "What's in this image?", "png", image-data)
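The read-and-encode step above is standard base64 handling. A minimal Python equivalent (the function name and file path are illustrative):

```python
import base64

def encode_image(path):
    """Read an image file and return its contents base64-encoded as a
    string, ready to pass as the image-base64 argument."""
    with open(path, "rb") as f:
        return base64.b64encode(f.read()).decode("ascii")
```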
converse-multi
fn (model-id: Str, history: Vec, message: Str, system: Str): Map
Have a multi-turn conversation with a Bedrock model. Returns a Map containing response (the model's reply) and history (the updated conversation history to pass into the next call).
Example
history = []
// First turn
result1 = converse-multi("anthropic.claude-3-5-sonnet-20241022-v2:0", history, "Hi, my name is Alice", "You are a helpful assistant.")
new-history = result1.history
// Second turn (remembers context)
result2 = converse-multi("anthropic.claude-3-5-sonnet-20241022-v2:0", new-history, "What's my name?", "You are a helpful assistant.")
// result2.response will mention "Alice"
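The history threading shown above can be sketched in Python as follows. This is an assumed model of converse-multi's behavior, not its actual implementation: each turn appends the user message and the model's reply to a copy of the history, so later turns see the full conversation. send_to_model stands in for the real Bedrock call.

```python
def converse_multi(send_to_model, model_id, history, message, system):
    """Run one conversation turn; return the reply and updated history."""
    # Copy so the caller's history list is never mutated in place
    new_history = history + [{"role": "user", "content": message}]
    reply = send_to_model(model_id, new_history, system)
    new_history.append({"role": "assistant", "content": reply})
    return {"response": reply, "history": new_history}
```

Because each call returns a fresh history, the result of one turn can be threaded directly into the next, exactly as result1.history feeds the second call above.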