Building a Serverless Chat App with Bedrock & OpenAI
In this post, we will build a scalable, serverless chat application on AWS that enriches conversations with contextual notes, using Amazon Bedrock, the OpenAI API, and the SST (Serverless Stack) framework.
Project Overview
This project involves creating a chat application that not only allows for real-time communication but also uses contextual notes to enrich conversations. Essentially, the notes you create and store can be passed as context to your chat. The model can reference previously stored information to make conversations more useful.
Technologies Used and Architecture
Our application is entirely serverless and based on a pay-per-use model:
- SST (Serverless Stack): To develop and deploy the application to AWS.
- Next.js: For the frontend interface.
- AWS Lambda: For backend logic.
- Amazon DynamoDB: For data storage (notes and chat history).
- Amazon Bedrock: To access AI models (e.g., Anthropic Claude or Amazon Titan).
- OpenAI API: For alternative AI models.
Architecture Diagram
The application routes user requests through API Gateway to Lambda functions, which fetch context data from DynamoDB and send it to the Bedrock or OpenAI API to generate a response.
Step 1: Getting Started and Prerequisites
Before starting, ensure you have the following installed:
- Node.js: Latest LTS version.
- AWS CLI: Installed and configured with your AWS account credentials.
To initialize an SST project, run the following command in your terminal:
npx create-sst@latest my-chat-app
cd my-chat-app
npm install
Step 2: Defining Infrastructure (Infrastructure as Code)
SST allows you to define your infrastructure using TypeScript. In the sst.config.ts file, we can define Lambda functions, DynamoDB tables, and the Next.js site.
import { SSTConfig } from "sst";
import { NextjsSite, Table } from "sst/constructs";

export default {
  config(_input) {
    return {
      name: "my-chat-app",
      region: "us-east-1",
    };
  },
  stacks(app) {
    app.stack(function Site({ stack }) {
      // Only key attributes are declared here; DynamoDB is schemaless,
      // so non-key attributes like the note content need no definition.
      const table = new Table(stack, "Notes", {
        fields: {
          noteId: "string",
        },
        primaryIndex: { partitionKey: "noteId" },
      });

      const site = new NextjsSite(stack, "site", {
        // Bind the table so the Next.js app can access it at runtime.
        bind: [table],
      });

      stack.addOutputs({
        SiteUrl: site.url,
      });
    });
  },
} satisfies SSTConfig;
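The config above deploys the table and the site, but the architecture section also mentions API Gateway in front of the Lambda functions. As a sketch, an SST `Api` construct (imported from "sst/constructs") can be added inside the same stack function, after the `Table` definition; the route path and handler file location here are assumptions for illustration:

```typescript
// Sketch: an HTTP API that routes chat requests to a Lambda handler.
const api = new Api(stack, "api", {
  defaults: {
    function: {
      // Bind the Notes table so the handler can read note content.
      bind: [table],
    },
  },
  routes: {
    // Hypothetical handler path; adjust to your project layout.
    "POST /chat": "packages/functions/src/chat.handler",
  },
});

stack.addOutputs({ ApiUrl: api.url });
```

The frontend can then call `POST {ApiUrl}/chat` instead of invoking the Lambda function directly.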
Step 3: Backend Logic and Bedrock Integration
On the backend, we create a Lambda function that takes the user's message along with the relevant note content as context and sends them to the Bedrock API.
import {
  BedrockRuntimeClient,
  InvokeModelCommand,
} from "@aws-sdk/client-bedrock-runtime";

const client = new BedrockRuntimeClient({ region: "us-east-1" });

export const handler = async (event) => {
  const { message, context } = JSON.parse(event.body);

  // Claude v2 expects the prompt to start with "\n\nHuman:" and end
  // with "\n\nAssistant:".
  const command = new InvokeModelCommand({
    modelId: "anthropic.claude-v2",
    contentType: "application/json",
    accept: "application/json",
    body: JSON.stringify({
      prompt: `\n\nHuman: Context: ${context}\nQuestion: ${message}\n\nAssistant:`,
      max_tokens_to_sample: 300,
    }),
  });

  const response = await client.send(command);
  // The response body is a byte stream; decode and parse it.
  const result = JSON.parse(new TextDecoder().decode(response.body));

  return {
    statusCode: 200,
    body: JSON.stringify({ reply: result.completion }),
  };
};
Note: This part is simplified. In a real application, you should add error handling and security checks.
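As a sketch of the kind of checks the note refers to, request parsing can be pulled into a small helper that rejects malformed bodies before any model call is made (the function name and error shape below are assumptions, not part of any SST or AWS API):

```typescript
// Sketch: validate and parse the incoming request body before invoking the model.
type ChatRequest = { message: string; context: string };

export function parseChatRequest(
  body: string | null | undefined
): { ok: true; data: ChatRequest } | { ok: false; error: string } {
  if (!body) {
    return { ok: false, error: "Missing request body" };
  }
  let parsed: unknown;
  try {
    parsed = JSON.parse(body);
  } catch {
    return { ok: false, error: "Body is not valid JSON" };
  }
  const { message, context } = parsed as Partial<ChatRequest>;
  if (typeof message !== "string" || message.trim() === "") {
    return { ok: false, error: "Field 'message' must be a non-empty string" };
  }
  // Context is optional; default to an empty string.
  return {
    ok: true,
    data: { message, context: typeof context === "string" ? context : "" },
  };
}
```

In the handler, a failed parse would return a 400 response with the error message instead of calling Bedrock.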
Step 4: Frontend Development (Next.js)
On the Next.js side, we create the user interface to make requests to our backend API. Within React components, we can use fetch to call the Lambda function and display the response on the screen.
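As a rough sketch of that fetch call (the `/chat` endpoint path and the `{ reply }` response shape are assumptions based on the backend handler above), the request can be wrapped in a small helper; the fetch implementation is injectable so the helper can be exercised without a live backend:

```typescript
// Sketch: minimal client-side helper for calling the chat endpoint.
type FetchLike = (
  url: string,
  init?: { method?: string; headers?: Record<string, string>; body?: string }
) => Promise<{ json(): Promise<unknown> }>;

export async function sendChatMessage(
  apiUrl: string,
  message: string,
  context: string,
  fetchImpl: FetchLike = fetch as unknown as FetchLike
): Promise<string> {
  const res = await fetchImpl(`${apiUrl}/chat`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ message, context }),
  });
  const data = (await res.json()) as { reply?: string };
  // Fall back to an empty string if the backend returned no reply.
  return data.reply ?? "";
}
```

In a React component, you would call this from a submit handler and store the returned reply in state to render it on screen.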
Frequently Asked Questions (FAQ)
1. Why should I use SST?
SST is a framework built on top of AWS CDK that makes it extremely easy to build serverless applications and test them locally (Live Lambda Development).
2. What is the difference between Bedrock and OpenAI?
Bedrock is a service hosted on AWS that provides access to multiple models (Claude, Titan, Llama 2, etc.) via a single API. The OpenAI API provides direct access to OpenAI models (GPT-4, etc.).
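To make the comparison concrete, here is a hedged sketch of calling the OpenAI Chat Completions endpoint with the same context-plus-question pattern used in the Bedrock handler. The model name and system prompt are illustrative, not prescriptive:

```typescript
// Sketch: build a Chat Completions request body that injects note context.
export function buildOpenAIRequest(message: string, context: string) {
  return {
    model: "gpt-4", // illustrative; use any chat model available to your account
    messages: [
      { role: "system", content: `Use the following notes as context:\n${context}` },
      { role: "user", content: message },
    ],
    max_tokens: 300,
  };
}

// The request goes to the Chat Completions endpoint with a Bearer token.
export async function askOpenAI(apiKey: string, message: string, context: string) {
  const res = await fetch("https://api.openai.com/v1/chat/completions", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      Authorization: `Bearer ${apiKey}`,
    },
    body: JSON.stringify(buildOpenAIRequest(message, context)),
  });
  const data = (await res.json()) as {
    choices: { message: { content: string } }[];
  };
  return data.choices[0].message.content;
}
```

Swapping providers then comes down to changing which client function the Lambda handler calls; the surrounding request/response plumbing stays the same.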
3. What is the cost of this architecture?
Since it is completely serverless, there are no idle server costs: you pay only for Lambda execution time, DynamoDB read/write operations, and Bedrock/OpenAI token usage. For low-traffic applications, costs are typically very low.
For more information, you can check out our AWS Consultancy and Kubernetes Consultancy services. Also, visit our Home Page for our general technology blog.
Source: https://awsfundamentals.com/blog/amazon-bedrock-the-openai-api-and-sst