What is AWS Bedrock?
AWS Bedrock (officially Amazon Bedrock) is a fully managed service that provides API access to foundation models from providers such as AI21 Labs, Anthropic, and Amazon. It lets developers integrate generative AI capabilities, such as text generation, into their applications without deep machine learning expertise or the need to manage model-serving infrastructure.
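Each foundation model is identified by a name, a model ID, and an ARN. As a sketch of how you might select a model programmatically, the snippet below filters a hypothetical response shaped like the one returned by `boto3`'s `list_foundation_models` call (the sample entries are illustrative, not live data):

```python
# Illustrative sketch: selecting a model from a list_foundation_models-style
# response. The sample data below is hypothetical; a real call would be
# boto3.client("bedrock").list_foundation_models().
sample_response = {
    "modelSummaries": [
        {"modelName": "Titan Text G1 - Express",
         "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/amazon.titan-text-express-v1"},
        {"modelName": "Jurassic-2 Ultra",
         "modelArn": "arn:aws:bedrock:us-east-1::foundation-model/ai21.j2-ultra-v1"},
    ]
}

def find_model_arn(response: dict, name: str):
    """Return the ARN of the first model whose name matches, else None."""
    return next(
        (m["modelArn"] for m in response["modelSummaries"]
         if m.get("modelName") == name),
        None,
    )

print(find_model_arn(sample_response, "Jurassic-2 Ultra"))
```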
Example Setup
In this example, we'll set up a simple chatbot using AWS Bedrock and AWS Lambda. The architecture includes an API Gateway to expose the Lambda function as an HTTP endpoint, which will serve as the interface for our chatbot.
File Structure
project/
│
├── src/
│   └── chatbot/
│       └── chatbot_lambda.py
│
└── template.yaml
CloudFormation
The template.yaml file contains the SAM/CloudFormation template that defines our AWS resources:
Transform: AWS::Serverless-2016-10-31

Resources:
  ChatbotLambda:
    Type: AWS::Serverless::Function
    Properties:
      Handler: chatbot_lambda.lambda_handler
      Runtime: python3.11
      Timeout: 300
      CodeUri: src/chatbot/
      Events:
        ChatbotApi:
          Type: Api
          Properties:
            RestApiId: !Ref TheApi
            Path: /chatbot
            Method: post
      Policies:
        # Broad permissions for demo purposes; scope these down in production.
        - Version: '2012-10-17'
          Statement:
            - Effect: Allow
              Action:
                - bedrock:*
                - cloudwatch:*
              Resource: '*'
  TheApi:
    Type: AWS::Serverless::Api
    Properties:
      StageName: beta
      MethodSettings:
        - DataTraceEnabled: true
          HttpMethod: '*'
          LoggingLevel: 'INFO'
          ResourcePath: '/*'
      TracingEnabled: true
      DefinitionBody:
        openapi: 3.0.1
        info:
          title: My API
          version: 1.0.0
        paths:
          /chatbot:
            post:
              x-amazon-apigateway-integration:
                uri: !Sub arn:aws:apigateway:${AWS::Region}:lambda:path/2015-03-31/functions/${ChatbotLambda.Arn}/invocations
                httpMethod: POST
                type: aws_proxy
              security:
                - api_key: []
        components:
          securitySchemes:
            api_key:
              type: apiKey
              name: x-api-key
              in: header
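With the aws_proxy integration type, API Gateway passes the HTTP request to Lambda as a proxy event whose body field is the raw request payload as a JSON string. A minimal sketch of that event shape (the values are illustrative, and only the relevant fields are shown) and how a handler unpacks it:

```python
import json

# Illustrative aws_proxy event, shaped like what API Gateway delivers to Lambda.
sample_event = {
    "httpMethod": "POST",
    "path": "/chatbot",
    "headers": {"x-api-key": "your-api-key", "Content-Type": "application/json"},
    "body": json.dumps({"question": "What is the capital of France?"}),
}

# The handler must parse event["body"] itself: it arrives as a string, not a dict.
question = json.loads(sample_event["body"])["question"]
print(question)
```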
Lambda Function
The chatbot_lambda.py file contains the code for our Lambda function:
import json

import boto3

# The 'bedrock' client handles control-plane calls such as listing models;
# 'bedrock-runtime' handles inference.
bedrock = boto3.client('bedrock', region_name='us-east-1')
bedrock_runtime = boto3.client('bedrock-runtime', region_name='us-east-1')


def lambda_handler(event, context):
    # Look up the AI21 Jurassic-2 Ultra model by name.
    foundation_models = bedrock.list_foundation_models()
    matching_model = next(
        (model for model in foundation_models['modelSummaries']
         if model.get('modelName') == 'Jurassic-2 Ultra'),
        None,
    )
    if matching_model is None:
        return {
            'statusCode': 500,
            'body': json.dumps({'error': 'Jurassic-2 Ultra model not available'}),
        }

    # API Gateway's proxy integration delivers the request body as a JSON string.
    question = json.loads(event['body'])['question']

    # Request body in the format the Jurassic-2 models expect.
    body = json.dumps({
        'prompt': question,
        'maxTokens': 500,
        'temperature': 0.7,
        'topP': 1,
    })

    response = bedrock_runtime.invoke_model(
        body=body,
        modelId=matching_model['modelArn'],
        accept='application/json',
        contentType='application/json',
    )

    # Jurassic-2 responses nest the generated text under completions[0].data.text.
    response_body = json.loads(response['body'].read())
    answer = response_body['completions'][0]['data']['text']

    return {
        'statusCode': 200,
        'body': json.dumps({
            'question': question,
            'answer': answer,
        }),
    }
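The Jurassic-2 response format nests the generated text a few levels deep. The snippet below parses a hypothetical response body of that shape (the answer text is made up) the same way the handler does:

```python
import json

# Hypothetical Jurassic-2 response body, trimmed to the fields the handler reads.
raw_body = json.dumps({
    "completions": [
        {"data": {"text": "The capital of France is Paris."}}
    ]
})

response_body = json.loads(raw_body)
answer = response_body["completions"][0]["data"]["text"]
print(answer)
```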
Example Client Code
The following JavaScript code can be used to call the chatbot API:
const axios = require('axios');

const apiEndpoint = 'https://your-api-endpoint/beta/chatbot';
const apiKey = 'your-api-key';

async function callApi() {
  try {
    const response = await axios.post(
      apiEndpoint,
      { question: 'What is the capital of France?' },
      { headers: { 'x-api-key': apiKey } }
    );
    console.log('Response:', response.data);
  } catch (error) {
    // error.response is undefined for network-level failures,
    // so fall back to the error message.
    console.error('Error calling API:', error.response?.data ?? error.message);
  }
}

callApi();
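For readers who prefer to stay in Python, here is a minimal equivalent client using only the standard library; the endpoint and API key are the same placeholders as above:

```python
import json
import urllib.request

API_ENDPOINT = "https://your-api-endpoint/beta/chatbot"  # placeholder
API_KEY = "your-api-key"  # placeholder

def build_request(question: str) -> urllib.request.Request:
    """Build a POST request carrying the question as a JSON body."""
    payload = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        API_ENDPOINT,
        data=payload,
        headers={"x-api-key": API_KEY, "Content-Type": "application/json"},
        method="POST",
    )

def ask(question: str) -> dict:
    """Send the question to the chatbot API and return the parsed response."""
    with urllib.request.urlopen(build_request(question)) as resp:
        return json.loads(resp.read())
```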
Conclusion
In this blog post, we demonstrated how to build a simple chatbot using AWS Bedrock and AWS Lambda. By leveraging AWS Bedrock's foundation models, we can add AI capabilities to our applications without deep machine learning expertise, and AWS Lambda with API Gateway gives us a scalable, cost-effective way to deploy and manage the chatbot.