AWS DynamoDB Example

VerticalServe Blogs
3 min read · Aug 17, 2024


Creating an efficient system to ingest customer transaction data from a CSV file into AWS DynamoDB and querying it using a FastAPI application involves several steps. This blog post will guide you through the process, including setting up the infrastructure on AWS and writing the Python code to interact with DynamoDB.

Step 1: Setting Up AWS Infrastructure

1.1 Create a DynamoDB Table

First, create a DynamoDB table to store customer transactions. You can do this via the AWS Management Console:

  • Navigate to the DynamoDB service.
  • Click on “Create table.”
  • Enter the table name, e.g., CustomerTransactions.
  • Set the primary key, e.g., TransactionID.
  • Choose the appropriate read/write capacity mode.
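The same table can also be created programmatically with boto3. The sketch below assumes a string partition key and on-demand (pay-per-request) billing; `table_definition` is a hypothetical helper, and the actual `create_table` call is left commented out so the spec can be inspected first:

```python
# Hypothetical helper mirroring the console steps above.
def table_definition(name="CustomerTransactions"):
    return {
        "TableName": name,
        # TransactionID as the partition (HASH) key
        "KeySchema": [{"AttributeName": "TransactionID", "KeyType": "HASH"}],
        "AttributeDefinitions": [
            {"AttributeName": "TransactionID", "AttributeType": "S"}  # S = string
        ],
        # On-demand capacity; use ProvisionedThroughput instead if preferred
        "BillingMode": "PAY_PER_REQUEST",
    }

# import boto3
# boto3.client("dynamodb").create_table(**table_definition())
```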

1.2 Set Up Amazon S3 and Lambda for CSV Ingestion

To ingest CSV files into DynamoDB, we’ll use Amazon S3 to store the CSV files and AWS Lambda to process them.

Create an S3 Bucket:

  • Navigate to the S3 service.
  • Click on “Create bucket.”
  • Enter a unique bucket name, e.g., customer-transactions-bucket.

Create a Lambda Function:

  • Navigate to the Lambda service.
  • Click on “Create function.”
  • Choose “Author from scratch.”
  • Enter the function name, e.g., CSVToDynamoDB.
  • Choose Python 3.x as the runtime.
  • Set up the execution role with permissions to access S3 and DynamoDB.
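The execution role needs read access to the bucket, write access to the table, and CloudWatch logging. A minimal policy sketch follows; the bucket and table names match the examples above, while the account ID and region are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject"],
      "Resource": "arn:aws:s3:::customer-transactions-bucket/*"
    },
    {
      "Effect": "Allow",
      "Action": ["dynamodb:PutItem", "dynamodb:BatchWriteItem"],
      "Resource": "arn:aws:dynamodb:us-east-1:123456789012:table/CustomerTransactions"
    },
    {
      "Effect": "Allow",
      "Action": ["logs:CreateLogGroup", "logs:CreateLogStream", "logs:PutLogEvents"],
      "Resource": "*"
    }
  ]
}
```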

Configure the Lambda Function:

  • Use the following Python code in your Lambda function to read the CSV file from S3 and write to DynamoDB:
import boto3
import csv

dynamodb = boto3.resource('dynamodb')
s3 = boto3.client('s3')

def lambda_handler(event, context):
    # Identify the uploaded object from the S3 event record
    bucket = event['Records'][0]['s3']['bucket']['name']
    key = event['Records'][0]['s3']['object']['key']

    # Download the CSV file from S3 into Lambda's writable /tmp directory
    s3.download_file(bucket, key, '/tmp/transactions.csv')

    # Open the CSV file and read its contents
    table = dynamodb.Table('CustomerTransactions')
    with open('/tmp/transactions.csv', 'r', newline='') as csvfile:
        csv_reader = csv.DictReader(csvfile)
        for row in csv_reader:
            # Insert each row into the DynamoDB table;
            # column headers become attribute names
            table.put_item(Item=row)
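One caveat with the handler above: `DictReader` yields every field as a string, and DynamoDB's document API rejects Python floats, so numeric columns are usually converted to `Decimal` before writing. The helper below is a hypothetical sketch of that conversion; for larger files, `table.batch_writer()` (shown in the commented lines) batches the writes more efficiently than per-row `put_item` calls:

```python
from decimal import Decimal

# Hypothetical helper: convert numeric-looking CSV strings to Decimal,
# leaving everything else untouched.
def normalize_row(row):
    normalized = {}
    for key, value in row.items():
        try:
            normalized[key] = Decimal(value)  # numeric string -> Decimal
        except ArithmeticError:
            normalized[key] = value  # non-numeric strings stay as-is
    return normalized

# In the handler above, you might then write:
# with table.batch_writer() as batch:
#     for row in csv_reader:
#         batch.put_item(Item=normalize_row(row))
```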

Set Up S3 Event Trigger:

  • In the S3 console, go to the properties of your bucket.
  • Under “Event notifications,” create a new event to trigger the Lambda function whenever a CSV file is uploaded.
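Equivalent to those console steps, the trigger can be expressed as a bucket notification configuration. A sketch with a placeholder Lambda ARN, filtering on the `.csv` suffix so only CSV uploads invoke the function:

```json
{
  "LambdaFunctionConfigurations": [
    {
      "LambdaFunctionArn": "arn:aws:lambda:us-east-1:123456789012:function:CSVToDynamoDB",
      "Events": ["s3:ObjectCreated:*"],
      "Filter": {
        "Key": {
          "FilterRules": [{"Name": "suffix", "Value": ".csv"}]
        }
      }
    }
  ]
}
```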

Step 2: Ingesting Data

Upload your CSV file to the S3 bucket. This action will trigger the Lambda function, which will read the CSV file and insert the data into the DynamoDB table.

Step 3: Querying DynamoDB with FastAPI

3.1 Set Up FastAPI

Install FastAPI and Uvicorn:

pip install fastapi uvicorn boto3

3.2 Create the FastAPI Application

Here’s an example FastAPI application that queries the DynamoDB table:

from fastapi import FastAPI, HTTPException
import boto3

app = FastAPI()

dynamodb = boto3.resource('dynamodb')
table = dynamodb.Table('CustomerTransactions')

@app.get("/transactions/{transaction_id}")
async def get_transaction(transaction_id: str):
    try:
        response = table.get_item(Key={'TransactionID': transaction_id})
    except Exception as e:
        # Surface DynamoDB/client errors as a 500
        raise HTTPException(status_code=500, detail=str(e))
    item = response.get('Item')
    if item is None:
        raise HTTPException(status_code=404, detail="Transaction not found")
    return item

3.3 Run the FastAPI Application

Run the application using Uvicorn:

uvicorn my_fastapi_app:app --reload

Replace my_fastapi_app with the name of your Python file.

Conclusion

This setup allows you to efficiently ingest customer transaction data from a CSV file into DynamoDB and query it using a FastAPI application. By leveraging AWS services like S3 and Lambda, you can automate the ingestion process, while FastAPI provides a robust framework for building RESTful APIs to interact with your data. This approach is scalable and can handle large datasets efficiently.

About — The GenAI POD — GenAI Experts

GenAIPOD is a specialized consulting team of VerticalServe, helping clients with GenAI architecture and implementations.

VerticalServe Inc is a niche cloud, data, and AI/ML premier consulting company, partnered with Google Cloud, Confluent, AWS, and Azure, with 50+ customers and many success stories.

Website: http://www.VerticalServe.com

Contact: contact@verticalserve.com
