How This Company Reduced Database Migration Costs By 80%
Database migrations are often expensive.
You typically need to scan every record in every table and write each one to the target database.
That's exactly what this company (CMU) faced: a common problem that sounds straightforward. They needed to move 20 million records from a MySQL database to DynamoDB without blowing their limited budget.
Initially, the migration seemed simple enough, until they realized they would have to duplicate data because of DynamoDB's schema design.
That meant writing 40 million records to DynamoDB instead of 20 million.
Since DynamoDB charges for each write request, the rough cost estimate came to $31. While not astronomical, that felt high for a proof of concept, and the cost wouldn't scale well once they needed to repeat the migration with a larger dataset.
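For context, the direct approach looks something like the sketch below (the table name and items are hypothetical placeholders, and it assumes AWS credentials are configured). Every item written this way consumes a billed write request, which is exactly why the estimate grows with the dataset:

# Sketch of the direct approach: one billed write per item.
# Table name and items are hypothetical placeholders.
import boto3

table = boto3.resource("dynamodb").Table("poc-table")

items = [
    {"pk": "r#874", "sk": "630971678365982720", "source_user_id": 174305318},
]

# batch_writer groups items into BatchWriteItem calls of up to 25,
# which saves round-trips -- but DynamoDB still bills one write
# request unit per item, so 40M items means 40M billed writes.
with table.batch_writer() as writer:
    for item in items:
        writer.put_item(Item=item)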
Then they found a smarter solution: one that would reduce their costs by over 80%.
Here’s what they did:
Instead of directly writing each record to DynamoDB, they used Amazon S3 as an intermediary.
DynamoDB allows you to import data directly from S3 at a significantly lower cost.

Since the company’s data volume was around 12 GB and DynamoDB’s import from S3 is billed at $0.15 per GB of source data, they ended up paying only about $6 for the entire migration.
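As a sanity check, here's a back-of-envelope model of the two options. The prices are assumptions (on-demand writes at $1.25 per million write request units in us-east-1, plus the $0.15/GB import rate quoted above); real invoices depend on region, record sizes, and capacity mode, which is why the article's own estimates differ from this naive calculation:

# Back-of-envelope cost comparison (assumed us-east-1 on-demand rates;
# verify against the current AWS pricing pages before relying on this).
WRITE_PRICE_PER_MILLION = 1.25  # USD per million write request units
IMPORT_PRICE_PER_GB = 0.15      # USD per GB of source data imported

records = 40_000_000   # 20M records, doubled by the duplicated schema
size_gb = 12           # approximate size of the exported dataset

direct_writes = records / 1_000_000 * WRITE_PRICE_PER_MILLION
s3_import = size_gb * IMPORT_PRICE_PER_GB

print(f"Direct writes: ~${direct_writes:,.2f}")  # on the order of tens of dollars
print(f"S3 import:     ~${s3_import:,.2f}")      # on the order of single dollars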
This not only got the proof of concept done cheaply but also demonstrated a much more scalable way to handle larger database migrations.
Here’s an overview of the process they used:
1. Data transformation
They transformed their SQL data into DynamoDB JSON, one of the formats the import feature accepts. Each attribute value is wrapped in a type descriptor: “S” for string values and “N” for number values.
For example:
{
  "Item": {
    "pk": {"S": "r#874"},
    "sk": {"S": "630971678365982720"},
    "source_user_id": {"N": "174305318"},
    ...
  }
}
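A small script can do this conversion. Here's a rough sketch in Python; the column names and the pk/sk scheme are hypothetical, modeled on the example record above:

import json

# Sample rows standing in for a MySQL result set (hypothetical columns).
rows = [
    {"id": 874, "message_id": 630971678365982720, "source_user_id": 174305318},
]

def to_dynamodb_json(row):
    # Wrap each attribute in a DynamoDB type descriptor:
    # "S" for strings, "N" for numbers (numbers are serialized as strings).
    return {
        "Item": {
            "pk": {"S": f"r#{row['id']}"},
            "sk": {"S": str(row["message_id"])},
            "source_user_id": {"N": str(row["source_user_id"])},
        }
    }

# DynamoDB's S3 import accepts newline-delimited DynamoDB JSON.
with open("records.json", "w") as out:
    for row in rows:
        out.write(json.dumps(to_dynamodb_json(row)) + "\n")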
2. Upload to Amazon S3
Once formatted in JSON, they uploaded the records to an S3 bucket.
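With the AWS SDK for Python, the upload is a single call (the bucket and key names here are hypothetical):

import boto3

s3 = boto3.client("s3")
# Upload the transformed file to the staging bucket.
s3.upload_file("records.json", "migration-staging-bucket", "import/records.json")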
3. Import From S3 with DynamoDB
Within DynamoDB, they used the import from S3 feature, pointed it at the S3 bucket, specified the partition and sort keys, and let AWS handle the rest.
In just a few clicks, and at a fraction of the cost, they had migrated their data to a single DynamoDB table, ready to test the proof of concept.
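The same step can be scripted instead of clicked through. Here's a sketch using the ImportTable API via boto3; the bucket, prefix, and table names are hypothetical. One caveat worth knowing: the import always creates a new table rather than loading into an existing one.

import boto3

dynamodb = boto3.client("dynamodb")

# Kick off a server-side import from S3. Note: ImportTable always
# creates a brand-new table; it cannot load into an existing one.
dynamodb.import_table(
    S3BucketSource={"S3Bucket": "migration-staging-bucket", "S3KeyPrefix": "import/"},
    InputFormat="DYNAMODB_JSON",
    TableCreationParameters={
        "TableName": "poc-table",
        "AttributeDefinitions": [
            {"AttributeName": "pk", "AttributeType": "S"},
            {"AttributeName": "sk", "AttributeType": "S"},
        ],
        "KeySchema": [
            {"AttributeName": "pk", "KeyType": "HASH"},
            {"AttributeName": "sk", "KeyType": "RANGE"},
        ],
        "BillingMode": "PAY_PER_REQUEST",
    },
)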
Conclusion
The approach turned out to be a perfect balance between cost and effort.
Instead of spending money on millions of individual write operations, they reformatted the data and switched to a bulk import path, and it paid off massively.
If you need to migrate data to a cost-effective, scalable database and the use case fits, consider DynamoDB, and consider importing from S3 instead of writing record by record.
(You can reply to this email directly for help with this).
Want to learn more about optimizing your cloud costs? Subscribe to our newsletter for weekly cloud computing cost-savings tips and insights.
Curated Article
“Using S3 to save 80% costs of moving 20 Million worth of data to DynamoDB”. By Dipto Chakrabarty. https://diptochakrabarty.medium.com/using-s3-to-save-80-costs-of-moving-20-million-worth-of-data-to-dynamodb-ee0bda3ff44c
Enjoying the newsletter?
🚀 Follow me on LinkedIn for daily posts on AWS and DynamoDB.
🔥 Invite your colleagues to subscribe to help them save on their AWS costs.
✍️ Check out my blog on AWS here.