Optimize Microservice Databases with Amazon DynamoDB

Data Modernization & Analytics
Infrastructure & DevOps Modernization

A monolithic data store may be holding you back from making the most of your microservices architecture. Learn how a microdatabase architecture leveraging multiple purpose-built databases, like Amazon DynamoDB, can help.

Microservices are now nearly ubiquitous in modern applications. They bring multiple business advantages - scalability, autonomy, faster innovation, and shorter time-to-market.

But microservices are designed to be decoupled and autonomous. When services remain coupled, you can't fully realize the benefits of this design paradigm.

The following warning signs may show up as a result:

  • The product remains technically rigid
  • The promised time-to-market gains never materialize, which erodes your competitive edge, makes it harder to adapt to changing market demands, and limits adoption of newer, more valuable technology
  • Commercial database licensing is too expensive to keep yet too expensive to replace
  • Low confidence in the ability to scale the business - an architecture that hinders growth instead of supporting it

How the monolithic database limits your microservices architecture

These symptoms are the consequence of several root problems.  

First, microservices fail to deliver maximum value when sharing a monolithic data store. We decompose applications in order to decouple them. This decoupling should enable service autonomy. 

When you retain a monolithic data store, these services remain coupled (see Figure 1). Services coupled to a common data store can only scale their compute independently. When the data store needs to scale up or down, that impacts every service sharing it. Further, any service consuming a high amount of IOPS puts the other services at risk of performance degradation.

Second, a shared data store creates schema coupling. If more than one service consumes information from the same location (i.e., the same table), your team can no longer freely change that portion of the schema to support a single service. You have to consider every affected service.

When this dependency is unknown, every planned change requires extra due diligence through impact analysis. This lengthens the development cycle and makes feature changes more complex.

Third, no single database excels at every data use case, and the data landscape is complex. You can see that in the variety of database flavors we have today: relational, NoSQL, time series, document stores, ledgers, etc. When you force one database to serve needs outside of its design, you hit the "square peg in a round hole" problem.

For example, if you have a web application that needs to store and retrieve key-value pairs, you can get much better performance out of a purpose-built data store like DynamoDB than from a traditional relational data store like SQL Server.
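
As a minimal sketch of that difference (the table and attribute names below are hypothetical), key-value access in DynamoDB is a single API call against the partition key - no joins or query planner involved:

```python
import boto3

# Hypothetical table "user-sessions", keyed on a "session_id" string.
dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("user-sessions")

# Write a key-value pair.
table.put_item(
    Item={"session_id": "abc-123", "user_id": "42", "last_seen": "2024-01-01T00:00:00Z"}
)

# Read it back by key - a direct hash lookup.
item = table.get_item(Key={"session_id": "abc-123"}).get("Item")
```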

Figure 1. Monolithic Data Store

The solution: microdatabase architecture

To remove coupling, companies often treat each service as a completely independent application - source code repository, CI/CD process, change management cycle, and even the database. 

Microdatabases are a proven pattern to help remedy the issues with monolithic database architecture and maximize the value of moving to microservices.  

Moving to microdatabases (see Figure 2) can be as simple or complex as necessary for your use case. In fact, I often recommend a stepping-stone approach where you do two key things:

  • First, break out your key-value stores into something purpose-built such as Amazon DynamoDB
  • Second, isolate your remaining relational data per service through table isolation

This approach keeps the operational burden low by keeping your incumbent relational database platform and adding in a managed NoSQL database.

Figure 2. Microdatabase Data Stores

We often see key-value stores housed in relational database platforms. In those instances, performance is a common pain point.

Amazon DynamoDB is a great first step in breaking apart these monolithic data stores. It is purpose-built for key-value data and designed to run high-performance applications at any scale. Since it's a fully managed database, you don't have to worry about the operational aspects of adding a new database platform to an application stack - capacity planning, instance failures, software updates, backups, etc. Your teams can invest more time in innovation and building new features without increasing their administrative burden.
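
For illustration, here's what standing up such a table might look like with the AWS SDK for Python. The table name and key are hypothetical, and on-demand billing removes capacity planning entirely:

```python
import boto3

client = boto3.client("dynamodb")

# Hypothetical table for a sessions microservice. PAY_PER_REQUEST (on-demand)
# billing means AWS manages throughput capacity automatically.
client.create_table(
    TableName="user-sessions",
    AttributeDefinitions=[{"AttributeName": "session_id", "AttributeType": "S"}],
    KeySchema=[{"AttributeName": "session_id", "KeyType": "HASH"}],
    BillingMode="PAY_PER_REQUEST",
)

# Block until the table is ready to serve traffic.
client.get_waiter("table_exists").wait(TableName="user-sessions")
```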

Amazon provides a great toolbox for migrations. AWS Database Migration Service (DMS) is a managed service that migrates data from a relational store, such as PostgreSQL, into Amazon DynamoDB. You can perform the migration as a one-time event or configure it as an incremental, near-real-time delta. This method is common when no transformations need to be performed on the data.
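
As a hedged sketch (the ARNs and table mapping are placeholders, and the source/target endpoints and replication instance must already exist), creating such a task through the DMS API might look like this:

```python
import json
import boto3

dms = boto3.client("dms")

# Placeholder ARNs - substitute the endpoints and replication instance
# created for your own migration.
dms.create_replication_task(
    ReplicationTaskIdentifier="postgres-to-dynamodb",
    SourceEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:SOURCE",
    TargetEndpointArn="arn:aws:dms:us-east-1:111122223333:endpoint:TARGET",
    ReplicationInstanceArn="arn:aws:dms:us-east-1:111122223333:rep:INSTANCE",
    # "full-load" is the one-time event; "full-load-and-cdc" adds the
    # incremental, near-real-time delta described above.
    MigrationType="full-load-and-cdc",
    TableMappings=json.dumps({
        "rules": [{
            "rule-type": "selection",
            "rule-id": "1",
            "rule-name": "include-sessions",
            "object-locator": {"schema-name": "public", "table-name": "sessions"},
            "rule-action": "include",
        }]
    }),
)
```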

AWS Glue is a managed ETL service that can also migrate data from PostgreSQL to DynamoDB, and it adds more capabilities for data transformation. For cases where you require even more control during migration, you can use the AWS CLI and SDKs to import your PostgreSQL exports into DynamoDB.
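
A minimal sketch of that SDK route, assuming a hypothetical CSV export from PostgreSQL: boto3's batch_writer buffers puts into BatchWriteItem requests and retries unprocessed items for you.

```python
import csv
import boto3

table = boto3.resource("dynamodb").Table("user-sessions")  # hypothetical target

# Stream a hypothetical PostgreSQL CSV export into DynamoDB. batch_writer
# groups puts into 25-item BatchWriteItem calls and retries unprocessed items.
with open("sessions_export.csv", newline="") as f, table.batch_writer() as batch:
    for row in csv.DictReader(f):
        batch.put_item(Item=row)
```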

For some related large-scale customer success stories, see Amazon's website. There's also a great whitepaper that gives prescriptive steps for migrating data out of an RDBMS into DynamoDB. Those steps are illustrated in Figure 3 below.

Figure 3. Migration Strategy

The benefits of moving from the monolithic database

By breaking down your monolithic database into multiple microdatabases, you can truly unlock the potential of your microservices architecture:

  • Microservices become truly autonomous, and you can manage them independently. Releases that require data model changes will no longer be breaking changes.
  • As your competition innovates and customers' demands shift, you can pivot your product faster and with less risk.
  • You can gradually move away from traditional commercial database licensing and take advantage of open-source technologies and cloud financial models (pay-per-use).
  • Your platform architecture will move at the speed of business.

In addition, there are value-adds from migrating to DynamoDB specifically:

  • Scalability: A highly scalable NoSQL database that can handle large amounts of data and traffic.
  • Cost: A pay-as-you-go pricing model, which can be more cost-effective than running a traditional relational database. If you have a high volume of read and write operations and want to save on costs, DynamoDB may be a good option.
  • Performance: DynamoDB can provide higher performance than relational databases, which is especially valuable for applications that require low-latency access to data.
  • Acceleration: You can enable DynamoDB Accelerator (DAX) as a caching layer in front of DynamoDB to further improve performance (see the sketch after this list).
  • Serverless: Eliminate the need to manage servers or infrastructure. Free up your team to focus on your IP rather than spending time managing IT.
  • NoSQL: If your application uses unstructured or semi-structured data, DynamoDB is a purpose-built data store for this exact use case.
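
To illustrate the DAX point above: with the optional amazondax client library and a provisioned cluster (the endpoint and table name below are placeholders), the code barely changes from plain DynamoDB access.

```python
from amazondax import AmazonDaxClient  # optional DAX client library

# Placeholder endpoint - substitute your own DAX cluster's endpoint.
dax = AmazonDaxClient.resource(
    endpoint_url="daxs://my-cluster.abc123.dax-clusters.us-east-1.amazonaws.com"
)
table = dax.Table("user-sessions")  # hypothetical table

# Reads are served from the DAX cache when possible; the interface mirrors
# the standard boto3 resource API used against DynamoDB directly.
item = table.get_item(Key={"session_id": "abc-123"}).get("Item")
```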

Microservices bring many benefits, and a monolithic database can reduce or even eliminate them. Whether you choose to isolate your data by table or by data store, I encourage you to remove any shared data sources that keep your microservices coupled.

Kenneth Henrichs

Kenneth is fueled by a passion for transforming businesses through the power of technology. With over 20 years of industry expertise, he has helped startups thrive, empowered small businesses to scale, and collaborated with Fortune 500 clients to drive innovation. Throughout his career, Kenneth has done everything from bare metal to browser and has gained an affinity for data. He has already helped several customers create value from generative AI and is energized by the wealth of possibilities this technology ushers in.

Karunakar Kotha

A Senior Partner DB Solution Architect at Amazon Web Services, Karunakar has been working with databases for the last 15 years and serves as a technical leader, helping partners and customers select the best architecture to meet their needs. Karunakar has a particular interest in data and analytics. Outside of work, he is an avid running enthusiast and enjoys going on long rides.

