My Path To AWS Certified Database Specialty

Amazon Web Services (AWS) certifications validate cloud expertise and help drive effective cloud initiatives. AWS offers certifications in several tracks, such as Solutions Architect, Networking, and Database. I completed the AWS Certified Database Specialty on Feb 21, 2022. This certification requires expertise with on-premises and AWS cloud-based relational and non-relational databases. Currently, I work predominantly on database migrations from on-premises to the AWS cloud, so I decided to complete this certification to showcase my expertise in recommending, designing, and maintaining optimal AWS database solutions. This was my fifth AWS certification. I found it more challenging than the Associate-level certification exams and on par with the Big Data Specialty exam. In this blog post, I share my preparation and experience with this certification, and also provide some recommendations for preparing for it.

AWS Certified Database Specialty

Background

I have been working with AWS and database/big data technologies for several years now. Prior to taking this exam, I had working knowledge of several AWS services such as EMR, EC2, S3, Relational Database Service (RDS), Aurora, DynamoDB, Redshift, and CloudFormation. At my current job, I work on database modernization and database migration from on-premises to the AWS cloud. I help my clients by recommending, designing, and maintaining optimal database solutions in the cloud. I felt that the knowledge gained while preparing for this certification would make me more effective at work. Furthermore, this certification would help showcase my expertise with AWS database services and solutions. I enjoy the process of preparing for and taking exams. Most importantly, I love challenging myself to learn new things. For these reasons, I decided to take the AWS Database Specialty certification.

Exam Experience

The registration price for the Database Specialty certification was $300. I was fortunate that my employer paid for this and provided me with a prepaid voucher.

The exam was 3 hours (180 minutes) long and consisted of 65 questions. It took me 150 minutes to answer all the questions. I flagged a few questions that I was not fully confident about and used the remaining 30 minutes to review them.

Most of the questions were scenario-based and lengthy. A question would require either a single answer or multiple answers. Each question described a technical need, and each of the choices offered a potential solution. Typically, 2 out of the 4 or 5 choices could be eliminated as distractors or nonsensical options if you have at least a high-level understanding of the services. The remaining choices would all be viable solutions, and you have to choose the best one among them. I had to re-read the questions to catch keywords such as cost-effective, performance, minimal downtime, minimal maintenance, latency, and throughput in order to choose the best of the potential solutions.

I chose to take the exam at College of DuPage (COD), my alma mater. Completing this certification at COD felt very satisfying and added to my list of nostalgic memories there.


Prerequisites

AWS recommends having multiple years of experience with AWS and database technologies (relational and non-relational) prior to taking this certification. While there are no formal prerequisites, I would highly recommend completing one of the Associate-level certifications before taking this exam, as the Database Specialty certification assumes you already have good knowledge of the foundational AWS services. Completing an Associate-level certification would also familiarize you with the exam-taking experience.

I had completed the following certifications prior to attempting the Database Specialty certification.

  • AWS Certified Solutions Architect Associate
  • AWS Certified Developer Associate
  • AWS Certified SysOps Associate
  • AWS Certified Big Data Specialty

Preparation

It’s difficult to quantify the amount of time I spent preparing for this certification. I had completed the AWS Certified Big Data Specialty in 2019, and some topics overlap between the Big Data Specialty and the Database Specialty. Additionally, I had working experience with many of the services covered in this exam.

I prepared for the certification by taking a couple of courses on the AWS Database Specialty, getting hands-on experience with the database services, watching official AWS videos, reading the FAQs, skimming the whitepapers, and taking a few practice exams.

I would recommend preparing for the certification in the following order. You can complete all of this in a couple of months if you spend an hour every day.

Starting Point

The official AWS Database Specialty – Exam Readiness course was the starting point for my preparation. The Exam Readiness modules provide the blueprint for the exam and go over all of the topics covered in the certification. I kept coming back to this page to review the topics.

Tutorials

Unlike the Associate- and Professional-level certifications, there aren’t many resources for the Database Specialty. I took a couple of courses.

  • AWS Database Specialty course on A Cloud Guru (Recommended).
    • A Cloud Guru’s course does a good job of setting expectations for the exam and covers the high-level concepts required for the certification. When it comes to AWS Specialty certifications, the devil is in the details, and unfortunately this course lacks the specific details tested in the exam. It helps with preparation, but it shouldn’t be your only source.
  • AWS Database specialty course on CloudAcademy (Not Recommended).
    • CloudAcademy’s course is not tailored to this certification; instead, it’s a collection of smaller courses on several database services. It’s a good way to learn the database services, but probably not a good choice if you’re only interested in the certification. I still took it to learn about all the database offerings from AWS.

Hands-on Practice

My goal has always been to learn not just the theory but also to get practical, hands-on experience with the services. So I experimented with all of the services covered in the certification. I strongly believe that hands-on experience helps with passing the exam.

  • I completed A Cloud Guru’s and CloudAcademy’s hands-on labs for AWS database services such as Amazon Relational Database Service (Amazon RDS), Amazon Aurora, Amazon Neptune, Amazon DocumentDB, and Amazon Redshift in their sandbox environments.
  • I played around with AWS Database Migration Service (DMS) and AWS Schema Conversion Tool (SCT) in my personal account. This definitely helped with many questions in the exam related to data migration and schema conversion.

AWS Materials

AWS provides a ton of learning materials for all of their services. These materials are a must for learning the architecture, best practices, and, most importantly, the intricate details of each service.

Practice Exams

I took a few practice exams during the last few days before my scheduled exam date. Taking practice exams helped me get into a rhythm for the actual exam.

  • Official AWS Database Specialty Exam Readiness
    • The exam readiness contains about 20 questions with detailed explanations. The questions were very similar to the ones that appear in the exam.
  • A Cloud Guru’s Practice Exam
    • It’s a good way to get a feel for a full 65-question test. Only some of the questions are similar to the ones in the certification exam.
  • CloudAcademy’s Practice Exam
    • The questions test knowledge of the services, but they are not tailored to the certification. I wouldn’t recommend this practice exam as preparation for the certification exam.

Topics

The majority of the questions were related to Amazon Relational Database Service (Amazon RDS), Amazon Aurora, Amazon DynamoDB, Amazon Redshift, AWS Database Migration Service (AWS DMS) and AWS Schema Conversion Tool (AWS SCT).

There were a few questions related to AWS Key Management Service (KMS), AWS CloudFormation, AWS Secrets Manager, AWS Systems Manager Parameter Store, and AWS Backup.

There were a couple of questions related to Amazon DocumentDB, Amazon Neptune, Amazon QLDB, Amazon Kinesis, and Amazon CloudWatch Logs. I didn’t get any questions related to Amazon Keyspaces or Amazon Timestream.

  • Amazon Relational Database Service (Amazon RDS)
    • Architecture & Terminology
    • Use cases
    • RDS Database engines
    • Administration
    • Scalability
    • Availability
    • Durability
    • Disaster Recovery
    • Read Replicas and Cross-Region Read Replicas
    • Multi-AZ
    • Migration
    • Backups
    • Snapshots
    • Monitoring (Logs, Performance Insights, CloudWatch metrics, Enhanced Monitoring)
    • Authentication
    • Encryption & Security
    • Best practices
    • Limitations
  • Amazon Aurora
    • Architecture & Terminology
    • Use cases
    • MySQL vs PostgreSQL
    • Aurora vs RDS
    • Administration
    • Scalability
    • Availability
    • Durability
    • Disaster Recovery
    • Fault Tolerance
    • Write and Reader Nodes
    • Multi-AZ
    • Global Database
    • Endpoints (cluster, reader, instance, custom)
    • Migration
    • Backups
    • Snapshots
    • Monitoring (Logs, Performance Insights, CloudWatch metrics, Enhanced Monitoring, Database Activity Streams)
    • Authentication
    • Encryption & Security
    • Best practices
    • Limitations
  • Amazon DynamoDB
    • Architecture & Terminology
    • Use cases
    • Table design
    • Creating Table and configurations
    • On-demand provisioning & auto-scaling
    • Write Capacity Units (WCU) & Read Capacity Units
    • Burst capacity
    • Adaptive capacity
    • Read consistencies
    • Partitions
    • Streams
    • Replication
    • Errors/Exceptions
    • DynamoDB Accelerator (DAX)
    • Global Tables
    • Local Secondary Indexes
    • Global Secondary Indexes
    • Backups
    • Hot partitions
    • TTL
    • Monitoring
    • Best practices
    • Integration with other services
    • Limitations
    • Encryption & Security
  • Amazon Redshift
    • Architecture & Terminology
    • Use cases
    • Table design
    • Distribution key
    • Sort key
    • Copy
    • Unload
    • Instance types
    • Scaling
    • Workload management
    • Compression
    • Snapshots
    • Access Control
    • Monitoring
    • Best practices
    • Integration with other services
    • Limitations
    • Encryption & Security
  • Amazon ElastiCache
    • Architecture & Terminology
    • Use cases
    • Redis vs Memcached
    • Cluster mode enabled vs disabled
    • Caching strategies (Lazy loading vs write-through)
    • TTL
    • Shards
    • Scaling
    • Best practices
    • Limitations
    • Encryption & Security
  • Amazon DocumentDB
    • Architecture & Terminology
    • Use cases
    • Administration
    • Scalability
    • Availability
    • Durability
    • Disaster Recovery
    • Fault Tolerance
    • Backups
    • Monitoring
    • Authentication
    • Encryption & Security
    • Best practices
    • Limitations
  • Amazon Neptune
    • Architecture & Terminology
    • Use cases
    • Administration
    • Scalability
    • Availability
    • Durability
    • Disaster Recovery
    • Fault Tolerance
    • Backups
    • Monitoring
    • Authentication
    • Encryption & Security
    • Best practices
    • Limitations
  • Amazon QLDB
    • Architecture & Terminology
    • Use cases
    • Administration
    • Scalability
    • Availability
    • Durability
    • Disaster Recovery
    • Fault Tolerance
    • Backups
    • Monitoring
    • Authentication
    • Encryption & Security
    • Best practices
    • Limitations
  • Amazon Timestream
    • Architecture & Terminology
    • Use cases
    • Administration
    • Scalability
    • Availability
    • Durability
    • Disaster Recovery
    • Fault Tolerance
    • Backups
    • Monitoring
    • Authentication
    • Encryption & Security
    • Best practices
    • Limitations
  • Amazon Keyspaces
    • Architecture & Terminology
    • Use cases
    • Administration
    • Scalability
    • Availability
    • Durability
    • Disaster Recovery
    • Fault Tolerance
    • Backups
    • Monitoring
    • Authentication
    • Encryption & Security
    • Best practices
    • Limitations
  • Amazon S3
    • Different tiers
    • Version control
    • Bucket policies
    • Access Control List
    • Lifecycle Policies
    • Vault Lock
    • Encryption & Security
  • AWS CloudFormation
    • Architecture & Terminology
    • Use cases
    • Templates
    • Parameters
    • Mappings
    • Resources
    • Output
    • Deletion
    • Stacks
    • Nested stacks
    • Drift
    • Stackset
    • Best practices
    • Limitations
  • AWS Database Migration Service (AWS DMS)
    • Homogeneous and heterogeneous migration
    • Multi-AZ
    • Full load and Change Data Capture (CDC)
    • Full LOB mode vs Limited LOB mode
  • AWS Schema Conversion Tool (AWS SCT)
    • Sources and Targets
    • Migration Assessment Reports
    • Schema Comparison
    • Schema Conversion
    • Snowball Edge
  • AWS Backup
  • AWS Key Management Service (KMS)
  • AWS CloudHSM
  • AWS Identity and Access Management (IAM)
  • Amazon CloudWatch
  • Amazon CloudWatch Logs
  • Amazon Kinesis
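Of the DynamoDB topics listed above, the Write/Read Capacity Unit arithmetic comes up repeatedly in scenario questions: one RCU covers one strongly consistent read per second of an item up to 4 KB (or two eventually consistent reads), and one WCU covers one write per second of an item up to 1 KB. Here is a back-of-the-envelope sketch in plain Python; the helper names are my own, not an AWS API:

```python
import math

def required_rcu(reads_per_sec: float, item_kb: float,
                 strongly_consistent: bool = True) -> int:
    """RCUs needed: each RCU = 1 strongly consistent read/sec of up to 4 KB
    (or 2 eventually consistent reads/sec)."""
    units_per_read = math.ceil(item_kb / 4)   # round item size up to 4 KB blocks
    rcu = reads_per_sec * units_per_read
    if not strongly_consistent:
        rcu /= 2                              # eventually consistent reads cost half
    return math.ceil(rcu)

def required_wcu(writes_per_sec: float, item_kb: float) -> int:
    """WCUs needed: each WCU = 1 write/sec of up to 1 KB."""
    return math.ceil(writes_per_sec * math.ceil(item_kb))

# 80 strongly consistent reads/sec of 6 KB items -> 80 * ceil(6/4) = 160 RCUs
print(required_rcu(80, 6))    # 160
# 10 writes/sec of 2.5 KB items -> 10 * ceil(2.5) = 30 WCUs
print(required_wcu(10, 2.5))  # 30
```

Being able to do this rounding (item size up to the next 4 KB or 1 KB block, then halving for eventual consistency) quickly is enough for most capacity-planning questions.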
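The ElastiCache caching strategies in the list above also appear in scenario questions. Lazy loading (cache-aside) means the application checks the cache first and only populates it, with a TTL, after a miss. A minimal sketch in plain Python, using dicts as stand-ins for Redis and the backing database (no real AWS calls; all names are illustrative):

```python
import time

cache = {}                        # stand-in for ElastiCache: key -> (value, expires_at)
database = {"user:1": "Alice"}    # stand-in for RDS/Aurora
TTL_SECONDS = 300

def get(key):
    entry = cache.get(key)
    if entry and entry[1] > time.time():   # cache hit and not expired
        return entry[0]
    value = database.get(key)              # cache miss: read from the database
    if value is not None:
        # lazily populate the cache; the TTL bounds staleness
        cache[key] = (value, time.time() + TTL_SECONDS)
    return value

print(get("user:1"))  # first call misses and loads from the database
print(get("user:1"))  # second call is served from the cache
```

The trade-off tested in the exam: lazy loading only caches what is actually read but can serve stale data until the TTL expires, while write-through keeps the cache current at the cost of writing every item, read or not.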

Final Words

The AWS Database Specialty certification, being a specialty exam, is both challenging and rewarding. It requires knowledge of all the database services offered by AWS, and it helps showcase expertise in designing database solutions in the AWS cloud. I had a satisfying experience preparing for and completing the certification.

If you’re preparing for this certification, I strongly recommend reading the FAQs and getting hands-on experience with all of the database services, especially Amazon Relational Database Service (Amazon RDS), Amazon Aurora, Amazon DynamoDB, Amazon Redshift, Amazon ElastiCache, AWS Database Migration Service (AWS DMS), and AWS Schema Conversion Tool (AWS SCT). I believe the preparation guide shared in this blog post will put you in a good spot to take the exam.

I wish you good luck on your certification journey. Please feel free to ask questions or share your thoughts in the comments section.

If you’re into certifications or interested in cloud and big data technologies, check out my other blog posts on the AWS Big Data Specialty and the Cloudera CCA Spark and Hadoop Developer (CCA175) certification.
