
AWS DVA-C02 Drill: DynamoDB TTL & Lambda Integration - Minimizing Custom Code

Author: Jeff Taakey
21+ Year Enterprise Architect | AWS SAA/SAP & Multi-Cloud Expert.

Jeff’s Note

Unlike generic exam dumps, ADH analyzes this scenario through the lens of a Real-World Lead Developer.

For DVA-C02 candidates, the confusion often lies in how to effectively minimize custom polling and code when working with data expiration processes in DynamoDB. In production, this is about knowing exactly when to leverage DynamoDB’s native TTL feature combined with event-driven Lambda processing to avoid writing your own cleanup jobs. Let’s drill down.

The Certification Drill (Simulated Question)

Scenario

A software team at BrightApps maintains a web service that tracks time-sensitive promotions stored in an Amazon DynamoDB table. Each promotion item includes an attribute expireAt that records the UNIX timestamp at which the promotion becomes invalid. Currently, the application relies on manually querying for expired items, archiving them, and then deleting them.

The product team has decided to retire the current application soon, and the engineers want to automate this expiration and archiving process with the least possible development effort while leveraging AWS managed capabilities.

The Requirement

Provide an automated solution to delete expired DynamoDB items and process them for archiving with minimal new code.

The Options

  • A) Enable TTL on the expireAt attribute in the DynamoDB table. Create a DynamoDB stream. Create an AWS Lambda function triggered by the DynamoDB stream to process deleted items and archive them.
  • B) Create two AWS Lambda functions: one to scan and delete expired items, another to process them. Use DynamoDB Streams and make explicit DeleteItem API calls based on expireAt.
  • C) Create two AWS Lambda functions: one for deletion and one for processing. Schedule these via Amazon EventBridge rules. Use DeleteItem API calls to delete expired items and GetRecords API calls to retrieve and process them.
  • D) Enable TTL on the expireAt attribute. Configure an Amazon SQS dead-letter queue as the deletion target. Create a Lambda function to process the items from the queue.


Correct Answer

A.

Quick Insight: The Developer Imperative

Leveraging DynamoDB’s native TTL mechanism combined with DynamoDB Streams and Lambda event triggers results in minimal operational and application management overhead. This approach avoids reinventing scheduled jobs and polling logic.
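
For orientation, TTL acts on a Number attribute that holds a UNIX epoch timestamp in seconds. Below is a minimal boto3 sketch of writing a promotion item with such an expireAt value; the table name, key attribute, and sample values are assumptions for illustration only.

import time
import boto3

table = boto3.resource("dynamodb").Table("PromotionsTable")  # table name assumed

# TTL requires the target attribute to be a Number holding an epoch timestamp in seconds.
table.put_item(
    Item={
        "promotionId": "PROMO-0001",                    # hypothetical partition key
        "description": "Spring sale banner",
        "expireAt": int(time.time()) + 7 * 24 * 3600,   # expires roughly 7 days from now
    }
)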


You’ve identified the answer. But do you know the implementation details that separate a Junior from a Senior?


The Expert’s Analysis

Correct Answer

Option A

The Winning Logic

This solution uses DynamoDB’s Time to Live (TTL) on the expireAt attribute, which deletes expired items in the background, typically within a few days of the expiration timestamp, with no application code involved. When TTL deletes an item, the deletion is written to the table’s DynamoDB stream as a REMOVE record, which can in turn invoke a Lambda function.

  • The Lambda function can then process the deleted items, for example archiving them to Amazon S3 or feeding an audit trail (see the handler sketch after this list).
  • This pattern eliminates the need for custom scanning, polling, or manual delete operations.
  • It leverages event-driven architecture and managed services to minimize code and operational overhead.
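
To make the processing step concrete, here is a minimal Python handler sketch; the bucket name and the promotionId key attribute are assumptions for illustration. TTL-initiated deletions arrive as REMOVE records whose userIdentity principalId is dynamodb.amazonaws.com, which lets the function ignore ordinary application deletes.

import json
import boto3

s3 = boto3.client("s3")
ARCHIVE_BUCKET = "brightapps-promotion-archive"  # hypothetical bucket name

def handler(event, context):
    for record in event.get("Records", []):
        # TTL deletions are REMOVE events issued by the DynamoDB service principal.
        is_ttl_delete = (
            record.get("eventName") == "REMOVE"
            and record.get("userIdentity", {}).get("principalId") == "dynamodb.amazonaws.com"
        )
        if not is_ttl_delete:
            continue

        # OldImage carries the item as it looked before deletion
        # (requires a stream view type that includes old images).
        old_image = record["dynamodb"].get("OldImage", {})
        item_key = old_image.get("promotionId", {}).get("S", record["eventID"])

        s3.put_object(
            Bucket=ARCHIVE_BUCKET,
            Key=f"expired-promotions/{item_key}.json",
            Body=json.dumps(old_image),
        )

In production you would typically also configure retries, bisect-on-error, and an on-failure destination on the event source mapping so that archiving failures do not silently drop records.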

The Trap (Distractor Analysis):

  • Why not Option B?
    While technically feasible, manually scanning for expired items, issuing DeleteItem calls, and processing the stream yourself means more code to maintain and more read/write capacity consumed, whereas native TTL deletions happen automatically and consume no write throughput.
  • Why not Option C?
    Scheduled Lambda invocations require additional orchestration and introduce latency between expiration and deletion. Plus, GetRecords does not directly query the DynamoDB table; it reads stream records, so it cannot be used to get table items.
  • Why not Option D?
    DynamoDB TTL cannot send deleted items to an SQS queue; TTL deletions surface only as records in the table’s stream. A dead-letter queue captures messages or events that failed processing, so configuring one as a “deletion target” is unsupported and misreads what DLQs are for.

The Technical Blueprint

# Enable TTL on the 'expireAt' attribute using the AWS CLI
aws dynamodb update-time-to-live --table-name PromotionsTable --time-to-live-specification "Enabled=true,AttributeName=expireAt"

# Enable a stream on the table so TTL deletions emit records for Lambda to consume
aws dynamodb update-table --table-name PromotionsTable --stream-specification StreamEnabled=true,StreamViewType=NEW_AND_OLD_IMAGES

# Example IAM policy statement for the Lambda execution role (read access to the table's stream)
{
  "Effect": "Allow",
  "Action": ["dynamodb:DescribeStream", "dynamodb:GetRecords", "dynamodb:GetShardIterator", "dynamodb:ListStreams"],
  "Resource": "arn:aws:dynamodb:region:account:table/PromotionsTable/stream/*"
}
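
The stream-to-function wiring itself is a one-time configuration rather than application code. A boto3 sketch is shown below, assuming the archiving function is named archive-expired-promotions.

import boto3

dynamodb = boto3.client("dynamodb")
lambda_client = boto3.client("lambda")

# Look up the ARN of the table's latest stream.
stream_arn = dynamodb.describe_table(TableName="PromotionsTable")["Table"]["LatestStreamArn"]

# Invoke the archiving function for each batch of stream records (function name assumed).
lambda_client.create_event_source_mapping(
    EventSourceArn=stream_arn,
    FunctionName="archive-expired-promotions",
    StartingPosition="LATEST",
    BatchSize=100,
)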

The Comparative Analysis

Option | API Complexity | Performance | Use Case
A | Minimal (native TTL + stream-triggered Lambda) | Low latency, event-driven | Automated expiration + processing with minimal code
B | High (manual scan + delete + stream processing) | Higher latency, more overhead | Custom solutions needing explicit control
C | Moderate (scheduled jobs + manual API calls) | Delayed deletion, more maintenance | Batch processing without TTL knowledge
D | Invalid (TTL + SQS DLQ unsupported) | N/A | Misunderstood architecture pattern

Real-World Application (Practitioner Insight)

Exam Rule

“For the exam, always pick Enable TTL with DynamoDB Streams + Lambda when you see data expiration processing with minimal coding requirements.”

Real World

“In practice, teams rely on this pattern to scale data lifecycle management without running cron jobs or complex polling mechanisms.”




Disclaimer

This is a study note based on simulated scenarios for the AWS DVA-C02 exam.

The DevPro Network: Mission and Founder

A 21-Year Tech Leadership Journey

Jeff Taakey has driven complex systems for over two decades, serving in pivotal roles as an Architect, Technical Director, and startup Co-founder/CTO.

He holds both an MBA degree and a Computer Science Master's degree from an English-speaking university in Hong Kong. His expertise is further backed by multiple international certifications including TOGAF, PMP, ITIL, and AWS SAA.

His experience spans diverse sectors and includes leading large, multidisciplinary teams (up to 86 people). He has also served as a Development Team Lead while cooperating with global teams spanning North America, Europe, and Asia-Pacific. He has spearheaded the design of an industry cloud platform. This work was often conducted within global Fortune 500 environments like IBM, Citi and Panasonic.

Following a recent Master’s degree from an English-speaking university in Hong Kong, he launched this platform to share advanced, practical technical knowledge with the global developer community.


About This Site: AWS.CertDevPro.com


AWS.CertDevPro.com focuses exclusively on mastering the Amazon Web Services ecosystem. We transform raw practice questions into strategic Decision Matrices. Led by Jeff Taakey (MBA & 21-year veteran of IBM/Citi), we provide the exclusive SAA and SAP Master Packs designed to move your cloud expertise from certification-ready to project-ready.