AWS DVA-C02 Drill: Lambda Concurrency Management - Prioritizing Multiple SQS Event Sources

Jeff Taakey
Author | 21+ Year Enterprise Architect | AWS SAA/SAP & Multi-Cloud Expert.

Jeff’s Note

Unlike generic exam dumps, ADH analyzes this scenario through the lens of a Real-World Lead Developer.

For DVA-C02 candidates, the confusion often lies in how Lambda manages concurrency per event source when multiple SQS queues trigger the same function. In production, this is about knowing exactly how to configure Lambda event source mappings to control invocation concurrency and processing order when queues have different priorities. Let’s drill down.
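
As a quick orientation before the drill, the sketch below simply lists the event source mappings attached to a function, which is where the per-queue concurrency setting lives. It assumes the function name ProcessJobsFunction used in the scenario and standard AWS CLI access; adjust names to your environment.

# List the SQS event source mappings attached to the function, including any per-queue scaling settings
aws lambda list-event-source-mappings \
  --function-name ProcessJobsFunction \
  --query 'EventSourceMappings[].{UUID:UUID,Queue:EventSourceArn,Scaling:ScalingConfig}'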

The Certification Drill (Simulated Question)

Scenario

TechInsights, a SaaS company, runs a serverless application on AWS that uses an AWS Lambda function to process messages from two Amazon SQS queues: a “critical-jobs” queue and a “background-jobs” queue. The Lambda function currently processes messages from the critical-jobs queue, and the development team plans to add the background-jobs queue as a second event source. The processing requirements are:

  • The Lambda function must always read and process up to 10 simultaneous messages from the critical-jobs queue before it processes any messages from the background-jobs queue.
  • The Lambda function must not exceed 100 concurrent invocations in total (across both queues).

The Requirement:

Which solution will ensure the Lambda function gives priority to processing messages from the critical-jobs queue, maintains up to 10 concurrent reads from that queue, and respects the 100 concurrent invocation limit?

The Options

  • A) Set the batch size in the event source mapping to 10 for the critical-jobs queue and 90 for the background-jobs queue.
  • B) Set the delivery delay to 0 seconds for the critical-jobs queue and 10 seconds for the background-jobs queue.
  • C) Set the maximum concurrency property in the event source mapping to 10 for the critical-jobs queue and 90 for the background-jobs queue.
  • D) Set the batch window in the event source mapping to 10 seconds for the critical-jobs queue and 90 seconds for the background-jobs queue.


Correct Answer

C) Set the maximum concurrency property in the event source mapping to 10 for the critical-jobs queue and 90 for the background-jobs queue.

Quick Insight: The DVA-C02 Imperative

  • For a Lead Developer, the key is understanding that Lambda’s SQS event source mapping supports a MaximumConcurrency setting (part of the mapping’s ScalingConfig) that limits the number of concurrent Lambda invocations triggered from each SQS queue independently (a short CLI sketch follows this list).
  • This property allows you to reserve part of your overall Lambda concurrency for high-priority queues, which batch size or delivery delay alone cannot guarantee.
  • Batch size controls the number of messages per invocation, not concurrency across queues.
  • Delivery delay only postpones when newly sent messages become visible to consumers; it does not control concurrency or prioritization for Lambda triggers.
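
To make the first point concrete, here is a minimal sketch of attaching a second queue with its own concurrency cap. The queue ARN, region, and account ID are placeholders, and the ScalingConfig shorthand is the CLI equivalent of the console’s “Maximum concurrency” field.

# Attach the background-jobs queue as a second event source, capped at 90 concurrent invocations
aws lambda create-event-source-mapping \
  --function-name ProcessJobsFunction \
  --event-source-arn arn:aws:sqs:us-east-1:123456789012:background-jobs \
  --scaling-config MaximumConcurrency=90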

Content Locked: The Expert Analysis

You’ve identified the answer. But do you know the implementation details that separate a Junior from a Senior?


The Expert’s Analysis

Correct Answer

Option C

The Winning Logic

  • AWS Lambda’s event source mapping for SQS supports a MaximumConcurrency setting (the mapping’s ScalingConfig property) that explicitly controls how many concurrent Lambda function invocations can be triggered from that specific SQS queue.
  • By setting MaximumConcurrency to 10 for the critical-jobs queue and 90 for the background-jobs queue, you guarantee that at most 10 function instances process critical jobs simultaneously, reserving concurrency capacity accordingly.
  • This approach respects the global Lambda concurrency limit (100) by partitioning it between the two SQS queues according to priority (see the reserved-concurrency sketch after this list).
  • Batch size controls how many messages are retrieved per invocation, not how many function instances run concurrently per queue. A batch size of 10 means up to 10 messages in a single invocation, but it does not prevent Lambda from scaling beyond 10 concurrent invocations for that queue.
  • Delivery delay and batch window values affect message timing and batching behavior but do not provide concurrency guarantees or prioritization between queues.
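
One way to back the per-queue caps with a hard function-wide ceiling, sketched here as an illustrative pairing rather than part of the graded answer, is to reserve exactly 100 concurrent executions for the function:

# Reserve 100 concurrent executions so the two queues combined can never exceed the overall limit
aws lambda put-function-concurrency \
  --function-name ProcessJobsFunction \
  --reserved-concurrent-executions 100

The per-queue MaximumConcurrency values (10 and 90) then partition that reserved capacity; the exact mapping commands appear in the Technical Blueprint below.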

The Trap (Distractor Analysis):

  • Option A (Batch size 10 & 90): Batch size defines messages per invocation, not how many invocations run concurrently. It cannot enforce priority or concurrency limits across queues.
  • Option B (Delivery delay 0 vs 10 seconds): Delivery delay only postpones when newly sent messages become visible; it does not guarantee prioritization or concurrency constraints in Lambda processing.
  • Option D (Batch window in seconds): The batch window controls how long Lambda waits to gather a batch before invoking the function. It does not limit concurrency or prioritize between multiple event sources. (The sketch below shows where each of these knobs is actually configured.)
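
For reference, here is where each distractor knob actually lives. The mapping UUID and queue URL are placeholders, and none of these parameters constrain concurrency.

# Option A's knob: batch size sits on the event source mapping (messages per invocation, not concurrency)
aws lambda update-event-source-mapping \
  --uuid <mapping-uuid-background-jobs> \
  --batch-size 10

# Option B's knob: delivery delay is an SQS queue attribute (postpones visibility of newly sent messages)
aws sqs set-queue-attributes \
  --queue-url https://sqs.us-east-1.amazonaws.com/123456789012/background-jobs \
  --attributes DelaySeconds=10

# Option D's knob: batch window also sits on the mapping (how long Lambda waits to fill a batch)
aws lambda update-event-source-mapping \
  --uuid <mapping-uuid-background-jobs> \
  --maximum-batching-window-in-seconds 10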

The Technical Blueprint

# Example CLI commands to set maximum concurrency on the event source mappings of a Lambda function

# Assume the Lambda function name is "ProcessJobsFunction"

# Set maximum concurrency to 10 for the 'critical-jobs' SQS queue event source
aws lambda update-event-source-mapping \
  --uuid <mapping-uuid-critical-jobs> \
  --scaling-config MaximumConcurrency=10

# Set maximum concurrency to 90 for the 'background-jobs' SQS queue event source
aws lambda update-event-source-mapping \
  --uuid <mapping-uuid-background-jobs> \
  --scaling-config MaximumConcurrency=90
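
If the mapping UUIDs are not at hand, the list-event-source-mappings call shown earlier returns them per function. Afterwards, a quick check (sketch below, same placeholder UUID) confirms the scaling configuration took effect:

# Confirm the maximum concurrency value now attached to a mapping
aws lambda get-event-source-mapping \
  --uuid <mapping-uuid-critical-jobs> \
  --query 'ScalingConfig'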

The Comparative Analysis

Option | API/Config Parameter | Behavior | Pros | Cons
------ | -------------------- | -------- | ---- | ----
A | Batch size (10 / 90) | Controls messages per invocation | Simple to configure | Does not limit concurrency per queue
B | Delivery delay (0 s / 10 s) | Delays when new messages become visible | Temporarily holds back lower-priority messages | No concurrency or prioritization control
C | Maximum concurrency (10 / 90) | Limits concurrent Lambda invocations per queue | Fine-grained concurrency control and prioritization | Requires knowledge of the event source mapping API
D | Batch window (10 s / 90 s) | Controls how long Lambda waits to build a batch | - | May impact latency; no concurrency or priority guarantees

Real-World Application (Practitioner Insight)

Exam Rule

“For the exam, always remember: maximum concurrency on Lambda event source mappings is your friend when juggling multiple SQS triggers needing different priority and concurrency levels.”

Real World

“Production workloads often combine reserved concurrency with this setting to enforce strict priority queues and avoid resource starvation. Batch size tuning helps throughput but won’t enforce your concurrency partitioning.”


Stop Guessing, Start Mastering


Disclaimer

This is a study note based on simulated scenarios for the AWS DVA-C02 exam.
