I'm a software engineer transitioning to cloud architecture. I've been tasked with implementing a system that computes and saves stats in real time. I designed a system, but I feel like I'm missing something.
My backend sends events (for instance, "a new object has been bought") and I want my system to consume those events by spawning Lambdas that update data in DynamoDB. Since there will be a lot of events, I don't want to spawn one Lambda per event; instead, I want to queue the events and consume them with a number of Lambdas that scales with the queue depth. Here's what I've done:
In my head, my backend sends a basic POST to API Gateway with the event info. API Gateway forwards the event to EventBridge, and EventBridge dispatches it to the available Lambdas. When I want to read data, I send a GET to API Gateway, which reads from DynamoDB.
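To make the consumer side concrete, here's a rough sketch of the Lambda I have in mind (Python/boto3; the `stats` table, its `stat_name` key, and the `event_type` field are placeholder names I made up, and I'm assuming events arrive batched as `Records`, queue-style):

```python
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("stats")  # hypothetical table, keyed by "stat_name"


def handler(event, context):
    # Batched delivery (e.g. from a queue): each record carries one business event.
    for record in event.get("Records", []):
        message = json.loads(record["body"])
        # If EventBridge delivered the message, the payload sits under "detail"
        # inside the event envelope; otherwise the body is the payload itself.
        detail = message.get("detail", message)
        # Atomic counter update: ADD increments in place, so concurrent
        # Lambdas don't clobber each other with read-modify-write races.
        table.update_item(
            Key={"stat_name": detail["event_type"]},  # e.g. "object_bought"
            UpdateExpression="ADD event_count :one",
            ExpressionAttributeValues={":one": 1},
        )
```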
I'm struggling with two things:
- I thought that EventBridge could "queue" incoming events, but I'm starting to doubt that. Should I add SQS, and change the way I create events so that every request gets handled? (I've sketched what I imagine that wiring would look like below.)
- Am I wrong about the way I send my messages? Should my backend send messages directly to EventBridge, or should it send a message to API Gateway, which then spawns a Lambda that publishes the event to EventBridge? I'm starting to think I'm overengineering this. (The second sketch below shows what the direct approach would look like.)
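For the first question, here's roughly the wiring I picture if SQS is indeed the missing piece: an EventBridge rule that targets a queue, with the Lambdas consuming from that queue. This is a one-off boto3 sketch under my own assumptions; the rule name, event source value, and queue ARN are all placeholders:

```python
import boto3

events = boto3.client("events")

# Hypothetical rule matching my backend's events. The queue's resource policy
# must also allow events.amazonaws.com to call sqs:SendMessage on it.
events.put_rule(
    Name="stats-events",
    EventPattern='{"source": ["my.backend"]}',  # assumed source value
)
events.put_targets(
    Rule="stats-events",
    Targets=[{
        "Id": "stats-queue-target",
        # Placeholder ARN for the buffering queue the Lambdas would poll.
        "Arn": "arn:aws:sqs:eu-west-1:123456789012:stats-queue",
    }],
)
```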
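For the second question, sending straight from the backend would just be a `put_events` call, which is why the API Gateway + Lambda hop feels redundant to me. Again a sketch, with source, detail-type, and payload fields invented for illustration:

```python
import json

import boto3

events = boto3.client("events")

# What "send the event directly to EventBridge" would look like from the
# backend, skipping API Gateway entirely.
events.put_events(
    Entries=[{
        "Source": "my.backend",          # placeholder source
        "DetailType": "object_bought",   # placeholder detail-type
        "Detail": json.dumps({"event_type": "object_bought", "object_id": "123"}),
    }]
)
```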
