Architectural question here. I have a web app (HTML5/JavaScript/AngularJS) that logs user activity to a backend DB via a Web API. This runs quite frequently, on events such as menu and button clicks throughout the app. Making an API call for each such event seems expensive, and I was wondering what the best practices are for this kind of update, or how a service such as Google Analytics implements it in its client scripts?
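For reference, the current implementation is roughly this (simplified; the service and endpoint names are just placeholders):

```javascript
// Current approach, simplified: one HTTP POST per UI event.
// Assumes an existing 'app' module; names are placeholders.
angular.module('app').service('activityLogger', ['$http', function ($http) {
  this.log = function (eventName) {
    // Fired on every menu/button click -- this is the part that
    // seems expensive at this frequency.
    $http.post('/api/activity', { event: eventName, ts: Date.now() });
  };
}]);
```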
I've been looking at something very similar where I work. First, you want the data sent to your API to be as small as possible, so that sending all of these events doesn't use too much bandwidth. The next thing we realised was to put a queue between the API and the database: RabbitMQ, Azure Service Bus, MSMQ, Amazon SQS, or something similar.
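For example, a compact event shape might look like this (field names are purely illustrative; anything the server can already derive from the request, such as user or session, doesn't need to travel with every event):

```javascript
// Illustrative compact event: short keys, epoch-millisecond timestamp,
// nothing the server can derive on its own (user, session, IP, ...).
var evt = { e: 'menu_click', t: Date.now() };
```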
The API endpoint should just take the data and put it on the queue. Then have another program or service that reads the messages off the queue at its own pace and does the inserts into the database.
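A minimal sketch of the enqueue-only endpoint, assuming Node/Express and RabbitMQ via the amqplib package (queue name, route, and port are made up):

```javascript
// Sketch: the endpoint does no DB work, it only enqueues and returns.
const express = require('express');
const amqp = require('amqplib');

const QUEUE = 'user-activity'; // placeholder queue name

async function main() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue(QUEUE, { durable: true });

  const app = express();
  app.use(express.json());

  app.post('/api/activity', (req, res) => {
    // Publish the raw event and answer immediately.
    ch.sendToQueue(QUEUE, Buffer.from(JSON.stringify(req.body)), { persistent: true });
    res.sendStatus(202); // accepted; the insert happens later
  });

  app.listen(3000);
}

main();
```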
At peak times you shouldn't see any dip in performance from the API; the only thing that happens when your system is being hammered is that the data may take a few seconds to a few minutes to be inserted into the DB.
You're essentially batching up on the queue, so you don't have to worry about batching on the client side.
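The consumer is where that batching happens. A sketch under the same assumptions (amqplib against RabbitMQ; insertBatch is a hypothetical stand-in for whatever multi-row DB write you use):

```javascript
// Sketch of the consumer: reads at its own pace and writes in batches.
// 'insertBatch' is hypothetical -- swap in your actual DB call.
const amqp = require('amqplib');

const QUEUE = 'user-activity'; // must match the producer
const BATCH_SIZE = 100;

async function main() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();
  await ch.assertQueue(QUEUE, { durable: true });
  ch.prefetch(BATCH_SIZE); // cap unacked messages at one batch

  let batch = [];

  async function flush() {
    if (batch.length === 0) return;
    const msgs = batch;
    batch = [];
    const rows = msgs.map(m => JSON.parse(m.content.toString()));
    await insertBatch(rows);      // one multi-row INSERT instead of 100 singles
    msgs.forEach(m => ch.ack(m)); // ack only after the write succeeds
  }

  ch.consume(QUEUE, (msg) => {
    batch.push(msg);
    if (batch.length >= BATCH_SIZE) flush();
  });

  setInterval(flush, 5000); // drain partial batches during quiet periods
}

main();
```

Because messages are only acked after a successful write, anything unacked when the consumer dies is redelivered by the broker, which is exactly the slack described above.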