Are there standard approaches to persisting data for every hit that a web app receives? This would be for analytics purposes (as a better alternative to log mining down the road).
Seems like Redis would be a must. Is it advisable to also use a different DB server for that table, or would Redis be enough to mitigate the impact on the main DB?
Also, how common is this practice? It seems like a no-brainer for businesses that want to better understand their users, but I haven’t read much about it.
I think it’s far more common to just use conventional logging to a text file. Most web servers can log every request, and you can analyze those logs externally, without touching the application that handles the requests.
Keep it cheap and simple. You can still bring Redis into a reporting solution later by importing the log files.
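As a minimal sketch of analyzing logs externally, here is what parsing a web server access log might look like. This assumes the common Apache/nginx "combined" log format; the sample lines, regex, and function name are all hypothetical, and a real report would read from the actual log files rather than an inline string.

```python
import re
from collections import Counter

# Hypothetical sample lines in the Apache/nginx "combined" log format.
SAMPLE_LOG = """\
203.0.113.9 - - [10/Oct/2023:13:55:36 +0000] "GET /home HTTP/1.1" 200 512 "-" "Mozilla/5.0"
203.0.113.9 - - [10/Oct/2023:13:55:40 +0000] "GET /pricing HTTP/1.1" 200 1024 "-" "Mozilla/5.0"
198.51.100.7 - - [10/Oct/2023:13:56:02 +0000] "GET /home HTTP/1.1" 200 512 "-" "curl/8.0"
"""

# Captures IP, timestamp, method, path, status, and user agent.
LINE_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" (?P<status>\d{3}) \S+ '
    r'"[^"]*" "(?P<agent>[^"]*)"'
)

def hits_per_path(log_text):
    """Count requests per URL path from raw access-log text."""
    counts = Counter()
    for line in log_text.splitlines():
        m = LINE_RE.match(line)
        if m:
            counts[m.group("path")] += 1
    return counts

print(hits_per_path(SAMPLE_LOG))  # Counter({'/home': 2, '/pricing': 1})
```

The same parsed records could just as easily be pushed into Redis (e.g. per-path counters) when you build the reporting side.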
Use a service layer
tl;dr – log errors on the server side; log analytics by sending an AJAX request from the client side to a data-tracking service.
Just create a thin service layer to track data and fire an AJAX request to it from your site.
The request will carry all the data you want; create a database table to log the specific attributes you care about.
Isn’t that the basic concept behind most analytics-tracking services? A request carries a lot of information about the user.
It won’t provide error tracking, but that’s what the server logs are for in the first place.
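A sketch of the server side of such a thin service layer, under stated assumptions: the table schema, the `record_hit` function, and the column names are all hypothetical, and SQLite stands in for whatever database you actually use. The client-side AJAX call would POST its payload (here the `extra` dict) plus whatever the request headers already carry.

```python
import json
import sqlite3
from datetime import datetime, timezone

# Hypothetical schema: one flat row per tracked hit.
DDL = """CREATE TABLE IF NOT EXISTS hits (
    ts TEXT, path TEXT, referrer TEXT, user_agent TEXT, ip TEXT, extra TEXT
)"""

def record_hit(conn, path, headers, ip, extra=None):
    """Persist one tracked request; `extra` holds any custom payload
    the client-side AJAX call sent along."""
    conn.execute(
        "INSERT INTO hits VALUES (?, ?, ?, ?, ?, ?)",
        (
            datetime.now(timezone.utc).isoformat(),
            path,
            headers.get("Referer", ""),
            headers.get("User-Agent", ""),
            ip,
            json.dumps(extra or {}),
        ),
    )

# Usage: simulate one tracked page view.
conn = sqlite3.connect(":memory:")
conn.execute(DDL)
record_hit(
    conn,
    "/pricing",
    {"Referer": "https://example.com/home", "User-Agent": "Mozilla/5.0"},
    "203.0.113.9",
    extra={"event": "page_view"},
)
rows = conn.execute("SELECT path, referrer FROM hits").fetchall()
print(rows)  # [('/pricing', 'https://example.com/home')]
```

Keeping the endpoint this thin means the main application never blocks on analytics; the table can live on a separate database server if write volume becomes a concern.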