We currently have a custom logging system. It's a single DLL that pushes logs to Microsoft SQL Server. Those logs are viewed via a webpage with advanced sorting, filtering, paging, and download. I built the system from scratch; at the time I was new to Entity Framework, and I am not particularly strong in SQL.
One co-worker insists that the Windows Event Log is a better choice than SQL Server, arguing that it would save bandwidth and therefore be quicker, and that it is a tried-and-true solution.
I am questioning the practicality of using the Windows Event Log to handle our extensive logging. Currently, SQL Server holds a single Log table with around 150 million log entries, received from ~100 different machines across ~600 applications, totaling about 90 GB.
Is the Windows Event Log a viable solution for that kind of load, including a way for an admin server/user to request, sort, filter, and page those logs without an intermediary SQL database? Or am I just biased because I built the current system?
Additional Info
Log is a single table Log(Id, AppId, Type, Lvl, Source, Msg, Time)
The 150 million logs cover errors for 7 days and everything else for 3 days.
Normalizing Source and Msg would drastically reduce the size of the table, as there are many duplicate messages (a sketch of one possible layout follows this list).
Most complaints about the current system are that the webpage times out for certain applications when requesting logs. The timeout is still 30 seconds, and results are limited to 10,000 logs in total; at that point you have to refine your search. The timeout really only occurs for applications that average 800k+ logs; other apps return results in at most about 10 seconds.
I am making changes to indexes and have thought about normalizing the table to improve efficiency, but it's an ongoing process. If the Windows Event Log is a better solution, however, I would rather spend my time implementing that.
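To make the normalization idea above concrete, here is a minimal sketch of pulling the heavily duplicated Source and Msg values into lookup tables, keeping only integer keys in the Log table. The table and column types are assumptions based on the schema listed above, not the actual DDL:

```sql
-- Hypothetical lookup tables for the duplicated values.
CREATE TABLE dbo.LogSource (
    SourceId INT IDENTITY PRIMARY KEY,
    Source   NVARCHAR(256) NOT NULL UNIQUE
);

CREATE TABLE dbo.LogMessage (
    MsgId INT IDENTITY PRIMARY KEY,
    Msg   NVARCHAR(MAX) NOT NULL   -- deduplicated at insert time, e.g. via a hash lookup
);

-- The log table itself shrinks to fixed-width columns plus two foreign keys.
CREATE TABLE dbo.Log (
    Id       BIGINT       IDENTITY PRIMARY KEY,
    AppId    INT          NOT NULL,
    Type     TINYINT      NOT NULL,
    Lvl      TINYINT      NOT NULL,
    SourceId INT          NOT NULL REFERENCES dbo.LogSource(SourceId),
    MsgId    INT          NOT NULL REFERENCES dbo.LogMessage(MsgId),
    Time     DATETIME2(3) NOT NULL
);
```

With many repeated messages, each log row then carries two 4-byte keys instead of full strings, which is where the "drastic" size reduction would come from.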
This is just my own opinion.
Windows event logging for distributed systems is a bad idea.
Your idea of centralizing your applications' logs is a good one. It lets you run statistics, gives you a backup mechanism, lets you use SQL Server clustering, and so on.
If people complain about performance issues, it is probably time to optimize your current solution. Indexes, maybe? Look at what could be indexed, and make sure SQL Server can make use of your search arguments (SARGs) in WHERE conditions; a hedged example follows.
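As an illustration only (it assumes the Log schema from the question, and AppId = 42 is a made-up value), a covering index on the common "one application, recent time window" pattern plus a SARGable predicate typically lets SQL Server seek instead of scanning 150 million rows:

```sql
-- Hypothetical covering index for the most common lookup pattern.
CREATE NONCLUSTERED INDEX IX_Log_AppId_Time
    ON dbo.Log (AppId, Time DESC)
    INCLUDE (Lvl, Type, Source);

-- Non-SARGable: wrapping the column in a function prevents an index seek.
SELECT TOP (10000) Id, Lvl, Source, Msg
FROM dbo.Log
WHERE AppId = 42
  AND DATEDIFF(day, Time, GETUTCDATE()) <= 1;

-- SARGable (roughly equivalent): compare the bare column to a boundary instead.
SELECT TOP (10000) Id, Lvl, Source, Msg
FROM dbo.Log
WHERE AppId = 42
  AND Time >= DATEADD(day, -1, GETUTCDATE());
```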
Can you split your data into multiple tables? Normalizing your data is also worth looking into.
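One possible split, purely as a sketch under assumed column types, is to separate errors (7-day retention) from everything else (3-day retention), so each table stays smaller and purging becomes a simple range delete per table:

```sql
-- Errors, kept for 7 days.
CREATE TABLE dbo.ErrorLog (
    Id     BIGINT       IDENTITY PRIMARY KEY,
    AppId  INT          NOT NULL,
    Source NVARCHAR(256) NOT NULL,
    Msg    NVARCHAR(MAX) NOT NULL,
    Time   DATETIME2(3) NOT NULL
);

-- Everything that is not an error, kept for 3 days.
CREATE TABLE dbo.TraceLog (
    Id     BIGINT       IDENTITY PRIMARY KEY,
    AppId  INT          NOT NULL,
    Lvl    TINYINT      NOT NULL,
    Source NVARCHAR(256) NOT NULL,
    Msg    NVARCHAR(MAX) NOT NULL,
    Time   DATETIME2(3) NOT NULL
);

-- Purge jobs become plain range deletes (batched in practice at this volume).
DELETE FROM dbo.ErrorLog WHERE Time < DATEADD(day, -7, SYSUTCDATETIME());
DELETE FROM dbo.TraceLog WHERE Time < DATEADD(day, -3, SYSUTCDATETIME());
```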
You could also identify the pre-made filters your users rely on most, make them readily available, and tune performance for them specifically.
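For example, if one common saved filter is "errors for a single application over a time range", a filtered index can target exactly those rows. This is only a sketch; it assumes Lvl stores the literal string 'Error', which may not match how levels are actually encoded:

```sql
-- Filtered index covering only error rows, so it stays far smaller than the full table.
CREATE NONCLUSTERED INDEX IX_Log_Errors_AppId_Time
    ON dbo.Log (AppId, Time DESC)
    INCLUDE (Source, Msg)
    WHERE Lvl = 'Error';
```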
This is just a start. There are probably details we don't know and that you can't disclose.