We use GTM to manage our external JavaScript. We really like GTM and don't want to remove it, but it's a real problem when it comes to PageSpeed Insights.
I'm an SEO expert, so it bothers me that we have a "red warning" in PageSpeed that we can't solve. The warning just won't go away.
Any suggestions? How do you deal with it?
We have tried pretty much everything we could find on Google: deferring, async, etc. Nothing worked.
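For context, we load GTM with the stock install snippet pasted into the `<head>` (a config fragment straight from the GTM install instructions, container ID replaced with a placeholder). Note that it already sets `j.async=true` on the injected script, which is presumably why adding async/defer ourselves changed nothing:

```javascript
// Standard GTM head snippet (placeholder container ID).
// The loader already injects gtm.js asynchronously via j.async=true.
(function(w,d,s,l,i){w[l]=w[l]||[];w[l].push({'gtm.start':
new Date().getTime(),event:'gtm.js'});var f=d.getElementsByTagName(s)[0],
j=d.createElement(s),dl=l!='dataLayer'?'&l='+l:'';j.async=true;j.src=
'https://www.googletagmanager.com/gtm.js?id='+i+dl;f.parentNode.insertBefore(j,f);
})(window,document,'script','dataLayer','GTM-XXXXXXX');
```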
PS: The problem is not the scripts inside the GTM container; it's the GTM script itself. We have already dealt with the scripts in the container...
What you're seeing there is Lighthouse reporting that the GTM library (like the gtag library, and probably the majority of JS libraries) loads a lot of code that isn't being used, at least not at that particular moment.
GTM does some optimization on its side depending on what the container actually uses. You can help it by removing unused variables (including built-in ones), tags, and triggers, especially the ones that introduce new tag or trigger types. Reducing their number with leaner Custom JavaScript (CJS) variables is generally good practice. For example, the majority of GA4 tracking on a typical e-commerce site really only requires two or three tags.
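As a rough sketch of what a leaner CJS variable can look like (the function name and path patterns below are made up for illustration, not from any real container): one Custom JavaScript variable that derives a "page type" from the URL path can replace several separate lookup variables and path-based triggers, so a single GA4 event tag can fire everywhere and carry the page type as a parameter.

```javascript
// Illustrative helper: classify a URL path into a page type.
// The patterns are hypothetical; adapt them to your site's URL scheme.
function pageTypeFromPath(path) {
  if (/^\/product\//.test(path)) return 'product';
  if (/^\/category\//.test(path)) return 'category';
  if (/^\/cart(\/|$)/.test(path)) return 'cart';
  return 'other';
}

// In GTM, the Custom JavaScript variable body would inline this logic as:
//   function() { return pageTypeFromPath(document.location.pathname); }
// (with the helper pasted inside, since CJS variables must be one
// self-contained anonymous function).
```

One variable like this, referenced as {{Page Type}}, often removes the need for a tag-per-section setup.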
Still, you have to read the report correctly. The size of the GTM/gtag and other libraries doesn't really matter with modern network performance. It has little influence on page speed and really only matters on the initial hit, before the library is cached.
You can try running Lighthouse on other sites and comparing the results; I doubt the difference would matter. Also, Google doesn't use Lighthouse in its ranking systems, and not everything Lighthouse reports matters for SEO.
You can also use request blocking (in Chrome DevTools or Lighthouse) to test how much faster your pages load when GTM is blocked. Spoiler alert: unless your GTM container is a complete mess, you won't be able to measure the difference with any statistical significance.
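To make that test concrete (a sketch under my assumptions, not a definitive method): the Lighthouse CLI has a `--blocked-url-patterns` flag, so you can produce a baseline JSON report and a GTM-blocked one, then diff a metric. The file names, URL, and the small comparison helper below are all assumptions for illustration; the `audits[...].numericValue` fields are standard Lighthouse report structure.

```javascript
// Sketch: compare one metric between two Lighthouse JSON reports.
// Generate them first, e.g. (URL and file names are placeholders):
//   lighthouse https://example.com --output=json --output-path=baseline.json
//   lighthouse https://example.com \
//     --blocked-url-patterns="*googletagmanager.com*" \
//     --output=json --output-path=blocked.json
function compareReports(baseline, blocked, metric) {
  metric = metric || 'largest-contentful-paint';
  const a = baseline.audits[metric].numericValue; // milliseconds
  const b = blocked.audits[metric].numericValue;
  return { baselineMs: a, blockedMs: b, deltaMs: a - b };
}
```

Run each configuration several times and compare medians; single runs are far too noisy to show a GTM-sized difference.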