I'm currently working on an existing Tabular model with about 1.5M rows. It is driven by SQL Server database views (managed through SSMS) and deploys to SSAS.
My question is: which language processes better/faster?
When adding measures, I can either write SQL CASE statements in the view or place the measure directly in the model. I'm just starting this project, but the data grows steadily, and I'm concerned about long-term performance as the model continues to scale up with new fields, measures, and records.
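For illustration, the choice is between something like the following two approaches (the `Sales` table, `SalesAmount` column, and the threshold are hypothetical placeholders, not from my actual model):

```sql
-- Option 1: bake the logic into the SQL view as a column
CREATE VIEW dbo.vwSales AS
SELECT
    s.SalesAmount,
    CASE WHEN s.SalesAmount > 1000 THEN 'Large' ELSE 'Small' END AS OrderSize
FROM dbo.Sales AS s;
```

```
-- Option 2: express the same logic as a DAX measure in the model
Large Sales Amount :=
CALCULATE ( SUM ( Sales[SalesAmount] ), Sales[SalesAmount] > 1000 )
```

Note the two are not strictly interchangeable: the CASE statement produces a column that is materialized at processing time, while the measure is evaluated at query time.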
I use the following rules of thumb:
In the background, DAX is translated into SQL queries only when the model runs in DirectQuery mode; in the default import mode, DAX is executed by the in-memory VertiPaq engine instead. Either way, you can inspect the query plans and server timings using DAX Studio: https://www.sqlbi.com/tools/dax-studio/. If the DAX is written correctly, the resulting execution plans will be about as efficient as the engine allows. Of course, badly written DAX can produce a very inefficient plan, but that should be expected.
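As a sketch of what "written correctly" means (hypothetical `Sales` table and columns, and a pattern documented by SQLBI rather than anything specific to your model):

```
-- Efficient: a boolean column filter that the storage engine can apply directly
Large Sales :=
CALCULATE ( SUM ( Sales[SalesAmount] ), Sales[SalesAmount] > 1000 )

-- Less efficient: FILTER over the whole table materializes rows
-- and pushes work into the formula engine
Large Sales Slow :=
CALCULATE (
    SUM ( Sales[SalesAmount] ),
    FILTER ( Sales, Sales[SalesAmount] > 1000 )
)
```

Both measures return the same number; DAX Studio's server timings would show the difference in where the work happens.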
Finally, just to set expectations correctly: a Tabular model is considered "large" if it exceeds 1 billion records. Models between 100M and 1B records are medium-sized and might need some optimization (basically, you must know what you are doing). Anything under 100M records is a small model, and if you have performance issues there, either your data model is wrong or your DAX is incorrect/inefficient.