I have created a Power BI report that gets its data from Azure Data Lake Storage Gen2. The report source was a CSV file, and I used Power BI Desktop to build the report.
I have outlined the workflow below.
CSV file (in Azure Data Lake) -> read using Power BI Desktop -> published to a Power BI workspace -> report embedded in a Power Apps portal.
My concern is this:
Without a refresh, the Power BI dataset was still loading the older data into the corresponding report, and I wonder where that data is kept so that it keeps showing up until the next refresh. Is the report data stored in some caching mechanism, or something else?
As you are working with CSV files, Power BI Desktop will be working in import mode, so the data is a copy of the file in the data lake, held inside the Power BI Desktop (.pbix) file. When you publish to the service, the file is loaded into the service's (hidden) blob storage, so when you open the report in the service, it basically loads that file, with the data stored inside it, in the background and renders it on the web page.
So you can update the data in the source file, but Power BI Desktop will not get the latest data until you manually refresh it, and the service will not get the latest data until you publish the refreshed Desktop file to the service (or refresh the dataset there).
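If you want to confirm how stale the imported copy is, you can query the dataset's refresh history through the Power BI REST API. Below is a minimal Python sketch, assuming you already have an Azure AD access token with Power BI API permissions and know the workspace (group) and dataset IDs; the placeholder values are illustrative only.

```python
import requests

# Placeholder values -- substitute your own workspace (group) ID, dataset ID,
# and an Azure AD access token that has Power BI API permissions.
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
ACCESS_TOKEN = "<azure-ad-access-token>"

# "Get Refresh History In Group" endpoint; $top=1 returns only the latest refresh.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes?$top=1"
)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

response = requests.get(url, headers=headers)
response.raise_for_status()

for refresh in response.json().get("value", []):
    # status is e.g. "Completed" or "Failed"; endTime shows when the imported
    # copy of the CSV was last replaced in the service.
    print(refresh.get("status"), refresh.get("startTime"), refresh.get("endTime"))
```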
You can of course set up a scheduled refresh of the dataset in the service, without having to refresh/reload the file in Power BI Desktop first.
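Alongside a scheduled refresh, an on-demand refresh can be triggered through the same REST API, which is handy if the CSV in the data lake changes at irregular times. Again a hedged sketch with placeholder IDs and token:

```python
import requests

# Placeholder values as before -- replace with your workspace ID, dataset ID,
# and a valid Azure AD access token.
GROUP_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"
ACCESS_TOKEN = "<azure-ad-access-token>"

# "Refresh Dataset In Group" endpoint: POSTing here queues an on-demand refresh,
# which re-imports the CSV from the data lake into the service's copy of the data.
url = (
    f"https://api.powerbi.com/v1.0/myorg/groups/{GROUP_ID}"
    f"/datasets/{DATASET_ID}/refreshes"
)
headers = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

response = requests.post(url, headers=headers, json={"notifyOption": "NoNotification"})

# A 202 Accepted status means the refresh was queued successfully.
print(response.status_code)
```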
The image below gives some context on how the service works and stores data, with metadata items stored in an Azure SQL database and files on Azure Blob Storage.