Concerning the dashboard I am building: I have daily price metrics for about 100 company stock tickers, which take up about 100k rows and several columns in an Excel sheet. I have a query sending the table to the data model in Power Pivot, which is then used by a dashboard in a separate workbook.
My question is: am I missing anything that would enhance the storing and/or handling of a large data table? Maybe there isn't even an issue here, but I would welcome any expert advice if you perceive there might be some improvement, or direction towards best practices. I have the PQ, PP & DB courses, so feel free to direct me to the proper lesson if I am forgetting something.
Thank you for your advice,
Michael
Hi Michael,
If your data is in Excel file A and you use Power Query in Excel file B to get the data and load it directly to Power Pivot, then you can't get more efficient than converting Excel file A into a text or CSV file (these are faster for Power Query to read and load), and also making sure you're not importing any data you don't need in your dashboard, e.g. index columns, fields that aren't used, etc.
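For example, a minimal sketch of the query in file B might look something like this (the file path, delimiter and column names are just placeholders for your own; it assumes the CSV has a header row):

let
    // Read the CSV export of file A (path is a placeholder)
    Source = Csv.Document(File.Contents("C:\Data\DailyPrices.csv"), [Delimiter = ",", Encoding = 65001, QuoteStyle = QuoteStyle.Csv]),
    // Use the first row as column headers
    Promoted = Table.PromoteHeaders(Source, [PromoteAllScalars = true]),
    // Keep only the columns the dashboard actually uses (example names)
    Kept = Table.SelectColumns(Promoted, {"Ticker", "Date", "Close", "Volume"}),
    // Set data types so Power Pivot can store them efficiently
    Typed = Table.TransformColumnTypes(Kept, {{"Ticker", type text}, {"Date", type date}, {"Close", type number}, {"Volume", Int64.Type}})
in
    Typed

Then load the query as Connection Only with "Add this data to the Data Model" ticked, rather than loading it to a worksheet first.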
Mynda