You want to pull all of your Connect data together and share it across your sites. Let's say you have 10 different facilities, and they all use L2L Connect to gather real-time data about performance, uptime, and downtime. You want to make the resulting production data available to all sites in a centralized repository, a Data Lake, if you will. You're considering exporting the data, using Power BI, or some combination of the two. Either approach will let you compile and analyze the data.
The following generalizations will help you get started:
- Using the Power BI approach, you will have one connection per Factory ID.
  - There is a connector available for this.
  - The connector hasn't been updated in a while, so it may or may not work correctly at the moment.
- Using the data export option:
  - Manually downloading data from each table.
  - Manually uploading that data to the data lake (an illustrative upload sketch follows this list).
  - This would be done per Factory ID as well.
- Writing custom scripts:
  - There are also API calls that can be made to Connect to automatically grab the required table data.
  - There is additional documentation for grabbing the authorization token necessary for this approach.
  - This is still done per Factory ID, but the custom script can keep a list of Factory IDs with their associated authorization tokens and loop through them to pull the desired data (see the first sketch after this list).
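To make the scripting option concrete, here is a minimal sketch in Python. The base URL, endpoint paths, table names, and parameter names below are placeholders, not the actual Connect API; consult the API documentation for the real endpoints and for how to obtain each factory's authorization token.

```python
import json
from pathlib import Path

import requests

# Placeholder values -- replace with your real Connect API base URL,
# your Factory IDs, and the authorization tokens from the API documentation.
L2L_API_BASE = "https://your-company.example.com/api/1.0"  # assumption
FACTORIES = {
    "1001": "auth-token-for-factory-1001",
    "1002": "auth-token-for-factory-1002",
}
TABLES = ["dispatches", "downtime", "pitch_details"]  # example table names
OUTPUT_DIR = Path("data_lake_staging")


def fetch_table(factory_id: str, token: str, table: str) -> list:
    """Pull one table for one factory. Endpoint and response shape are assumptions."""
    url = f"{L2L_API_BASE}/{table}/"
    params = {"auth": token, "site": factory_id}
    response = requests.get(url, params=params, timeout=60)
    response.raise_for_status()
    return response.json().get("data", [])


def main() -> None:
    OUTPUT_DIR.mkdir(exist_ok=True)
    # Loop through every Factory ID and every table, writing one staged file each.
    for factory_id, token in FACTORIES.items():
        for table in TABLES:
            rows = fetch_table(factory_id, token, table)
            out_file = OUTPUT_DIR / f"{factory_id}_{table}.json"
            out_file.write_text(json.dumps(rows, indent=2))
            print(f"Saved {len(rows)} rows to {out_file}")


if __name__ == "__main__":
    main()
```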
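Once table data has been staged locally, whether downloaded manually or pulled by a script like the one above, it still has to land in the data lake. The sketch below assumes, purely for illustration, an S3-backed data lake and the boto3 SDK; substitute your own storage provider's SDK or CLI, and your own bucket name and folder paths.

```python
from pathlib import Path

import boto3

# Illustrative assumptions: an S3 bucket acting as the data lake and a local
# folder of table files exported from Connect for a single factory.
BUCKET = "my-connect-data-lake"      # placeholder bucket name
FACTORY_ID = "1001"                  # the factory these exports came from
EXPORT_DIR = Path("exports/1001")    # where the downloaded table files live

s3 = boto3.client("s3")
for export_file in EXPORT_DIR.glob("*.csv"):
    # Keep files organized under a per-Factory-ID prefix so every site
    # lands in the same place in the lake.
    key = f"connect/{FACTORY_ID}/{export_file.name}"
    s3.upload_file(str(export_file), BUCKET, key)
    print(f"Uploaded {export_file.name} to s3://{BUCKET}/{key}")
```

Keeping a per-Factory-ID prefix in the object keys makes it straightforward to combine or compare sites later, regardless of whether the files arrived from a manual export or an automated script.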