Azure Data Storage Integration
In this article, we’ll integrate an Azure Blob Storage data source with Y42.
Azure Blob Storage is massively scalable and secure object storage for cloud-native workloads, archives, data lakes, high-performance computing, and machine learning. Refer to the Blob REST API documentation for more information.
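If you want to explore your storage account before connecting it to Y42, the snippet below is a minimal sketch using the azure-storage-blob Python SDK; the connection string is a placeholder, copy your own from the Azure portal (Storage account > Access keys).

```python
from azure.storage.blob import BlobServiceClient

# Hypothetical connection string -- replace with the one from your storage account.
CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"

service = BlobServiceClient.from_connection_string(CONN_STR)

# List every container in the storage account.
for container in service.list_containers():
    print(container.name)
```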
Overview
Authentication
Import Settings
Select the container from which you would like to import objects.
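As a quick way to preview which objects an import from a given container would pick up, this sketch lists its blobs with the azure-storage-blob SDK; the connection string and container name are placeholders.

```python
from azure.storage.blob import BlobServiceClient

CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"
CONTAINER = "my-container"  # hypothetical container name

container_client = BlobServiceClient.from_connection_string(CONN_STR).get_container_client(CONTAINER)

# Print each blob that an import from this container would see.
for blob in container_client.list_blobs():
    print(blob.name, blob.size)
```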
Schema
There is no predefined schema for this integration. The integration dynamically loads the tables and columns defined in your specific system.
Note: If you are importing a spreadsheet and none of your rows contain numeric values, the column names from the spreadsheet will not be imported as headers into Google BigQuery. However, if any column contains a number, the import will correctly take the first row as column headers. We therefore recommend adding an index column to avoid this issue.
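One way to add such an index column before uploading a spreadsheet is sketched below with pandas; the file names are hypothetical.

```python
import pandas as pd

# Load the spreadsheet, add a numeric index column, and write it back.
df = pd.read_csv("sales.csv")                   # hypothetical input file
df.insert(0, "row_id", range(1, len(df) + 1))   # numeric column ensures header detection
df.to_csv("sales_with_index.csv", index=False)
```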
Updating your data
For this source you can schedule full imports. Every time the source updates, it fully syncs all of your data. Updates can be scheduled monthly, weekly, daily, or even hourly.
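Schedules are configured in Y42 itself; purely for illustration, and assuming nothing about Y42's own scheduler syntax, the supported frequencies correspond to cron expressions like these:

```python
# Generic cron expressions for the supported frequencies (illustrative only).
SCHEDULES = {
    "hourly":  "0 * * * *",   # minute 0 of every hour
    "daily":   "0 6 * * *",   # every day at 06:00
    "weekly":  "0 6 * * 1",   # every Monday at 06:00
    "monthly": "0 6 1 * *",   # the 1st of every month at 06:00
}
```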
Azure Data Storage Setup Guide:
- On the Integrate page, click "Add..." to search for Microsoft Azure Blob Storage and select it.
- Name your integration.
- Sign in with your Connection Key and enter the container name (to verify the key outside Y42 first, see the sketch after this list).
- Once you have connected your Microsoft Azure Blob container, you can start importing your objects.
- Select the tables you need and click Import. You can access the tables once their status is “Ready”.
Note: You can always import and reimport other tables as well, or delete them.
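If the connection fails, you can check the Connection Key and container name outside Y42 with a sketch like the following; the values shown are placeholders.

```python
from azure.storage.blob import BlobServiceClient

CONN_STR = "DefaultEndpointsProtocol=https;AccountName=...;AccountKey=...;EndpointSuffix=core.windows.net"
CONTAINER = "my-container"  # hypothetical container name

container_client = BlobServiceClient.from_connection_string(CONN_STR).get_container_client(CONTAINER)

# Raises an error if the key is invalid or the container does not exist.
props = container_client.get_container_properties()
print(f"Container '{props.name}' is reachable (last modified {props.last_modified}).")
```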