- Overview (2 min)
- Load data from a Microsoft Azure Blob Store (5 min)
- Load data (CSV and Parquet) from an Amazon S3 Bucket (7 min)
- Load data from a GCP Bucket using Spark via the External Table SQL Statement (5 min)
Loading data from a Cloud Data Store into an Actian Data Platform Warehouse using an External Table
This series of short how-to training videos provides step-by-step instructions for loading data from an Amazon S3 bucket, Microsoft Azure Blob Storage, and Google Cloud Storage into an Actian Data Platform warehouse using the 'External Table' feature.
Course Outcome:
The 'External Table' feature lets you read from and write to data sources stored outside the Actian Data Platform. The data source must be one that Apache Spark can read from and write to, such as files in formats like Parquet, ORC, or JSON, or tables in external database systems. This training provides the knowledge you need to implement this functionality for your use case.
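As a sketch of what the course builds toward, an external table is typically defined with a CREATE EXTERNAL TABLE statement that points Spark at a file in cloud storage. The bucket path, table name, columns, and options below are hypothetical placeholders, not values from the course:

```sql
-- Hypothetical sketch: map a CSV file in an S3 bucket to an external table.
-- Bucket name, file path, and column definitions are illustrative only.
CREATE EXTERNAL TABLE sales_ext (
    sale_id   INTEGER,
    region    VARCHAR(32),
    amount    DECIMAL(10,2)
) USING SPARK
WITH REFERENCE='s3a://my-example-bucket/sales/sales.csv',
     FORMAT='csv',
     OPTIONS=('header'='true');

-- Once defined, the external table can be queried like a regular table,
-- for example to copy its rows into a native warehouse table:
-- INSERT INTO sales SELECT * FROM sales_ext;
```

The same pattern applies to Azure Blob Storage and Google Cloud Storage; only the REFERENCE URI scheme and the FORMAT/OPTIONS values change for the storage service and file format in use.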
Course Style:
The course is delivered in a step-by-step fashion so you are equipped with the information you need.
Audience:
For Data Engineers, Developers, and System or Database Administrators who are responsible for maintaining an Actian Data Platform estate.
Prerequisites:
- No knowledge of the Actian Data Platform is required
- Actian Data Platform login credentials are assumed to be available to you
- Access to your Cloud Data Store and the requisite data files within
Supplementary Resources:
Product Documentation:
Actian Community:
Software Download: