How to Load Data from an Amazon S3 Bucket into a Snowflake Data Warehouse
Step 1:
Sign up for a free 30-day Snowflake trial account.
Set your role to ACCOUNTADMIN.
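If you prefer to do this in a worksheet rather than through the UI, the role switch is a single statement:

```sql
use role ACCOUNTADMIN;
```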
Step 2:
A warehouse in Snowflake provides the required resources, such as CPU, memory, and temporary storage, to perform DML operations. Create a warehouse, a database, and a schema (identifiers are written without quotes):
create warehouse Warehouse_Name;
create database database_name;
create schema database_name.Schema_name;
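As a fuller sketch, the same setup with commonly used optional warehouse parameters might look like this; the names `demo_wh`, `demo_db`, and `demo_schema` are placeholders:

```sql
-- Placeholder names; adjust to your environment.
create warehouse if not exists demo_wh
  warehouse_size = 'XSMALL'   -- smallest (cheapest) size
  auto_suspend   = 60         -- suspend after 60 seconds of inactivity
  auto_resume    = true;      -- resume automatically on the next query

create database if not exists demo_db;
create schema if not exists demo_db.demo_schema;

-- Make them the session defaults for the steps that follow.
use warehouse demo_wh;
use schema demo_db.demo_schema;
```

AUTO_SUSPEND and AUTO_RESUME keep the trial credits from draining while the warehouse sits idle.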
Step 3:
Execute the command below to create a storage integration object. This object lets Snowflake connect to your AWS S3 bucket.
create or replace storage integration Snow_OBJ
type = external_stage
storage_provider = s3
enabled = true
storage_aws_role_arn = 'arn:aws:iam::933658582618:role/dbt_Snowflake_Role'
storage_allowed_locations = ('s3://bucky2023/');
Step 4:
The DESCRIBE command below returns the integration's Amazon Resource Names (ARNs). ARNs uniquely identify AWS resources.
desc integration Snow_OBJ;
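From the DESC output, the two properties you typically need are `STORAGE_AWS_IAM_USER_ARN` and `STORAGE_AWS_EXTERNAL_ID`: paste them into the trust relationship of the IAM role on the AWS side. A sketch of that trust policy, with the two values left as placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "<STORAGE_AWS_IAM_USER_ARN from DESC output>" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "<STORAGE_AWS_EXTERNAL_ID from DESC output>" }
      }
    }
  ]
}
```

Until this trust policy is in place, the stage created in Step 6 will fail with an access-denied error.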
Step 5:
Create a file format that describes the staged files: comma-delimited CSV with one header row to skip, and empty fields treated as NULL.
create or replace file format csv_format
type = csv
field_delimiter = ','
skip_header = 1
null_if = ('NULL', 'null')
empty_field_as_null = true;
Step 6:
A stage specifies where data files are stored so that the data in the files can be loaded into a table.
create or replace stage snow_stage_2023
storage_integration = Snow_OBJ
url = 's3://bucky2023/' -- must fall within the integration's storage_allowed_locations
file_format = csv_format;
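Before loading, you can verify that the stage resolves to the bucket and that Snowflake can read the files. This optional sanity check lists the staged files and peeks at the first few CSV columns without loading anything:

```sql
-- List the files visible through the stage.
list @snow_stage_2023;

-- Inspect the first three columns of the staged CSVs.
select $1, $2, $3
from @snow_stage_2023
limit 10;
```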
Step 7:
Create the target table.
create or replace table product
(
Rank_ int,
Name varchar(200),
Platform varchar(20),
Year_ varchar(5),
Month_ varchar(15),
Genre varchar(50),
Publisher varchar(50),
Country varchar(50),
City varchar(50),
State varchar(50),
Region varchar(50),
NA_Sales decimal(10,2),
Global_Sales decimal(10,2),
NA_Profit decimal(10,2)
);
Step 8:
Use the COPY command below to load the data into the table.
copy into product from @snow_stage_2023
ON_ERROR = 'skip_file';
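As an optional dry run before the real load, `VALIDATION_MODE` checks the staged files against the target table without loading any rows:

```sql
-- Reports the errors a load would hit; loads nothing.
copy into product from @snow_stage_2023
validation_mode = 'RETURN_ERRORS';

-- After the real load, confirm the row count.
select count(*) from product;
```

`ON_ERROR = 'skip_file'` in the actual load skips any file containing an error, so a validation pass first helps you spot bad files instead of silently losing them.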