
Azure Data Engineer

$65/hr Starting at $100

Data Engineer – Confidential – World-leading shipping company

• Create a Databricks workspace and configure its clusters in Azure to run notebooks.

• Write Python/Scala/SQL code in Databricks notebooks that reads and writes files stored in various formats (CSV, JSON, Parquet, and XML) across different data sources.
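
A minimal notebook sketch of that multi-format reading and writing; the paths and storage account name are hypothetical, `spark` is the session Databricks predefines, and the XML reader assumes the spark-xml library (or a runtime with native XML support) is on the cluster:

```python
# Read the same logical dataset from several formats; paths are hypothetical.
csv_df = (spark.read.format("csv")
          .option("header", "true")
          .option("inferSchema", "true")
          .load("abfss://raw@mystorageacct.dfs.core.windows.net/orders/csv/"))

json_df = spark.read.json("abfss://raw@mystorageacct.dfs.core.windows.net/orders/json/")
parquet_df = spark.read.parquet("abfss://raw@mystorageacct.dfs.core.windows.net/orders/parquet/")

# XML requires the spark-xml library (or native XML support in the runtime).
xml_df = (spark.read.format("xml")
          .option("rowTag", "order")
          .load("abfss://raw@mystorageacct.dfs.core.windows.net/orders/xml/"))

# Land curated output as Parquet for downstream jobs.
csv_df.write.mode("overwrite").parquet(
    "abfss://curated@mystorageacct.dfs.core.windows.net/orders/")
```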

• Create notebooks in Databricks that ingest data from Azure Data Lake Storage into Databricks pipelines that shape and curate the data.

• Prepare advanced SQL queries within notebooks that run ETL/ELT jobs, infer schema changes, modify Delta Live Tables, and monitor data loads.
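
One hedged way to express such a job as a Delta Live Tables pipeline; the table names and ADLS locations are hypothetical, and Auto Loader (`cloudFiles`) supplies the schema inference and evolution mentioned above:

```python
import dlt
from pyspark.sql.functions import col

@dlt.table(comment="Orders ingested incrementally with an inferred, evolving schema")
def orders_bronze():
    # Auto Loader infers the schema, records it at schemaLocation, and
    # evolves the table when new columns arrive in the source.
    return (
        spark.readStream.format("cloudFiles")
        .option("cloudFiles.format", "json")
        .option("cloudFiles.schemaLocation",
                "abfss://meta@mystorageacct.dfs.core.windows.net/schemas/orders/")
        .load("abfss://raw@mystorageacct.dfs.core.windows.net/orders/")
    )

@dlt.table(comment="Cleaned orders ready for downstream loads")
def orders_silver():
    # dlt.read_stream wires table-to-table dependencies inside the pipeline.
    return dlt.read_stream("orders_bronze").where(col("order_id").isNotNull())
```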

• Prepare Azure Data Factory (ADF) pipelines that ingest and move data from Azure Data Lake Storage (ADLS) into a SQL Server database.
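
ADF copy activities themselves are defined in the designer rather than in code, so purely as an illustration, here is the same ADLS-to-SQL-Server movement sketched in a notebook via JDBC; the server, table, and secret-scope names are hypothetical, and `spark`/`dbutils` are predefined in Databricks notebooks:

```python
# Hypothetical server, table, and secret-scope names.
jdbc_url = "jdbc:sqlserver://myserver.database.windows.net:1433;database=staging"

df = spark.read.parquet("abfss://curated@mystorageacct.dfs.core.windows.net/orders/")

(df.write.format("jdbc")
   .option("url", jdbc_url)
   .option("dbtable", "dbo.Orders")
   .option("user", dbutils.secrets.get("etl", "sql-user"))
   .option("password", dbutils.secrets.get("etl", "sql-password"))
   .mode("append")
   .save())
```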

• Prepare a data warehouse star schema using both Type 1 and Type 2 slowly changing dimension (SCD) methods.
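
For the Type 1 case, a minimal Delta Lake MERGE sketch with hypothetical table names: changed attributes are overwritten in place, so no history is kept. (A Type 2 example, in Snowflake, appears at the end of this profile.)

```python
from delta.tables import DeltaTable

# Hypothetical dimension and staged updates; Type 1 overwrites changed
# attributes in place and keeps no history.
dim = DeltaTable.forName(spark, "dw.dim_customer")
updates = spark.table("stg.customer_updates")

(dim.alias("d")
 .merge(updates.alias("s"), "d.customer_id = s.customer_id")
 .whenMatchedUpdate(set={"name": "s.name", "segment": "s.segment"})
 .whenNotMatchedInsertAll()
 .execute())
```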

• Create an Azure Synapse workspace and use serverless SQL pools to query Parquet files stored on ADLS.
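
A hedged sketch of querying those files from Python with pyodbc; the workspace endpoint, credentials, and path are hypothetical, and OPENROWSET is what lets the serverless pool read Parquet directly off ADLS:

```python
import pyodbc

# Endpoint and credentials are hypothetical.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myworkspace-ondemand.sql.azuresynapse.net;"
    "DATABASE=master;UID=sqladmin;PWD=<password>"
)

# OPENROWSET reads Parquet straight from the lake, no loading step needed.
sql = """
SELECT TOP 10 *
FROM OPENROWSET(
    BULK 'https://mystorageacct.dfs.core.windows.net/curated/orders/*.parquet',
    FORMAT = 'PARQUET'
) AS orders
"""
for row in conn.cursor().execute(sql):
    print(row)
```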

Azure Data Engineer – Confidential – Leading American donut company

• Create pipelines with data flow activities to move data from REST APIs to Azure Data Lake based on franchisee locations and HQ data requirements.
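
A minimal sketch of that REST-to-lake landing step; the API endpoint, storage account, key, and franchisee list are all hypothetical:

```python
import json
import requests
from azure.storage.filedatalake import DataLakeServiceClient

# Endpoint, storage account, key, and franchisee list are hypothetical.
api_url = "https://api.example.com/v1/sales"
service = DataLakeServiceClient(
    account_url="https://mystorageacct.dfs.core.windows.net",
    credential="<account-key>",
)
fs = service.get_file_system_client("raw")

for location in ["atlanta", "boston", "chicago"]:
    resp = requests.get(api_url, params={"location": location}, timeout=30)
    resp.raise_for_status()
    # Land each franchisee's payload under its own folder for later pipelines.
    file = fs.get_file_client(f"sales/{location}/extract.json")
    file.upload_data(json.dumps(resp.json()), overwrite=True)
```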

• Build data movement and transformation logic within pipeline activities, applying complex and iterative processing for data ingestion and preparation in both Azure Data Factory and Databricks notebooks.
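
In ADF this iteration is typically driven by a ForEach activity; inside a notebook, the same per-location preparation can be sketched as a loop (paths, metadata table, and column names are hypothetical):

```python
from pyspark.sql.functions import col, to_date

# Hypothetical: one raw folder per franchisee location, prepared uniformly.
locations = [r.location for r in spark.table("meta.franchisees").collect()]

for location in locations:
    raw = spark.read.json(
        f"abfss://raw@mystorageacct.dfs.core.windows.net/sales/{location}/")
    prepared = (raw
                .withColumn("sale_date", to_date(col("sale_ts")))
                .where(col("amount") > 0))
    (prepared.write.mode("overwrite")
     .parquet(f"abfss://curated@mystorageacct.dfs.core.windows.net/sales/{location}/"))
```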

• Monitor data pipeline and activity output to track data across different tables, files, folders, and documents.

• Create conceptual data models of business information, translate them into a logical data model layer, and build the corresponding tables and schemas in Snowflake.
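
A hedged sketch of turning one logical entity into physical Snowflake objects with the Python connector; the account, credentials, and model are hypothetical:

```python
import snowflake.connector

# Account, credentials, and the model itself are hypothetical.
conn = snowflake.connector.connect(
    account="myorg-myaccount", user="etl_user", password="<password>",
    warehouse="ETL_WH", database="DW",
)
cur = conn.cursor()

# Each logical entity becomes a schema-qualified physical table.
cur.execute("CREATE SCHEMA IF NOT EXISTS sales")
cur.execute("""
CREATE TABLE IF NOT EXISTS sales.customer (
    customer_id NUMBER PRIMARY KEY,
    name        VARCHAR,
    segment     VARCHAR,
    loaded_at   TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
)
""")
```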

• Detail-oriented data engineering professional with over 15 years of experience who undertakes complex assignments and consistently delivers customer-focused value.

• Design an integration layer from Azure Data Factory and Azure Data Lake to move transformed data into the Snowflake data warehouse and Azure Synapse Analytics.
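
One possible wiring for the lake-to-Snowflake hop: an external stage over the curated container plus COPY INTO; the stage URL, SAS token, and table names are hypothetical:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="etl_user", password="<password>",
    warehouse="ETL_WH", database="DW",
)
cur = conn.cursor()

# External stage over the curated ADLS container (SAS token hypothetical).
cur.execute("""
CREATE OR REPLACE STAGE sales.adls_curated
  URL = 'azure://mystorageacct.blob.core.windows.net/curated/'
  CREDENTIALS = (AZURE_SAS_TOKEN = '<sas-token>')
""")

# Column-name matching loads Parquet without listing every column.
cur.execute("""
COPY INTO sales.fact_sales
FROM @sales.adls_curated/sales/
FILE_FORMAT = (TYPE = PARQUET)
MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
""")
```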

• Design a star schema data warehouse architecture with fact and dimension tables in Snowflake for consumption by business intelligence tools.
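
A small DDL sketch of such a star schema (all names hypothetical); Snowflake records but does not enforce the foreign-key constraints, so here they mainly document the joins for BI tools:

```python
import snowflake.connector

conn = snowflake.connector.connect(
    account="myorg-myaccount", user="etl_user", password="<password>",
    warehouse="ETL_WH", database="DW",
)
cur = conn.cursor()

cur.execute("""
CREATE TABLE IF NOT EXISTS sales.dim_date (
    date_key  NUMBER PRIMARY KEY,
    full_date DATE,
    year      NUMBER,
    month     NUMBER
)
""")

# Fact rows reference dimension keys; constraints are informational only.
cur.execute("""
CREATE TABLE IF NOT EXISTS sales.fact_sales (
    sale_id      NUMBER PRIMARY KEY,
    date_key     NUMBER REFERENCES sales.dim_date (date_key),
    customer_id  NUMBER REFERENCES sales.customer (customer_id),
    quantity     NUMBER,
    amount       NUMBER(12, 2)
)
""")
```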

• Build a Type 2 Slowly Changing Dimension (SCD) using Snowflake's Stream functionality and automate the process with Snowflake's Task functionality.
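
A hedged sketch of that Stream-plus-Task pattern, assuming a hypothetical staging table stg.customer and a dimension dw.dim_customer carrying valid_from / valid_to / is_current columns. Because a committed DML read advances a stream's offset, the root task snapshots the stream once into a transient table, and chained tasks expire old versions and insert new ones from that shared snapshot:

```python
import snowflake.connector

# Account, credentials, and all table names are hypothetical.
conn = snowflake.connector.connect(
    account="myorg-myaccount", user="etl_user", password="<password>",
    warehouse="ETL_WH", database="DW",
)
cur = conn.cursor()

# The stream records new rows landing in the staging table.
cur.execute("CREATE OR REPLACE STREAM stg.customer_stream ON TABLE stg.customer")

# Root task: consume the stream once so downstream steps share one change set.
cur.execute("""
CREATE OR REPLACE TASK dw.scd2_snapshot
  WAREHOUSE = ETL_WH
  SCHEDULE = '15 MINUTE'
WHEN SYSTEM$STREAM_HAS_DATA('STG.CUSTOMER_STREAM')
AS
CREATE OR REPLACE TRANSIENT TABLE stg.customer_delta AS
  SELECT customer_id, name, segment
    FROM stg.customer_stream
   WHERE METADATA$ACTION = 'INSERT'
""")

# Expire the current version of every changed customer.
cur.execute("""
CREATE OR REPLACE TASK dw.scd2_expire
  WAREHOUSE = ETL_WH
  AFTER dw.scd2_snapshot
AS
UPDATE dw.dim_customer
   SET is_current = FALSE, valid_to = CURRENT_TIMESTAMP()
  FROM stg.customer_delta s
 WHERE dim_customer.customer_id = s.customer_id
   AND dim_customer.is_current
""")

# Insert the new current version of each changed or new customer.
cur.execute("""
CREATE OR REPLACE TASK dw.scd2_insert
  WAREHOUSE = ETL_WH
  AFTER dw.scd2_expire
AS
INSERT INTO dw.dim_customer
    (customer_id, name, segment, valid_from, valid_to, is_current)
SELECT customer_id, name, segment, CURRENT_TIMESTAMP(), NULL, TRUE
  FROM stg.customer_delta
""")

# Tasks are created suspended; resume children before the root.
for task in ("dw.scd2_insert", "dw.scd2_expire", "dw.scd2_snapshot"):
    cur.execute(f"ALTER TASK {task} RESUME")
```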

Skills & Expertise

Data Management, Data Warehouse, Database Administration, Database Design, Database Development, Engineering, Microsoft SQL Server, Modeling, SQL, Stored Procedures, Transact SQL

0 Reviews

This Freelancer has not received any feedback.