
Bhargavkumar Rami

Etobicoke, ON

Summary

Accomplished Data Engineer with 6 years of experience, specializing in data pipeline development and migration to Azure Data Lake. Proficient in Azure Data Factory and SQL database management, I excel at delivering robust ETL solutions and fostering cross-team collaboration to drive data-driven decision-making.

Overview

7 years of professional experience

Work History

Data Engineer

George Brown College
Toronto, Ontario
11.2021 - Current
  • Created ADF pipelines to extract data from different source systems, such as MS SQL and file systems, into Azure Blob Storage, and processed the data by applying business transformations to build a medallion architecture with raw, silver, and gold layers.
  • Installed and configured the Self-Hosted Integration Runtime to connect to on-premises source systems.
  • Hands-on experience migrating MS SQL Server from on-premises to the Azure cloud.
  • Implemented metadata-driven ADF pipelines as reusable components.
  • Designed and implemented ADF pipelines and created triggers to run them using both schedule-based and event-based approaches.
  • Implemented Python notebooks in Azure Databricks to process data across the medallion architecture's raw, silver, and gold layers.
  • Developed and managed enterprise-wide data analytics environments.
  • Used data visualization tools such as Tableau and Power BI to create dashboards and reports for business stakeholders.

Data Engineer

Slalom
Toronto, Ontario
08.2019 - 08.2021
  • Designed and implemented end-to-end ADF pipelines to extract data from heterogeneous systems into a landing zone such as Azure Blob Storage, then processed and transformed it to prepare curated datasets.
  • Designed and implemented ADF pipelines and created triggers to run them using both schedule-based and event-based approaches.
  • Set up the Self-Hosted Integration Runtime on-premises to connect to heterogeneous systems and extract data from on-premises sources to Azure.
  • Designed and implemented end-to-end data pipelines in Azure Synapse Analytics.
  • Migrated MS SQL databases to Azure Data Lake and Azure SQL Database, controlled and granted database access, and migrated on-premises databases to Azure Data Lake Store using Azure Data Factory.
  • Strong understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, cluster and worker nodes, stages, executors, and tasks.
  • Extensive experience with relational database management systems, including normalization, stored procedures, constraints, querying, joins, CTEs, keys, indexes, data import/export, triggers, and cursors.
  • Created and deployed ADF pipelines using ARM templates.
  • Analyzed user requirements, designed and developed ETL processes to load enterprise data into the Data Warehouse.

MSBI Developer

Life Insurance Corporation
Ahmedabad, Gujarat, India
07.2018 - 07.2019
  • Created new tables, stored procedures, and user-defined functions for application developers, and wrote SQL scripts for tuning and scheduling.
  • Developed ETL (Extract, Transform, Load) processes using Data Transformation Services (DTS) to migrate data from legacy systems to SQL Server 2000.
  • Designed the logical and physical database design for the project based on user requirements.
  • Generated server-side T-SQL scripts for data manipulation and validation, and created various snapshots and materialized views for remote instances.
  • Created a table-driven quality assurance process integrated with Replication Administrator that stored and compared function call results across current and prior periods, and generated email notifications with reports as Excel attachments.
  • Responsible for defining the complete Data Warehouse architecture (e.g. ODS, ETL process).
  • Used SSIS transformations such as Conditional Split and Derived Column for data scrubbing, including data validation checks during staging, before loading data into the data warehouse.
  • Designed the high-level ETL architecture for data transfer from source servers to the Enterprise Services Warehouse, applying SSIS transformations to ensure accurate data, and deployed SSIS packages to different environments by maintaining configuration settings in XML files and variables.
  • Administered the created reports by assigning execution permissions to authorized users.

Education

Bachelor of Commerce

Gujarat University
Ahmedabad
05.2018

Skills

  • Data pipeline development
  • Data migration
  • Azure Data Factory
  • SQL database management
  • ETL processes
  • Data analysis
  • Azure Data Lake
  • Azure Synapse
  • Azure Databricks
  • Python programming
  • Azure SQL
  • Power BI
  • MSBI (SSMS, SSIS, SSRS, SSAS)
