Vasanth Ale

Summary

Results-driven Data Engineer with over 6 years of experience designing and implementing scalable data solutions across diverse industries. Proven expertise in ETL/ELT development using Talend, SSIS, and Azure Data Factory, along with strong knowledge of data warehousing, Big Data ecosystems, and cloud-based data pipelines. Proficient in delivering end-to-end workflows and insights using tools such as Databricks, Power BI, Tableau, and Splunk, with a strong focus on data quality and performance.

Overview

6 years of professional experience
3 Certifications

Work History

Senior Data Engineer

Berkley Canada
11.2023 - 01.2025
  • Company Overview: a Berkley Company
  • Analyzed the current production state of the application and determined the impact of new implementations on existing business processes.
  • Designed and developed ETL solutions to extract source/raw data from Oracle, DB2, Excel and Flat files using SSIS.
  • Migrated legacy ETL workflows from SSIS to Azure Data Factory (ADF) enhancing scalability, monitoring, and cloud integration.
  • Designed and implemented ETL processes to import claims data from TPAs into a database staging environment and then integrate the data into reports and Cubes.
  • Re-engineered 50+ SSIS packages into ADF pipelines, reducing operational overhead and improving execution efficiency by 30%.
  • Performed data modeling and led data cleansing activities using Azure Data Factory (ADF) to normalize data, improve data quality, and eliminate redundant or unused datasets in the cloud environment.
  • Designed and maintained data models in SQL Server Analysis Services (SSAS) improving self-service analytics and visualization capabilities for Power BI and SSRS.
  • Built reports and dashboards using SSRS and Power BI to improve business processes while maintaining and supporting existing reports, dashboards, cubes, and SSIS packages.
  • Used ADF variables with Azure Key Vault to securely manage Databricks tokens and environment settings, ensuring safe and consistent configuration across all stages.
  • Created and maintained documentation for data pipelines, models, and system design, including data sources and ETL processes. Helped teams understand the data flow, supported smooth handovers and improved collaboration across the analytics environment.
  • Developed, implemented and maintained change control / testing processes for migration of database objects across environments.

Senior Integration Specialist (ETL Engineer)

Loblaw Companies Ltd
02.2022 - 10.2023
  • Designed, developed, and maintained ETL (extract, transform, load) processes to ingest structured and unstructured data from diverse sources into a data warehouse.
  • Leveraged Azure Databricks for big data analytics, enabling processing of large-scale datasets and extraction of valuable insights using Apache Spark.
  • Tuned the performance of Spark applications by optimizing batch interval settings, configuring appropriate levels of parallelism, and managing memory effectively.
  • Independently led the migration of Databricks notebooks to Azure Synapse Analytics by rewriting essential Spark functionalities and integrating web activities in Azure Data Factory to streamline and optimize the transition.
  • Developed and executed ETL scripts (BTEQ, MLOAD, TPT, FASTLOAD) to load point-of-sale (POS) sales data from legacy systems into Teradata, integrating with SAP and processing data on Hive-based Big Data platforms.
  • Migrated legacy ETL workflows to Airflow and Databricks, making them easier to monitor with clear DAG visualizations and detailed logging for better transparency and control.
  • Built advanced Power BI reports and Splunk dashboards with KPIs and interactive features like drill-down and drill-through, delivering clear operational insights.
  • Integrated Delta Lake with Azure Databricks to support ACID-compliant transactions, schema evolution, and time travel for reliable audit and traceability.
  • Designed and implemented solutions to decommission legacy servers by handling mitigation, data migration and cleanup of temporary files used in ETL processes.
  • Implemented automated data integrity audit checks using Python scripting and Autosys, detecting discrepancies and missing data across ETL pipelines.
  • Applied ITIL principles of Incident Management, Problem Management, and Change Management.
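The automated integrity-audit idea above can be sketched in plain Python. This is a minimal illustration only: the column and key names are hypothetical, and the real checks ran under Autosys against live ETL pipelines rather than in-memory lists.

```python
def audit_integrity(source_rows, target_rows, key):
    """Compare source and target extracts and report discrepancies.

    source_rows / target_rows: lists of dicts (e.g. rows fetched from
    a staging table and the warehouse); key: the business-key column.
    """
    src_keys = {row[key] for row in source_rows}
    tgt_keys = {row[key] for row in target_rows}
    return {
        "missing_in_target": sorted(src_keys - tgt_keys),     # rows lost in the load
        "unexpected_in_target": sorted(tgt_keys - src_keys),  # rows with no source
        "row_count_delta": len(source_rows) - len(target_rows),
    }

# Hypothetical sample data standing in for two database extracts.
source = [{"order_id": 1}, {"order_id": 2}, {"order_id": 3}]
target = [{"order_id": 1}, {"order_id": 3}]
report = audit_integrity(source, target, "order_id")
# Here order_id 2 never reached the target, so it is flagged as missing.
```

In practice a check like this would be scheduled after each load and would raise an alert (or fail the job) when any of the three fields is non-empty or non-zero.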

ETL Consultant

Artha Solutions
11.2018 - 02.2022
  • Transformed complex SQL scripts into SSIS ETL packages and collaborated with IT to support package implementation from development to production.
  • Integrated Airflow with ADF, PySpark, and Databricks Python notebooks for end-to-end data orchestration.
  • Scheduled and monitored 100+ daily ETL jobs using Airflow, ensuring high availability, error handling, and SLA compliance.
  • Configured static and dynamic memory caches to improve the throughput of sessions containing Rank, Lookup, Joiner, Sorter, and Aggregator transformations.
  • Generated research reports, wealth information and accounts information reports, which were then published within the Power BI server for clients and advisors via ad-hoc reporting.
  • Implemented row-level security in Power BI, using transformations to restrict data access for different users.
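The static-versus-dynamic lookup-cache distinction mentioned above can be illustrated with a small Python sketch. The data is hypothetical; the actual sessions used the ETL tool's built-in session caches, not hand-rolled dictionaries.

```python
class LookupCache:
    """Toy lookup cache: static mode serves a fixed snapshot loaded once;
    dynamic mode also inserts keys it has not seen, mirroring how a
    dynamic cache keeps the lookup in sync with incoming rows."""

    def __init__(self, snapshot, dynamic=False):
        self.cache = dict(snapshot)  # one upfront load instead of per-row queries
        self.dynamic = dynamic

    def lookup(self, key, new_value=None):
        if key in self.cache:
            return self.cache[key]
        if self.dynamic and new_value is not None:
            self.cache[key] = new_value  # dynamic cache learns the new key
            return new_value
        return None  # static cache: a miss stays a miss

# Hypothetical customer-dimension snapshot shared by both caches.
static = LookupCache({"C1": "Alice"}, dynamic=False)
dynamic = LookupCache({"C1": "Alice"}, dynamic=True)

static_miss = static.lookup("C2", "Bob")   # miss: static cache is unchanged
dynamic_hit = dynamic.lookup("C2", "Bob")  # miss: dynamic cache inserts "Bob"
```

The throughput win in either mode comes from replacing a per-row database query with an in-memory lookup; the dynamic variant additionally avoids re-missing on keys that arrive more than once in the same run.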

Data Analyst

Capgemini
06.2016 - 12.2016
  • Location: Hyderabad
  • Designed and deployed executive dashboards in Tableau to deliver critical business insights to senior leadership, enabling data-driven decisions across departments.
  • Collaborated with cross-functional teams to gather requirements, document business needs (BRD) and translate them into actionable insights using Tableau, Excel and SQL.
  • Analyzed credit card transaction data to identify fraud patterns, supporting fraud mitigation strategies that led to a 17% reduction in fraudulent activity.
  • Developed and optimized Excel-based tools with advanced formulas and VBA macros to automate reporting tasks, improving efficiency and reducing manual effort.
  • Collected and analyzed raw data to support the development of new strategies and rules, including support for User Acceptance Testing (UAT).
  • Maintained and improved database views that feed into the enterprise data warehouse and reporting systems to ensure data accuracy and performance.
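The fraud-pattern analysis above was done with SQL, Excel, and Tableau; one common statistical approach it alludes to, flagging transactions whose amount deviates strongly from the mean, can be sketched in a few lines of Python. The threshold and amounts below are hypothetical.

```python
import statistics

def flag_outliers(amounts, z_threshold=2.0):
    """Return indices of transactions whose amount is a z-score outlier."""
    mean = statistics.mean(amounts)
    stdev = statistics.pstdev(amounts)  # population std dev of the batch
    if stdev == 0:
        return []  # all amounts identical: nothing stands out
    return [i for i, amt in enumerate(amounts)
            if abs(amt - mean) / stdev > z_threshold]

# Hypothetical card-transaction amounts; the last one is anomalous.
amounts = [12.5, 40.0, 25.0, 18.0, 33.0, 950.0]
suspicious = flag_outliers(amounts)  # → [5], the 950.0 transaction
```

Real fraud detection combines many such signals (merchant, geography, velocity) rather than amount alone, but a simple z-score pass is a typical first filter before analysts review the flagged rows.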

Education

Post-Graduation - Computer Software & Database Development

Lambton College
01.2018

Bachelor of Technology - Computer Science & Engineering

JNTU Hyderabad
01.2016

Skills

  • Windows
  • Mac OS X
  • Linux
  • SQL
  • PL/SQL
  • Python
  • Spark
  • Shell
  • MySQL
  • MS SQL
  • Teradata
  • MongoDB
  • Oracle
  • Amazon Redshift
  • HDFS
  • MapReduce
  • Hive
  • PySpark
  • Airflow
  • Databricks
  • Power BI
  • Tableau
  • SSRS
  • Splunk
  • Talend DI
  • Talend Big Data
  • SSIS
  • Azure ADF
  • AWS Glue
  • Informatica Data Quality (IDQ)
  • Microsoft Azure
  • Snowflake
  • Talend Cloud
  • Git
  • GitHub
  • SVN

Certifications

  • Talend Data Integration V7 Certified Developer - 21821391
  • Azure Fundamentals (AZ-900) - D0C31305EA6F5C9D
  • Azure Data Engineer Associate (DP-203) - 399B19A3CD12FB2F
  • SnowPro Core (COF-C01)
