Kishore Raghavendran Shanmugam

Toronto, Canada

Summary

Data Engineer with proven expertise at Ford in optimizing ETL workflows and improving data quality. Skilled in Python and SQL, with a strong record of designing interactive dashboards and automating data pipelines. A collaborative problem-solver who thrives in cross-functional teams, delivering analytics solutions that support informed decision-making.

Overview

5 years of professional experience
1 Certification

Work History

Data Engineer

Ford
Chennai, India
11.2023 - 02.2025
  • Developed Interactive Dashboards & Scorecards – Built data-driven dashboards in Qlik Sense for Ford indirect purchasing data, enabling stakeholders to make informed decisions
  • Optimized Data Pipelines for Reporting – Automated Alteryx workflows to extract, transform, and load (ETL) data from CSV, SQL databases, and APIs, reducing manual data handling and improving turnaround times
  • SQL for Data Analysis – Queried and analyzed large datasets in BigQuery and SQL databases, optimizing data aggregation
  • Integrated API Data Pipelines – Designed Python-based backend solutions to extract data from the Ariba Sync API, apply business logic, and store results in GCP buckets for downstream analysis
  • Data Automation with Alteryx – Designed and implemented automated Alteryx workflows for complex data transformation, cleansing, and integration tasks
  • Leveraged Alteryx's powerful tools to streamline data extraction from multiple sources, including SQL databases and APIs, and efficiently loaded the transformed data into BigQuery for further analysis, significantly improving processing speed and accuracy
  • Developed and deployed Python-based microservices using Flask and FastAPI to automate data extraction from the Ariba Sync API (a representative extraction flow is sketched after this list)
  • Implemented OAuth 2.0 for secure authentication, handled asynchronous data retrieval, and transformed JSON responses using Pandas
  • Integrated processed data into Google Cloud Storage and BigQuery, enabling real-time analytics and reporting
  • Ensured reliability through error handling, logging, and automated CI/CD deployment with Tekton
  • End-to-End Data Automation with Tekton Pipelines – Leveraged Tekton to automate data ingestion workflows, ensuring seamless API-to-GCP data transmission
  • Informatica Cloud Service – Developed and managed job schedules within Informatica Cloud to automate data integration and processing tasks
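
A minimal Python sketch of the extraction flow described in the bullets above, for illustration only: the OAuth 2.0 token endpoint, Ariba Sync API URL, GCS bucket name, and credentials are placeholders, not the systems used at Ford. It requests a client-credentials token, pulls JSON records, flattens them with Pandas, and writes the result to Cloud Storage for downstream BigQuery loads.

"""Illustrative sketch: API extraction to Google Cloud Storage.

All URLs, credentials, and bucket names below are placeholders.
"""
import io

import pandas as pd
import requests
from google.cloud import storage

TOKEN_URL = "https://api.example.com/oauth/token"        # placeholder OAuth 2.0 endpoint
API_URL = "https://api.example.com/ariba/sync/records"   # placeholder Sync API endpoint
BUCKET_NAME = "example-indirect-purchasing"              # placeholder GCS bucket


def get_access_token(client_id: str, client_secret: str) -> str:
    """Request an OAuth 2.0 access token using the client-credentials grant."""
    resp = requests.post(
        TOKEN_URL,
        data={"grant_type": "client_credentials"},
        auth=(client_id, client_secret),
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]


def extract_records(token: str) -> pd.DataFrame:
    """Pull JSON records from the API and flatten them into a DataFrame."""
    resp = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {token}"},
        timeout=60,
    )
    resp.raise_for_status()
    return pd.json_normalize(resp.json().get("records", []))


def load_to_gcs(df: pd.DataFrame, blob_name: str) -> None:
    """Write the DataFrame to a GCS bucket as CSV for downstream BigQuery loads."""
    buffer = io.StringIO()
    df.to_csv(buffer, index=False)
    storage.Client().bucket(BUCKET_NAME).blob(blob_name).upload_from_string(
        buffer.getvalue(), content_type="text/csv"
    )


if __name__ == "__main__":
    token = get_access_token("my-client-id", "my-client-secret")  # placeholder credentials
    records = extract_records(token)
    load_to_gcs(records, "ariba_sync/records.csv")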

Data Analyst, Analytical and Data Platforms

Loblaw
Brampton, Canada
04.2022 - 10.2023
  • Technology Stack: Google Cloud Platform, AXON, BigQuery, Citrix, OpenText, Python
  • Metadata Management & Data Governance in AXON: Designed and implemented Unix shell scripts to automate the extraction, parsing, and transformation of mainframe metadata
  • Citrix & OpenText Administration: Administered Citrix Workspace and OpenText Content Server environments, ensuring stable access to enterprise applications and documentation across business units
  • ETL Pipeline Development in BigQuery: Developed and optimized ETL pipelines in BigQuery, performing fast SQL-based data processing and ensuring low-latency analytics for key business dashboards
  • Cloud Infrastructure Design: Designed and deployed cloud-native infrastructure components on GCP, including Compute Engine, Kubernetes clusters, and secure networking configurations
  • Python API Integrations: Developed Python scripts using requests and pandas to integrate third-party APIs with internal databases, streamlining data collection and reducing manual ingestion tasks (a representative pattern is sketched below)
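
A brief illustrative sketch of the requests/pandas integration pattern noted above; the API endpoint is a placeholder and a local SQLite database stands in for the internal target systems.

"""Illustrative sketch: pull a third-party API and land records in a database.

The endpoint and the SQLite target are placeholders for internal systems.
"""
import sqlite3

import pandas as pd
import requests

API_URL = "https://api.example.com/v1/orders"  # placeholder third-party API


def fetch_orders(api_key: str) -> pd.DataFrame:
    """Call the API and normalize the JSON payload into a DataFrame."""
    resp = requests.get(API_URL, headers={"X-API-Key": api_key}, timeout=30)
    resp.raise_for_status()
    return pd.json_normalize(resp.json())


def load_orders(df: pd.DataFrame, db_path: str = "analytics.db") -> int:
    """Append the records to a local table; returns the number of rows written."""
    with sqlite3.connect(db_path) as conn:
        df.to_sql("orders", conn, if_exists="append", index=False)
    return len(df)


if __name__ == "__main__":
    orders = fetch_orders(api_key="demo-key")  # placeholder key
    print(f"Loaded {load_orders(orders)} rows")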

Business Data Analyst

FB Canada Express
Mississauga, Canada
11.2019 - 03.2022
  • Technology Stack: SQL, Power BI, Excel, Google Sheets
  • Part Cost Excellence Gen 2 Project: Led the end-to-end data lifecycle for a high-impact cost optimization initiative
  • Gathered, cleaned, and aligned large datasets to ensure accuracy in procurement cost analysis
  • Advanced Excel/Google Sheets Expertise: Utilized advanced spreadsheet capabilities (pivot tables, VLOOKUP, conditional formatting, data validation) to process and analyze procurement and operational data efficiently
  • SQL Development & Data Modeling: Developed robust SQL queries and stored procedures to extract and transform large volumes of operational data
  • Applied dimensional modeling and snowflake schema techniques to support analytical reporting
  • Power BI Dashboard Design: Created interactive dashboards in Power BI that provided real-time visibility into cost-saving opportunities, part performance metrics, and operational KPIs
  • Used Power Query and DAX for data shaping, modeling, and high-performance reporting
  • Data Quality & Pipeline Development: Engineered SQL-based ETL pipelines to cleanse and transform inconsistent datasets, ensuring consistency and reliability for business reporting across departments

Education

Full Stack Training Program - Data Engineering

Centre For Artificial Intelligence
Online
07.2022

Master's in Project Management

Fanshawe College
London, ON, Canada
09.2019

Bachelor of Engineering

Anna University
India
04.2018

Skills

  • Languages: SQL, Python, R, PySpark, VBA, and Linux environments
  • Frameworks: Pandas, NumPy, Matplotlib, FastAPI, and Flask
  • Platforms: PyCharm, Jupyter Notebook, Visual Studio Code, and IntelliJ IDEA
  • Data Visualization & BI Tools: Looker, Qlik Sense, Power BI, Tableau, and Spotfire
  • ETL & Data Integration: Alteryx, Informatica (ETL, IDQ, EDC, AXON), and Informatica Intelligent Cloud Services
  • Cloud & Data Platforms: Google Cloud Platform (GCP) – BigQuery, Cloud Storage, Cloud Run, and Kubernetes
  • Version Control & CI/CD Pipelines: GitHub, Tekton (Pipeline Automation), and Git-based deployments
  • Reporting & Collaboration Tools: Jira, Confluence, SharePoint, and ServiceNow
  • Other Technologies: Teradata, SAS, MicroStrategy, and DBT (Data Build Tool)

Certification

CERTIFIED DATA SCIENTIST, Centre for Artificial Intelligence

Accomplishments

  • Worked with Teradata, Google Cloud Platform (BigQuery), and SAS to extract, transform, and analyze large datasets.
  • Production Deployment via Git & CI/CD – Deployed Python scripts to production by leveraging Git-based workflows and CI/CD pipelines.
  • Studied and implemented cloud-based ETL workflows using AWS Glue, Azure Data Factory, and BigQuery.
  • Explored and tested serverless computing models, deploying applications using AWS Lambda and Azure Functions.
  • Proficiency in English (certified IELTS candidate with a band score of 7.5) - Canadian Permanent Resident

Portfolios - LinkedIn

https://www.linkedin.com/in/kishore-r-shanmugam-he-him-589876185/

Timeline

Data Engineer

Ford
11.2023 - 02.2025

Data Analyst, Analytical and Data Platforms

Loblaw
04.2022 - 10.2023

Business Data Analyst

FB Canada Express
11.2019 - 03.2022

Full Stack Training Program - Data Engineering

Centre For Artificial Intelligence

Master's in Project Management

Fanshawe College

Bachelor of Engineering

Anna University