
Christian Bruneau

Longueuil, Québec

Summary

Analytics data developer and consultant with over 6 years of experience helping businesses manage their data effectively. Technical expertise includes Spark, Scala, Python, AWS Cloud, Databricks, and Snowflake. Proven experience developing ETL pipelines, data lakes, and data warehouses, and identifying the best technologies, processes, and strategies for enterprise data management.

Overview

5 years of professional experience

Work History

Big Data Developer (Consultant)

INTACT
07.2022 - Current
  • Contributed as a member of the data engineering pipeline squad dedicated to claims family projects.
  • Delivered the bronze and silver layers of a Medallion architecture by ingesting and cleansing multiple data sources into the data platform, enabling data science and VP reporting use cases.
  • Provided expertise in cloud platform technologies such as Databricks, Delta Lake, Snowflake, and AWS.
  • Improved existing pipelines by making them idempotent and more efficient (some went from 1 hour of compute to 2 minutes, reducing compute costs).
  • Collaborated with other departments to architect and build pipelines that enforce security and data governance for data in the cloud, including masking of sensitive claims data when required.

Big Data Developer (Consultant)

BLUE CROSS
03.2022 - 06.2022
  • Responsible for developing a Blue Cross proof of concept to validate Azure Databricks as the new data lake platform replacing Dremio.
  • Migrated over 10 data pipelines using PySpark, Delta Lake, and Terraform.
  • Demonstrated the efficiency of Delta tables and best practices for Medallion architecture, job idempotence, data quality testing, unit testing, and PySpark code base development.
  • Assisted the Blue Cross analytics team in using and experimenting with Azure Databricks during the proof of concept.

Big Data Developer (Consultant)

VIDEOTRON
09.2019 - 06.2022
  • Designed the architecture and participated in the data lake migration from Cloudera Hadoop to Confluent, Databricks, and Snowflake.
  • Leveraged cloud technologies to save more than $2 million in fixed costs and more than $1 million in variable costs per year.
  • Improved data lake delivery performance by seamlessly switching more than 50 pipelines from batch to near real-time using Spark Streaming, Delta Lake, Kafka, and Terraform on the AWS cloud.
  • Monitored and enforced a 2-minute data delivery ETA for compute-intensive near real-time pipelines.
  • Supported and operated the data lake to deliver quality data for multiple use cases, including Helix support monitoring, VP reporting, marketing, and data science exploration.
  • Filtered, enriched, cleansed, and deduplicated data from pipeline sources sent by the Comcast vendor to deliver the best possible data quality for Videotron departments.

Big Data Developer (Consultant)

NATIONAL BANK
12.2018 - 07.2019
  • Contributed to building a data lake for the HR department, with a focus on preparing data and using a knowledge graph to reason about it.
  • Collaborated with a knowledge graph expert, data scientists, and DevOps to deliver a big data platform solution using Azure Data Lake Storage Gen2, Databricks jobs, and PySpark.
  • Explored Microsoft Graph and Azure technologies to leverage data and documents for NBC employees in order to feed the knowledge graph for reasoning.
  • Architected and developed data pipelines with quality in mind for the HR data lake using a Medallion architecture.
  • Participated in code reviews, testing, and application quality control.

Big Data Developer (Consultant)

CANADIAN NATIONAL RAILWAY
07.2018 - 11.2018
  • Built quality datasets gathered from CN trains to enhance the data lake used by the monitoring application, supporting the backend microservices team.
  • Contributed to the architecture and design of CN's data lake with the big data team.
  • Developed an Apache Storm application to handle real-time data coming from trains under the PTC program in the Cloudera Hadoop ecosystem.
  • Contributed to the development of over 10 Java Spring Boot backend microservices maintained in Bitbucket.
  • Built batch data pipelines for analytics and reporting using Spark (Scala) on Hadoop.

Education

Bachelor of Science - Computer Science

University of Montreal
Montreal
07.2014

Skills

  • Spark
  • Scala
  • Python
  • Java
  • Terraform
  • AWS
  • Databricks
  • Delta Lake
  • Snowflake
  • Hadoop
  • ETL pipeline
  • Data modeling

Timeline

Big Data Developer (Consultant)

INTACT
07.2022 - Current

Big Data Developer (Consultant)

BLUE CROSS
03.2022 - 06.2022

Big Data Developer (Consultant)

VIDEOTRON
09.2019 - 06.2022

Big Data Developer (Consultant)

NATIONAL BANK
12.2018 - 07.2019

Big Data Developer (Consultant)

CANADIAN NATIONAL RAILWAY
07.2018 - 11.2018

Bachelor of Science - Computer Science

University of Montreal