
Ikenna Okolo

Edmonton, Alberta

Summary

Big Data Engineer with more than 10 years of experience in the IT industry. Excellent reputation for resolving problems and improving customer satisfaction.

Background includes data engineering with both batch and streaming processing. Proficient in machine learning, cloud technologies, and the Hadoop ecosystem. Quality-driven and hardworking, with excellent communication and project management skills.

Dedicated IT professional with a history of meeting company goals through consistent and organized practices. Skilled at working under pressure and adapting to new situations and challenges to strengthen the organizational brand.

Overview

  • 10 years of professional experience
  • 1 certification

Work History

Senior Solutions Engineer (Big Data)

Google
Waterloo, Canada
02.2023 - Current
  • Wrote data processing pipelines with the Apache Beam SDKs in Python and Java, covering data extraction, transformation, and loading (ETL); a minimal sketch follows this list.
  • Integrated data from various sources (e.g., databases, APIs, files) and processed it in streaming (real-time or near-real-time) and batch modes on GCP Dataflow.
  • Implemented data validation, cleansing, and deduplication techniques to ensure the quality and integrity of the data.
  • Monitored pipeline performance, troubleshot issues (e.g., data loss, slow processing), and implemented fixes to keep the system reliable.
  • Lead the Dataflow and Pub/Sub teams.
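
The sketch below shows the general shape of such a Beam batch ETL pipeline on Dataflow: read, parse, deduplicate by key, and write to BigQuery. It is illustrative only; the bucket, project, table, and field names (example-bucket, example-project, analytics.orders, order_id) are hypothetical placeholders, not the production code.

```python
# Minimal Beam ETL sketch with placeholder names; not the actual pipeline.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions


def parse_and_clean(line):
    """Parse a JSON record and keep only the fields the warehouse needs."""
    record = json.loads(line)
    return {
        "order_id": record["order_id"],
        "amount": float(record.get("amount", 0)),
        "country": record.get("country", "unknown"),
    }


def run():
    options = PipelineOptions(
        runner="DataflowRunner",        # swap for DirectRunner to test locally
        project="example-project",      # placeholder project id
        region="us-central1",
        temp_location="gs://example-bucket/tmp",
    )
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/orders/*.json")
            | "Parse" >> beam.Map(parse_and_clean)
            | "KeyById" >> beam.Map(lambda r: (r["order_id"], r))
            | "GroupById" >> beam.GroupByKey()
            | "DedupeTakeFirst" >> beam.Map(lambda kv: next(iter(kv[1])))
            | "Write" >> beam.io.WriteToBigQuery(
                "example-project:analytics.orders",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )


if __name__ == "__main__":
    run()
```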

Solutions Engineer (Big Data)

Google
Warsaw, Poland
09.2021 - 01.2023
  • Designed ETL jobs that ingest large volumes of data, apply complex transformations with the Apache Beam SDKs, and store the processed output in BigQuery.
  • Integrated with various data sources and sinks such as databases, file systems, and cloud storage (e.g., Google Cloud Storage, AWS S3, BigQuery).
  • Ensured pipelines were optimized for performance and cost efficiency.
  • Implemented robust error handling and retry mechanisms for pipeline failures; a dead-letter sketch follows this list.
  • Ensured comprehensive pipeline logging and debugged issues as they arose.
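
As a hedged illustration of the error-handling point above, the sketch below routes records that fail to parse into a separate dead-letter BigQuery table using Beam's tagged outputs, so one bad record does not fail the whole job. The bucket and table names are assumptions, not the actual pipeline.

```python
# Dead-letter pattern sketch with placeholder names; not the actual pipeline.
import json

import apache_beam as beam
from apache_beam import pvalue


class ParseRecord(beam.DoFn):
    DEAD_LETTER = "dead_letter"

    def process(self, line):
        try:
            yield json.loads(line)
        except ValueError as err:
            # Route the raw line plus the error message to the dead-letter output.
            yield pvalue.TaggedOutput(self.DEAD_LETTER, {"raw": line, "error": str(err)})


def run():
    with beam.Pipeline() as p:
        results = (
            p
            | "Read" >> beam.io.ReadFromText("gs://example-bucket/events/*.json")
            | "Parse" >> beam.ParDo(ParseRecord()).with_outputs(
                ParseRecord.DEAD_LETTER, main="parsed"
            )
        )
        results.parsed | "WriteGood" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )
        results.dead_letter | "WriteBad" >> beam.io.WriteToBigQuery(
            "example-project:analytics.events_dead_letter",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
        )


if __name__ == "__main__":
    run()
```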

Lead Data Engineer

Tradedepot
01.2021 - 09.2021
  • Developed, implemented, supported and maintained data analytics protocols, standards and documentation.
  • Integrated more than five data sources that stored the transaction data.
  • Extracted and loaded the data into the Redshift data warehouse; a loading sketch follows this list.
  • Designed and created ETL code to support transitions from one data warehouse to another.
  • Designed integration tools to combine data from multiple, varied sources such as RDBMS, SQL, and big data installations.
  • Managed data quality issues during ETL processes, directing qualitative failures to team lead for amelioration.
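
Loading a Redshift warehouse from external sources typically follows the stage-to-S3-then-COPY pattern; the sketch below shows that pattern under assumed bucket, cluster, table, and IAM role names. It is a generic illustration, not the actual Tradedepot code.

```python
# Stage-to-S3-then-COPY sketch with placeholder names; not the actual code.
import csv
import io
import os

import boto3
import psycopg2


def load_to_redshift(rows, s3_bucket="example-staging-bucket",
                     s3_key="transactions/batch_0001.csv"):
    # 1. Stage the extracted rows as a CSV object in S3.
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    boto3.client("s3").put_object(
        Bucket=s3_bucket, Key=s3_key, Body=buf.getvalue().encode("utf-8")
    )

    # 2. Ask Redshift to bulk-load the staged file with COPY.
    conn = psycopg2.connect(
        host="example-cluster.redshift.amazonaws.com",  # placeholder endpoint
        port=5439,
        dbname="analytics",
        user="etl_user",
        password=os.environ.get("REDSHIFT_PASSWORD", ""),
    )
    with conn, conn.cursor() as cur:
        cur.execute(
            f"""
            COPY analytics.transactions
            FROM 's3://{s3_bucket}/{s3_key}'
            IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy'
            FORMAT AS CSV;
            """
        )
    conn.close()
```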

Lead Data Engineer

Betking
London, UK
07.2020 - 01.2021
  • Provided comprehensive analysis and recommended solutions for complex business problems, using data from internal and external sources and applying advanced analytical methods to assess the factors affecting growth and profitability across product and service offerings.
  • Managed a team of eight data engineers.
  • Worked with the global Betking team to integrate data from the various countries in which the company operates.
  • Led projects and analyzed data to identify opportunities for improvement.
  • Explained data results clearly and discussed how they could be used to support project objectives.
  • Improved data collection methods by designing surveys, polls and other instruments.

Data Engineer

Betking
08.2019 - 05.2020
  • Set up the organization's data warehouse by building robust ETL pipelines on the Azure stack.
  • Integrated more than 10 structured and unstructured data sources.
  • Designed more than 50 data marts that met business goals.
  • Collaborated with the BI team to produce customized ETL solutions for specific goals.
  • Collaborated with business analysts to understand and define requirements.
  • Validated warehouse data structure and accuracy.
  • Cooperated fully with product owners and enterprise architects to understand requirements.

Data Engineer

Sterling Bank
07.2018 - 08.2019
  • Performed large-scale data conversions, transferring high volumes of data into standardized structured and unstructured formats for integration into the data warehouse and data lake.
  • Collaborated with team on ETL (Extract, Transform, Load) tasks, maintaining data integrity and verifying pipeline stability.
  • Interpreted data models for conversion into ETL diagrams and code.
  • Managed data quality issues during ETL processes, directing qualitative failures to team lead for amelioration.
  • Prepared functional and technical documentation data for warehouses.
  • Carried out day-to-day duties accurately and efficiently.

BI Developer

Sterling Bank
01.2018 - 07.2018
  • Wrote and optimized SQL statements to assist business intelligence practices.
  • Assessed platform statistics to create customized, data-intensive BI analytics.
  • Developed intelligence-sharing dashboards, providing company-wide access to collected data.
  • Handled massive quantities of business intelligence data, personally managing as much as 5 terabytes.
  • Collected data from wide-ranging sources.
  • Developed and managed dashboards using the Azure stack.

Application Support Engineer

Sterling Bank
02.2017 - 12.2017
  • Provided support for Sterling Bank's internet banking services and all mobile banking platforms.
  • Ensured that the servers and other resources the applications ran on were properly maintained and updated.
  • Recommended changes and areas of improvement to the business for better customer satisfaction.
  • Increased the customer onboarding rate on the bank's e-platforms by over 100%, driven by the improved reliability of the e-channels platform.

Data Analyst

Crisp TV
11.2014 - 01.2017
  • Developed ETL packages for data loading from various sources, such as flat files, XML files, and Oracle databases, into the Azure Data Warehouse using SSIS.
  • Configured error handling for all SSIS packages by establishing connection managers, event handlers and logging to track package execution status.
  • Created complex SSIS packages with multiple transformations such as derived columns, lookups, conditional splits, script tasks and execute SQL tasks.
  • Designed and built efficient data models in Power BI Desktop, defining relationships between tables, and optimizing for performance.
  • Created compelling and informative visuals (charts, graphs, maps, tables) using Power BI's visualization tools.
  • Shared reports and dashboards with stakeholders, setting up access permissions, and facilitating collaboration.
  • Worked with stakeholders to understand their reporting needs and translate them into technical specifications.
  • Performed exploratory data analysis to identify trends, patterns, and insights.

Education

Master of Science - Applied MSc in Data Engineering for AI

DataScience Tech Institute
Paris, France
03-2024

BEng - Electronics and Computer Engineering (2.1)

University of Technology
10-2014

Skills

  • Cloud Technologies: AWS, GCP, Azure
  • NoSQL Databases: MongoDB, Cassandra, Hive, Redis, Bigtable
  • Programming Languages: Python, Java, Scala, SQL
  • Big Data Technologies: Hadoop, Spark, Apache Beam, Apache Airflow
  • DevOps Engineering: DevOps, CI/CD, Terraform, Kubernetes, Microservices
  • Storage and Pipelines: ETL, Data Warehousing, Batch Processing, SSIS, Data Fusion, Kinesis, Dataflow, ADF
  • SQL Databases: MS SQL Server, Oracle, MySQL, PostgreSQL, Redshift, BigQuery, Azure Synapse, Cloud SQL

Certification

  • Microsoft Technology Associate, Database Fundamentals
  • Querying Microsoft SQL Server 2012/2014
  • Developing Microsoft SQL Databases
  • Implementing a Microsoft Data Warehouse
  • Querying Data with Transact-SQL
  • Microsoft Certified Solutions Associate, Database Development - Certified 2019
  • Microsoft Certified Solutions Expert, Data Management and Analytics - Certified 2019
  • Simplilearn, Big Data for Data Engineering.
  • Simplilearn, Data Engineering with Hadoop
  • Associate Cloud Engineer, Google Cloud - Certified 2022
  • Google Cloud Professional Data Engineer - Certified 2022
  • Google Cloud Professional Cloud Architect - Certified 2022
  • AWS Certified Cloud Practitioner - Certified 2022
  • AWS Certified Solutions Architect - Certified 2022
  • AWS Certified Developer - Certified 2023
  • AWS Certified DevOps Professional - Certified 2023
  • AWS Certified Database Specialty - Certified 2023
  • Microsoft Certified: Azure Data Fundamentals - Certified 2023
  • GCP Certified Cloud DevOps Engineer - Certified 2023
  • GCP Certified Cloud Database Engineer - Certified 2023
  • GCP Certified Cloud Digital Leader - Certified 2023
  • Azure Certified Data Fundamentals - Certified 2024
  • Azure Certified Fundamentals - Certified 2024
  • Neo4j Certified Professional - Certified 2024
  • Databricks Certified Data Engineer - Certified 2024

Accomplishments

  • In 2022, my performance earned a Transformative Impact rating in my annual appraisal. This distinguished rating, akin to a 5-star accolade, is granted to fewer than 1% of Googlers in recognition of outstanding contributions and exceptional performance throughout the calendar year.
  • Led the migration of Sterling Bank's Data Architecture from On-premise to the Azure Cloud, utilizing SSIS and ADF (Azure Data Factory). Implemented automation of Sterling Bank's ETL pipeline, enhancing resilience in case of failures.
  • Architected and executed BETKING's Data Engineering roadmap, establishing the Data Warehouse and constructing ETL Pipelines from the ground up using ADF, SSIS, and Python scripts. Implemented triggers, metrics, and alerts to ensure data consistency, bolstering business confidence in the Data Team.
  • Successfully integrated Tradedepot's CRM platforms, including Freshdesk, Intercom, and Odoo, into the Data Warehouse (AWS Redshift) using a combination of Python scripts, ADF, Stitch Data, and Dataflow.
  • Designed the underlying logic empowering Machine Learning models for Loan Approval.
