
Sridevi Kandregula

Seattle, United States

Summary

Analytical and highly knowledgeable Data Engineer with 17+ years of IT experience, proficient in designing and optimizing data pipelines. Goal-oriented, focused on quality, and responsive to emergencies and specific business needs. Skilled in managing updates and new deployments.

Overview

17 years of professional experience

Work History

Data Engineer II

Electronic Arts
Vancouver
04.2021 - Current
  • Created and maintained Airflow data pipelines for business consumers.
  • Migrated ETL pipelines from Jenkins to Airflow, rebuilding them as ELT/ETL pipelines that automate data transfer from sources such as APIs, Snowflake, and AWS S3 into an AWS Redshift database using DBT Core (Data Build Tool); decommissioning the Jenkins server yielded cost savings.
  • Designed and implemented standardized processes for data layers including Staging, Raw, Mart, and Metrics.
  • Standardized load processes for Dimensions, Measures/Facts, and Metrics to support business operations, enabling analysts to build reporting dashboards effortlessly and the business to analyze data more efficiently.
  • Created comprehensive pipeline documentation in Confluence, including detailed flowcharts, to facilitate onboarding and improve process understanding.
  • Developed custom scripts for automated backups, ensuring data integrity and security; automated multiple backfill requests from analysts using the Airflow ELT design, significantly reducing engineering effort.
  • Designed and developed the extraction logic for ETL and ELT pipelines in Python.
  • Extracted data from various sources via API calls using Python's requests library, processed it with Pandas DataFrames, and stored the results as CSV files in the Raw area of an S3 bucket for ETL pipelines.
  • Established robust monitoring to proactively detect issues before they escalate, implementing data quality checks with DBT and data health checks with an LSTM model (TensorFlow) for each data pipeline.
  • Set up Slack alert notifications, built with Python's slack_sdk module, to quickly identify issues and provide remediation for the business.
  • Collaborated with cross-functional teams to gather requirements for new database projects.
  • Evaluated new technologies and tools for integration into existing systems; engineered a real-time customer-support chatbot, "SnowBot", using GenAI (OpenAI), Streamlit, and the Snowflake database.
  • Trained junior team members in database engineering best practices by providing thorough onboarding documentation.
  • Currently building and maintaining an AWS ECS environment, triggered through an Airflow operator, to meet a DataOps requirement of running memory-intensive ETL pipelines.

Data Analyst

Factors group of Companies
Vancouver
06.2019 - 03.2021
  • Migrated SSRS reports from an OLTP system to an OLAP system by creating near-real-time ETL pipelines using SSIS and SQL Server stored procedures.
  • The project involved data analysis, data modeling for dimensions and facts, designing and developing data pipelines, deploying to production, and documenting the tasks.

Lead Software Engineer

Fidelity Investments
Bangalore, India
11.2013 - 03.2019
  • As a Senior ETL and BI Developer at Fidelity Investments, wrote ETL technical documentation for BI reporting requirements and developed ETL pipelines using Oracle stored procedures for OLTP systems and Informatica for OLAP systems.
  • Developed several Business Intelligence reports using OBIEE and provided production support during installs and go-lives.

Senior Associate

Cognizant Technology Solutions
Bangalore, India
03.2007 - 11.2013
  • Worked as an Informatica and PL/SQL developer for banking clients such as CompuCredit and BNYM (Global Class Action) and the retail client McAfee.
  • Developed and maintained ETL data pipelines using Informatica.

Education

Bachelor of Engineering and Technology in Electronics Engineering

JNTU Hyderabad
06.2006

Skills

  • DBT (Data Build Tool)
  • Snowflake
  • Data Warehousing
  • Python
  • Airflow
  • AWS S3
  • Informatica
  • Oracle Database
  • Database Performance Tuning
  • Looker
  • AWS EC2
  • AWS ECS (Beginner level)
  • Data Modeling
  • Real-time Analytics
  • SQL Expertise
  • Data pipeline control
  • Data Quality Assurance
  • Data Migration
  • Data Pipeline Design
  • ETL Design
  • SQL Programming
  • ELT Design
  • Query Optimization

Personal Information

Title: Data Engineer

Timeline

Data Engineer II

Electronic Arts
04.2021 - Current

Data Analyst

Factors group of Companies
06.2019 - 03.2021

Lead Software Engineer

Fidelity Investments
11.2013 - 03.2019

Senior Associate

Cognizant Technology Solutions
03.2007 - 11.2013

Bachelor of Engineering and Technology in Electronics Engineering

JNTU Hyderabad
JNTU Hyderabad