Shekar Reddy M

Summary

  • Over 12 years of experience in the software industry, specializing in Data Warehousing, ETL, and Reporting; involved in business requirements analysis, application design, development, testing, and documentation
  • Proficient in all phases of the Software Development Life Cycle (SDLC) and Agile/Waterfall project processes
  • Strong understanding of dimensional data modeling, including Star and Snowflake schemas, as well as normalization/de-normalization
  • Extensive experience in full life cycle implementation of data warehouses/data marts, covering data analysis, design, development, support, documentation, and implementation, using Informatica PowerCenter, Informatica Intelligent Cloud Services (IICS), Informatica Data Replication (IDR), Informatica Data Quality (IDQ), DataStage, Talend Open Studio, and Power BI (Power Query)
  • Expertise in integrating various data sources with multiple relational databases such as Oracle, SQL Server, Teradata, Salesforce, Snowflake, and DB2
  • Project experience as technical key/lead ETL developer implementing medium to large ETL projects, including data migrations and data integrations
  • Skilled in ETL of legacy data to the data warehouse using Informatica Data Quality (IDQ), Informatica Developer, Informatica Cloud Real Time (ICRT), and IICS
  • Proficient in using the Informatica debugger for error detection in mappings and in troubleshooting existing ETL bugs
  • Worked extensively on IICS Data Integration assets, including Mappings, Mapping Tasks, Task Flows, Data Synchronization tasks, and scripts
  • Built Informatica Cloud REST/SOAP/event-based API processes using different step types for cloud applications such as Salesforce.com and Azure SQL
  • Proficient in creating IICS Service Connector Actions with different binding methodologies and input payload content types, and in implementing Catch faults for error handling
  • Developed various IICS connections such as Azure SQL Database, Teradata, flat file, AWS, SAP BW, and Salesforce
  • Good working knowledge of Teradata architecture, data warehouse concepts, and loading utilities (BTEQ, FLOAD, MLOAD)
  • Experience using Informatica Data Replication (IDR) for real-time data replication from various source systems
  • Skilled in developing, monitoring, extracting, and transforming data using DTS/SSIS, the Import/Export Wizard, and Bulk Insert
  • Proficient in data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ)
  • Applied Change Data Capture (CDC) concepts and used Informatica PowerExchange (PWX) to import source data from legacy systems
  • Integrated Salesforce.com (SFDC) data with Informatica Cloud through the Data Synchronization Task wizard
  • Proficient in processing tasks, scheduling sessions, and monitoring workflows; skilled in writing stored procedures, views, materialized views, and triggers
  • Proficient in writing SQL queries, functions, triggers, exception handling, cursors, database objects, and collections
  • Experienced in report conversion and import/export using the Report Conversion Tool; skilled in reports administration tools such as the Management Console, Import Wizard, Report Conversion Tool, and Configuration Manager
  • Strong analytical and conceptual skills in database design and RDBMS implementation
  • Informatica administration experience, including installations, configurations, folders, users, migrations, and deployments on Informatica PowerCenter (9.x/10.0/10.2 HF2) in Windows and Linux environments
  • Involved in ETL requirements gathering, establishing standard interfaces, data cleansing, developing data load strategies, designing mappings, testing, and post-go-live support
  • Extracted data from various sources such as Oracle, Teradata, Azure SQL Database, SAP, DB2, XML, Netezza, and flat files, and used complex transformations such as Joiner, Expression, Aggregator, and Lookup to load data into target systems
  • Created ETL mappings, mapplets, sessions, and workflows, and handled performance tuning, error handling, Change Data Capture (CDC), and production support
  • Developed workflow dependencies using Event Wait and Command tasks in Informatica
  • Performed data mapping and data masking, and automated Informatica workflow execution for UAT validations using shell scripts
  • Familiar with MicroStrategy, Business Objects, Power BI reporting tools, SAP, Azure SQL Database, Hadoop, Informatica Data Validation Option (DVO), DataStage, and Python scripting
  • Expertise in using Postman and SOAP UI to test REST API connections and GET/POST/PATCH sample data to applications
  • Experience in UNIX shell scripting for file validations, downloads, and workflow executions
  • Proficient in Jira tickets, BMC Remedy incidents, work orders, IMR (Incident Management Record), and CMR (Change Management Record) processes and management
  • Created event-based, file-watching, time-window, and calendar-based job schedules in scheduler tools such as IBM Redwood, Control-M, Autosys, and $Universe to execute Informatica jobs
  • Used Informatica command line utilities such as pmcmd/Cygwin to execute workflows in non-Windows environments (see the sketch following this summary)
  • Solid industry experience in the Finance, Banking, Retail, Insurance, and Food domains
  • Strong ability to meet deadlines, handle pressure, and coordinate multiple tasks in a project environment, with excellent communication and interpersonal skills
  • Led offshore teams, and was involved in process design, code review, and training
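
A minimal sketch of the pmcmd usage mentioned above, wrapped in Python so workflows can be started from validation scripts; the service, domain, folder, workflow, and environment-variable names are placeholders rather than values from any specific project:

import subprocess

def start_workflow(folder: str, workflow: str) -> int:
    """Start an Informatica workflow with pmcmd and wait for it to finish."""
    cmd = [
        "pmcmd", "startworkflow",
        "-sv", "IS_DEV",        # Integration Service name (placeholder)
        "-d", "Domain_DEV",     # Informatica domain (placeholder)
        "-uv", "PMUSER",        # environment variable holding the user name
        "-pv", "PMPASS",        # environment variable holding the password
        "-f", folder,           # repository folder
        "-wait",                # block until the workflow completes
        workflow,
    ]
    return subprocess.run(cmd).returncode

if __name__ == "__main__":
    raise SystemExit(start_workflow("FIN_DW", "wf_load_customer_dim"))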

Overview

13 years of professional experience
1 Certification

Work History

Sr. Application Analyst

Sun Life Financial
01.2023 - Current
  • Worked closely with data modelers, business analysts, and BI developers to understand requirements and develop ETL processes using Informatica PowerCenter and IICS
  • Identifies strengths and weaknesses in existing processes, suggests areas of improvement, and helps enhance existing ETL processes to meet evolving requirements
  • Involved in level 2 and level 3 production support processes for resolution of ETL related issues in production
  • Develops data mappings and verifies mappings and data
  • Ensures issues are identified, tracked, reported on and resolved in a timely manner
  • Provides oversight of other ETL development activities and unit testing all objects and processes, and performs fine-tuning and UAT issue resolution
  • Extensively used Informatica client tools (PowerCenter Designer, Workflow Manager, Workflow Monitor, and Repository Manager) and Informatica Intelligent Cloud Services (IICS)
  • Worked on building different IICS connections like File, Salesforce, AWS S3, Oracle, REST V2, and Swagger
  • Implemented Reusable components like Data validation, Error handling, Reconciliation in IICS API & DI
  • Worked on importing data from AWS S3 to SQL Server using Cloud Data Integration
  • Worked on different IICS API Processes Input formats like FIELDS and WHOLE PAYLOAD
  • Worked on building Service Connector with different Actions Types like General, Abstract and Inherit & different Binding Types like JSON, URL and JSON Wrapped
  • Worked on setting up Ariba ITK connections for Import and Exports in IICS DI
  • Worked on Importing Financial data from SAP Ariba to flat file
  • Created IICS Data Integration Mapping Tasks, Synchronization tasks, Task Flows like Linear, Parallel, and Sequential with Decisions
  • Worked on IICS API lifecycle management (including Activating, Deactivating, and Deleting APIs, Publish API Metadata with Swagger, Ensure API Policy Management)
  • Created IICS Data Integration Hierarchical Schema, Saved Query, Mapplet-PC Import
  • Extensive experience in application integration using Python scripts
  • Wrote pre- and post-session SQL commands to drop and recreate indexes on the data warehouse, and split Excel files and converted them to CSV using Python scripts (see the sketch after this list)
  • Extracted data from various heterogeneous sources like Oracle, Flat Files and performed complex joiner, Expression, Aggregate, Lookup, Stored procedure, Filter, Router transformation, Update strategy transformations to load data into the target systems
  • Converted data from CSV/Excel files to JSON format
  • Tuned mappings and ETL procedures at both the mapping and session level for better performance
  • Identified and fixed bottlenecks and tuned mappings and sessions to improve performance
  • Tuned both the ETL processes and the databases
  • Possesses a strong understanding of application programming, database design, and system design
  • Worked on creating Control-M job schedules for IICS and SAP jobs
  • Worked on importing data from SAP BW on HANA to SQL Server using IICS DI
  • Extensively involved in enhancing and managing UNIX shell scripts, and built scripts to clean up and archive source files
  • Exhibits confidence and an extensive knowledge of emerging industry practices when solving business problems
  • Takes input from supervisor and appropriately and accurately applies comments/feedback
  • Seeks and participates in development opportunities beyond training and identifies critical issues with ease
  • Expertise in creating KT documents and Industry best practice documents
  • Environment: Informatica PowerCenter 10.4/10.5, Snowflake, Informatica Big Data Management, IICS, Oracle 12c, MS SQL Server 2016/2018, PL/SQL Developer, Shell Scripting, Putty, Jira, Confluence, Teams, Workday, Python, Change Management, SharePoint, AWS S3, Glue, Control-M, Autosys, SAP BW, SAP Ariba, CyberArk.
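
A minimal sketch of the Excel-to-CSV splitting mentioned in the bullets above, assuming pandas with the openpyxl engine is available; the workbook name and output directory are hypothetical:

from pathlib import Path
import pandas as pd

def excel_to_csv(xlsx_path: str, out_dir: str) -> list:
    """Write each worksheet of an Excel workbook to its own CSV file."""
    out = Path(out_dir)
    out.mkdir(parents=True, exist_ok=True)
    written = []
    # sheet_name=None loads every sheet into a dict of DataFrames
    sheets = pd.read_excel(xlsx_path, sheet_name=None, engine="openpyxl")
    for sheet, df in sheets.items():
        target = out / (Path(xlsx_path).stem + "_" + sheet + ".csv")
        df.to_csv(target, index=False)
        written.append(target)
    return written

if __name__ == "__main__":
    print(excel_to_csv("finance_extract.xlsx", "staging_csv"))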

ETL Consultant

Alberta Blue Cross (ABC)
02.2021 - 12.2022
  • The purpose of the project is to implement an integration service to transfer the information required by Blue Cross Life (BCL) from Alberta Blue Cross (ABC)
  • Acquired and interpreted business requirements, created technical artifacts, and determined the most efficient and appropriate solution design from an enterprise-wide view
  • Designed and constructed IICS mappings for data extraction from SFDC, Oracle, and SQL Server, followed by loading into Oracle
  • In addition, formulated ETL flows for generating business-specific flat file extracts
  • Worked in the Data Integration team to perform data and application integration, moving data effectively, efficiently, and with high performance to support business-critical projects involving large data extractions
  • Developed Talend jobs using Oracle components, file components, ELT components, SCD components, dataflow components, and more
  • Migrated Talend jobs to Informatica Intelligent Cloud Services (IICS)
  • Involved in building EDW ETL Talend jobs and source-to-target mappings to load data into the data warehouse
  • Used tStatCatcher, tDie, and tLogRow to create a generic joblet that stores processing statistics in a database table to record job history
  • Implemented Type1, Type2, and Incremental load strategies within IICS
  • Performed data manipulations using various Talend components such as tMap, tJavaRow, tJava, tSqlRow, tMSSQLInput, tJDBCInput, tJDBCOutput, and many more
  • Managed configuration tasks such as runtime environment setup, connection creation, scheduling, and mapping design
  • Created Talend jobs to copy the files from one server to another and utilized Talend FTP components
  • Created Implicit, local and global Context variables in the job
  • Created Mappings, Integration Templates, Bundles, and Task flows in IICS
  • Utilized diverse transformations to extract data from distinct file formats and relational sources
  • Responsible for creating the fact, lookup, dimension, staging tables and other database objects like views, stored procedure, function, indexes, and constraints
  • Implemented custom error handling in Talend jobs and worked on different methods of logging
  • Designed and optimized PL/SQL packages, stored procedures, tables, views, indexes, and functions for optimal performance in Oracle
  • Created Python scripts to inject data into the application using GET/PATCH/POST requests (see the sketch after this list)
  • Developed a custom component to process JSON files in a streaming fashion, reading rows one by one, because the standard Talend component could not handle large data files of more than 2 GB
  • Worked with IICS, Talend Studio, SalesForce, Oracle, SQL Server, and GitHub
  • Good knowledge of JSON schemas
  • Automated ETL jobs in the $Universe ($U) scheduler using event-based and calendar-based triggers
  • Environment: Talend Data Integration 7.3.1, Oracle 12c, XML Spy, GitHub, Informatica Intelligent Cloud Services (IICS), Oracle SQL, Putty, WinSCP, Windows 10, MS Office, Notepad++, Python, $Universe scheduler, Confluence, Jira, Teams, Postman, SOAP UI, Skype.
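
A minimal sketch of the GET/PATCH/POST data injection mentioned in the bullets above, using the Python requests library; the endpoint URL, authorization header, and record fields are placeholders, not the application's actual API:

import requests

BASE_URL = "https://api.example.com/v1/members"   # placeholder endpoint
HEADERS = {
    "Authorization": "Bearer <token>",            # placeholder auth token
    "Content-Type": "application/json",
}

def upsert_record(record: dict) -> None:
    """POST the record if it does not exist yet, otherwise PATCH it."""
    url = BASE_URL + "/" + record["id"]
    resp = requests.get(url, headers=HEADERS, timeout=30)
    if resp.status_code == 404:
        requests.post(BASE_URL, json=record, headers=HEADERS, timeout=30).raise_for_status()
    else:
        requests.patch(url, json=record, headers=HEADERS, timeout=30).raise_for_status()

if __name__ == "__main__":
    upsert_record({"id": "1001", "status": "ACTIVE"})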

Sr. ETL/Informatica IICS Cloud Developer

McCain Foods Canada
03.2016 - 01.2021
  • Expert in creating IICS API REST/SOAP/EVENT based processes using various step types (Create, Wait, Subprocess, etc.)
  • Proficient in IICS Protocol format translation, conversion, and API processes for data validation and error handling with XQUERY
  • Worked on building Service Connectors with different Actions Types and Binding Types
  • Implemented XQuery functions and Scripts for structured outputs in IICS API Processes and Service Connectors
  • Experienced in data integration from JSON, SOAP, and Salesforce
  • Designed and configured the API Gateway for security and monitoring of REST/SOAP application and data APIs
  • Created reusable components like Data validation, Error handling, and ABC framework for Reconciliation Process
  • Developed IICS Data Integration Mapping Tasks and Task Flows with various configurations
  • Worked on Dynamic Source, Dynamic Target files, and Auto/Manual Populate target columns in IICS Data Integration Mappings
  • Implemented Pre-Processing/Post-Processing commands, Email Notifications, and Batch Schedule in Mapping tasks
  • Created REST V2 Connections, Swagger Files, and various other connections
  • Worked with Informatica Discovery IQ for generating Dashboard metrics and Reports
  • Utilized Batch Scripts, UNIX shell scripts, and integrated them into IICS DI mapping tasks
  • Developed Hierarchical Schema, Saved Query, Mapplet-PC Import, and Parameter files in Secure Agent
  • Familiar with IICS Application Integration Console Advanced View and Data Integration Monitor
  • Expertise in SOA Certificate, 1-way SSL, and 2-way SSL for IICS API processes
  • Experienced in designing Real-Time, Batch Mechanism and Audit Balancing frameworks
  • Extensive knowledge of ER Modeling, Dimensional Modeling, SQL Server, Teradata, MS Azure SQL, and DB2
  • Skilled in writing UNIX Commands, Shell Scripts, Batch Scripts and Power Shell
  • Proficient in Unit, Regression, Integration, and Volume testing, as well as Development and Performance Tuning
  • Providing application support engagements and on-call (24/7) support for production problem resolution
  • Involved in preparing Low-level, High-level design documents, Release notes and Run book
  • Participated in all phases of SDLC from requirement to support for production environment
  • Involved in analyzing and building a Teradata EDW using Teradata ETL utilities and Informatica
  • Expertise with Connected and Unconnected transformations like Lookup, Aggregator, Expression transformations
  • Proficient in designing reference data and data quality rules using IDQ
  • Worked on data profiling, data cleansing, audit balancing, Address doctor monitoring using IDQ
  • Utilized Informatica IDQ for initial data profiling and duplicate data removal
  • Expertise in implementing CDC, Real-Time, Data maps, Data registration processes with Informatica Power Exchange (PWX)
  • Worked with JMS, MQ Messaging, HTTP, XML, Web Services, Salesforce, Informatica B2B DT (Data Transformation), UDO Transformations
  • Developed mappings/sessions/workflows using Informatica Power Center 10.0/10.2HF2 for data loading
  • Created tasks and workflows in the Workflow Manager and monitored sessions in the Workflow Monitor
  • Used debugging techniques and Informatica debugger tool for mappings
  • Developed automated and scheduled load processes using IBM Redwood scheduler
  • Worked on transformations like Aggregator, Lookup for claim aggregations
  • Interacted with business users and ensured smooth application functioning
  • Involved in a WhereScape tool POC; very good understanding of using it with Teradata
  • Environment: Informatica Cloud (IICS), Microsoft Azure SQL, Informatica 10.0/10.2 HF2, Teradata 15.10/16.10, UNIX, SQL Server 2012/2016, Windows 7/10, IBM Redwood Scheduler, Erwin, Salesforce, SAP, DB2, Power BI, MicroStrategy, SharePoint, Business Objects, DataStage, IDQ, IDR, Bizlink, WhereScape, MS Office 365, Jira

ETL/Informatica Developer

The Co-Operators
06.2015 - 02.2016
  • Data Information & Strategies - Customer Access
  • Responsibilities:
  • Analyzed source data and documented requirements from the business users
  • Prepared mapping/transformation and Informatica session documents conforming to the business rules
  • Worked with business analysts for requirement gathering, business analysis, project coordination and testing
  • Worked on understanding and creating ETL Designs, Technical specifications
  • Extensively used PowerCenter to extract data from source systems such as flat files, COBOL files, Excel files, and XML files into staging tables and load it into the target Oracle database
  • Created mappings in PowerCenter Designer using Aggregator, Expression, Filter, Sequence Generator, Lookup, Update Strategy, Rank, Joiner, and Stored Procedure transformations, as well as Slowly Changing Dimensions (Type 1 and 2)
  • Worked with Connected and Unconnected Lookups and Stored Procedures for pre- and post-load sessions
  • Responsible for managing data coming from different sources such as Hadoop, CSV/flat files, MS SQL Server, and Oracle
  • Loaded and transformed large sets of structured, semi-structured, and unstructured data
  • Developed SSRS reports on SQL Server 2012 and worked on OLAP cubes and architecture
  • Implemented the design on the Visual Studio and TFS platform using SSIS, T-SQL scripts, and C#
  • Developed Automation logics for the existing Manual processes
  • Designed and Developed Pre-session/Post-session routines and batch execution routines using Informatica Server to run sessions
  • Designed SSIS Packages to Extract, Transfer, Load (ETL) existing data into SQL Server from different environments for the SSAS cubes
  • Created and managed OLAP cubes using SSAS
  • Worked extensively on Mappings, Mapplets, Sessions and Workflows
  • Created UNIX Shell scripts to automate the process of generating and consuming the flat files
  • Developed test cases, prepared SQL scripts to test data, tested the sessions and workflows to meet the Unit Test Requirements, and used debugger to fix any invalid mappings
  • Provided support during UAT by working with multiple groups
  • Involved in coordinating the end-to-end test strategy and test plans for unit, UAT, and performance testing, including setting up the environment, code migration, running the test cycles, and resolving issues
  • Environment: Informatica PowerCenter 9.6, Oracle 11g/10g, Microsoft SQL Server, Flat files, COBOL files, Hadoop, Unix, Schedulers, SSIS, PL/SQL, Erwin, TOAD, Windows

ETL/Informatica Developer

Staples Canada Inc
05.2014 - 04.2015
  • Coordinated Joint Application Development (JAD) sessions with business analysts and source developers to perform data analysis and gather business requirements
  • Developed technical specifications of the ETL process flow
  • Designed the Source-Target mappings and involved in designing the Selection Criteria document
  • Worked on design and development of Informatica mappings, workflows to load data into Staging area, Data Warehouse and Data Marts in SQL Server and Oracle
  • Used Informatica PowerCenter to create mappings, sessions, and workflows for populating the data into dimension, fact, and lookup tables simultaneously from different source systems (SQL server, Oracle, Flat files, COBOL)
  • Created mappings using various Transformations like Source Qualifier, Aggregator, Expression, Filter, Router, Joiner, Stored Procedure, Lookup, Update Strategy, Sequence Generator and Normalizer
  • Performed the data profiling and analysis making use of Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ)
  • Deployed reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the development time
  • Used versioned mappings to update slowly changing dimensions, keeping full history in the target database
  • Responsible for building ETL packages using SSIS tasks and T-SQL scripts in Visual Studio and TFS for data analysis/profiling, data cleansing, data conversion, and metadata management
  • Involved in migration of Informatica from 8.x to 9.x
  • Implemented sending of Post-Session Email once data is loaded
  • Created and Monitored Workflows using Workflow Manager and Workflow Monitor
  • Used Debugger to test the mappings and fixed the bugs
  • Tuned the performance of mappings and sessions by optimizing sources, targets, and bottlenecks, and implemented pipeline partitioning
  • Worked with DBA for partitioning and creating indexes on tables used in source qualifier queries
  • Involved in performance/query tuning, including generating and interpreting explain plans and tuning SQL to improve performance
  • Experience in writing expressions in SSRS and expert in fine-tuning the reports
  • Involved in exporting databases and tablespaces using Data Pump (10g) as well as traditional export/import (up to 9i)
  • Knowledge of Cold/Hot backups and RMAN backups
  • Scheduled various daily and monthly ETL loads using Control-M
  • Involved in writing UNIX shell scripts to run and schedule batch jobs
  • Involved in unit testing and documentation of the ETL process
  • Involved in Production Support in resolving issues and bugs
  • Environment: Informatica PowerCenter 9.x/8.x, SQL Server, PL/SQL, Oracle 10g, Toad 8.0, Cognos 8.3, SSIS, ERwin, Windows NT, UNIX Shell Scripting.

Informatica Developer Enterprise

HSBC InvestDirect Limited
09.2010 - 02.2014
  • Extensively used the Informatica 8.x ETL tool for data transfers from legacy systems to the Oracle warehouse
  • Created test cases and assisted in UAT testing
  • Provided quality reviews for the work of other team members when required
  • Worked with other team members to transition application technical knowledge for cross-training
  • Designed and developed complex ETL mappings by making use of transformations like Source Qualifier, Joiner, Update Strategy, Connected Lookup and unconnected Lookup, Rank, Expression, Router, Filter, Aggregator and Sequence Generator transformations
  • Used mapping parameters and variables
  • Prepared mapping specification document, which gives data flow and transformation logic for populating each column in data warehouse table
  • Used debugger to analyze data flow between source and target to fix data issues
  • Implemented audit and reconcile process to ensure Data warehouse is matching with source systems in all reporting perspectives
  • Prepared Standard Operating Procedure (Knowledge Transfer) document, which provides necessary information, required for Maintenance and Operation of application
  • Communicated and coordinated vendor and third-party consultant interactions
  • Identified opportunities to improve processes and cost effectiveness
  • Worked as an on-call production specialist with primary and secondary duties
  • Analyzed and resolved incidents raised by application users on priority (low, medium, and high) through Altiris Enterprise queues
  • Modified and implemented existing applications mappings, workflows, design documents and test cases as per incident raised by users and internal Data issues
  • Coordinated with Technical product administration and Operations team in implementation of production break-fix, production error and production maintenance issues
  • Completed knowledge transfer and application turnover for new production applications as part of application management services
  • Provided data loading, monitoring, System support and worked on data issues raised by end user during its production support phase.

Education

Bachelor of Engineering Technology - Computer Science

JNTU
Hyderabad, INDIA
04.2010

Skills

  • TECHNICAL SKILLS:
  • CLOUD: Informatica Cloud (Data Integration, Application Integration, Monitor, Administration, Discovery IQ, Operational Insights, Application Integration Console), PROS, COUPA, Salesforce, Intelex, Microsoft Azure SQL Database
  • ETL Tools: Informatica PowerCenter 10.4.1/10.5/10.0/10.2 (HF2)/9.6/9.5/9.1/8.x (PowerCenter Designer, Repository Manager, Workflow Manager, Workflow Monitor), SSIS, SSRS, WhereScape, Informatica Data Replication (IDR) 9.7.0, Informatica Data Validation Option (DVO) 9.5.1, DataStage, Bizlink, Talend Open Studio 7.3.1, IDQ 10.2
  • Databases: Oracle 11g/10g/9i/12c, Snowflake, Teradata 13.10/15.10/16.10, DB2, MS SQL Server 2018/2012/2005/2000, Hadoop, MongoDB
  • Languages: SQL, PL/SQL, Unix Shell Script, XQuery
  • Tools: TOAD, WinSCP, Putty, SOAP UI, Postman, XML Spy
  • Operating Systems: Windows XP/7/10, UNIX, LINUX
  • Job Scheduling: AutoSys, Control-M, Informatica Scheduler, IBM Redwood, $Universe
  • Communication: Microsoft Teams, WebEx, Zoom, Cisco Jabber, Workspace, Skype
  • Environment: Informatica PowerCenter 8.6, Oracle 9i, TOAD, Windows XP, PL/SQL, OBIEE, Flat Files, COBOL, MS SQL Server, SQL*Loader, SSIS, UNIX Shell Scripting, Autosys, Erwin, Tidal
  • Data Warehouse Development
  • Application Development
  • Extraction Transformation and Loading (ETL)
  • Warehouse Models
  • Data Transformation
  • Data Extraction
  • Production Support
  • Technical Specifications

Languages

English
Full Professional

Certification

  • Microsoft Certified: Azure Data Engineer Associate (DP-203)

Timeline

Sr. Application Analyst

Sun Life Financial
01.2023 - Current

ETL Consultant

Alberta Blue Cross (ABC)
02.2021 - 12.2022

Sr. ETL/Informatica IICS Cloud Developer

McCain Foods Canada
03.2016 - 01.2021

ETL/Informatica Developer

The Co-Operators
06.2015 - 02.2016

ETL/Informatica Developer

Staples Canada Inc
05.2014 - 04.2015

Informatica Developer Enterprise

HSBC InvestDirect Limited
09.2010 - 02.2014

Bachelor of Engineering Technology - Computer Science

JNTU