- 12+ years of experience in the software industry, specializing in Data Warehousing, ETL, and Reporting.
- Involved in business requirements analysis, application design, development, testing, and documentation; proficient in all phases of the Software Development Life Cycle (SDLC) under both Agile and Waterfall processes.
- Strong understanding of dimensional data modeling, including Star and Snowflake schemas and normalization/de-normalization.
- Extensive experience in full life-cycle implementation of data warehouses and data marts, covering data analysis, design, development, support, documentation, and implementation, using Informatica PowerCenter, Informatica Intelligent Cloud Services (IICS), Informatica Data Replication (IDR), Informatica Data Quality (IDQ), DataStage, Talend Open Studio, and Power BI (Power Query).
- Expertise in integrating various data sources with multiple relational databases, such as Oracle, SQL Server, Teradata, Salesforce, Snowflake, and DB2.
- Project experience as technical key/lead ETL developer implementing large- and medium-scale ETL projects, including data migrations and data integrations.
- Skilled in ETL of legacy data to the data warehouse using Informatica Data Quality (IDQ), Informatica Developer, Informatica Cloud Real Time (ICRT), and Informatica Intelligent Cloud Services (IICS).
- Proficient in using the Informatica Debugger to detect errors in mappings; also skilled in troubleshooting existing ETL bugs.
- Worked extensively on IICS Data Integration assets, including Mappings, Mapping Tasks, Taskflows, Data Synchronization tasks, and scripts.
- Built Informatica Cloud REST/SOAP/event-based API processes using different step types for cloud applications such as Salesforce.com and Azure SQL.
- Proficient in creating IICS Service Connection Actions with different binding methodologies and input payload content types, and in implementing Catch faults for error handling.
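The surrogate-key lookup at the heart of star-schema fact loads (the job a Lookup transformation performs) can be sketched in plain Python; all table, column, and key names below are illustrative, not taken from any specific project:

```python
# Minimal sketch of a star-schema surrogate-key lookup: resolve a fact
# row's natural key against a cached dimension, the way a cached Lookup
# transformation does during a fact-table load. Names are hypothetical.

customer_dim = [
    {"customer_sk": 1, "customer_id": "C100", "name": "Acme"},
    {"customer_sk": 2, "customer_id": "C200", "name": "Globex"},
]

# Build the lookup cache keyed on the natural key.
lookup_cache = {row["customer_id"]: row["customer_sk"] for row in customer_dim}

def resolve_fact_row(src_row, default_sk=-1):
    """Swap the natural key for a surrogate key; default_sk flags an
    unknown or late-arriving dimension member."""
    return {
        "customer_sk": lookup_cache.get(src_row["customer_id"], default_sk),
        "amount": src_row["amount"],
    }

fact_rows = [resolve_fact_row(r) for r in [
    {"customer_id": "C200", "amount": 50.0},
    {"customer_id": "C999", "amount": 10.0},  # no match -> default_sk
]]
```

Routing unmatched keys to a default surrogate key (rather than dropping the row) is one common design choice; the right behavior depends on the project's late-arriving-dimension policy.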
- Developed various IICS connections, including Azure SQL Database, Teradata, flat file, AWS, SAP BW, and Salesforce.
- Good working knowledge of Teradata architecture, data warehouse concepts, and loading utilities (BTEQ, FLOAD, MLOAD).
- Experience using Informatica Data Replication (IDR) for real-time data replication from various source systems.
- Skilled in developing, monitoring, extracting, and transforming data using DTS/SSIS, the Import/Export Wizard, and Bulk Insert.
- Proficient in data profiling and analysis using Informatica Data Explorer (IDE) and Informatica Data Quality (IDQ).
- Applied the Change Data Capture (CDC) concept and used Informatica PowerExchange (PWX) to import source data from legacy systems.
- Integrated Salesforce.com (SFDC) data with Informatica Cloud through the Data Synchronization Task wizard.
- Proficient in processing tasks, scheduling sessions, and monitoring workflows.
- Skilled in writing stored procedures, views, materialized views, and triggers; proficient in writing SQL queries, functions, exception handling, cursors, database objects, and collections.
- Experienced in conversion to import/export APIs using the Report Conversion Tool; skilled in report administration tools such as the Management Console, Import Wizard, Report Conversion Tool, and Configuration Manager.
- Strong analytical and conceptual skills in database design and RDBMS implementation.
- Informatica administration experience, including installations, configurations, folders, users, migrations, and deployments on Informatica PowerCenter (9.x/10.0/10.2 HF2) in Windows and Linux environments.
- Involved in ETL requirements gathering, establishing standard interfaces, data cleansing, developing data load strategies, designing mappings, testing, and post-go-live support.
- Extracted data from various sources, such as Oracle, Teradata, Azure SQL Database, SAP, DB2, XML, Netezza, and flat files.
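The Change Data Capture (CDC) concept mentioned above can be illustrated with a snapshot-diff sketch in Python. Real CDC via PowerExchange reads database change logs rather than comparing extracts; this only shows the insert/update/delete classification idea, with made-up sample rows:

```python
# Hedged sketch of snapshot-based change detection: compare the previous
# and current extracts on a primary key and classify each row as an
# insert, update, or delete. Illustrative only -- log-based CDC tools
# capture changes without full-table comparison.

def diff_snapshots(old_rows, new_rows, key="id"):
    old = {r[key]: r for r in old_rows}
    new = {r[key]: r for r in new_rows}
    inserts = [new[k] for k in new.keys() - old.keys()]
    deletes = [old[k] for k in old.keys() - new.keys()]
    updates = [new[k] for k in new.keys() & old.keys() if new[k] != old[k]]
    return inserts, updates, deletes

prev_extract = [{"id": 1, "val": "a"}, {"id": 2, "val": "b"}]
curr_extract = [{"id": 2, "val": "B"}, {"id": 3, "val": "c"}]
ins, upd, dele = diff_snapshots(prev_extract, curr_extract)
```

The three output lists map naturally onto the insert/update/delete flags a CDC-driven mapping would route to its target.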
- Utilized complex transformations such as Joiner, Expression, Aggregator, and Lookup to load data into target systems.
- Created ETL mappings, mapplets, sessions, and workflows; handled performance tuning, error handling, Change Data Capture (CDC), and production support.
- Developed workflow dependencies using the Event Wait and Command tasks in Informatica.
- Performed data mapping and data masking, and automated Informatica workflow execution for UAT validations using shell scripts.
- Familiar with MicroStrategy, Business Objects, and Power BI reporting tools, as well as SAP, Azure SQL Database, Hadoop, Informatica Data Validation Option (DVO), DataStage, and Python scripting.
- Expertise in using Postman and SOAP UI to test REST API connections and to GET/POST/PATCH sample data to applications.
- Experience in UNIX shell scripting for file validations, downloads, and workflow executions.
- Proficient in Jira tickets, BMC Remedy incidents, work orders, and IMR (Incident Management Record) and CMR (Change Management Record) processes and management.
- Created event-based, file-watching, time-window, and calendar-based job schedules in scheduler tools such as IBM Redwood, Control-M, AutoSys, and $Universe to execute Informatica jobs.
- Used Informatica command-line utilities such as pmcmd (via Cygwin) to execute workflows in non-Windows environments.
- Solid industry experience in the Finance, Banking, Retail, Insurance, and Food Industry domains.
- Strong ability to meet deadlines, handle pressure, and coordinate multiple tasks in a project environment; excellent communication and interpersonal skills.
- Led offshore project teams and was involved in process design, code review, and training.
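The kind of REST call exercised in Postman/SOAP UI can be sketched with Python's standard library; the endpoint URL, resource, and token below are placeholders, and the request is assembled but deliberately not sent:

```python
# Illustrative sketch of assembling a REST PATCH with a JSON body, the
# sort of call tested via Postman/SOAP UI. URL and token are placeholders.
import json
import urllib.request

payload = json.dumps({"Status": "Closed"}).encode("utf-8")

req = urllib.request.Request(
    url="https://example.com/api/v1/tickets/42",  # placeholder endpoint
    data=payload,
    method="PATCH",
    headers={
        "Content-Type": "application/json",
        "Authorization": "Bearer <token>",  # placeholder credential
    },
)
# urllib.request.urlopen(req) would send it; omitted to keep this offline.
```

In a real validation run the response status and body would then be asserted against expected values, which is essentially what a Postman test script does.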