Yashwanth Paka

Summary

  • Overall 8 years of experience in manual testing of web-based, client/server, and data warehousing applications, and in developing ADF pipelines to load varied data into ADLS (Big Data).
  • Created effective Power BI reports and dashboards after gathering and translating end-user requirements.
  • Experience using databases such as SQL Server and Oracle: tables, stored procedures, functions, and triggers.
  • Knowledge of Data Warehouse (SSAS) Tabular and Multidimensional relational models.
  • Hands-on experience using DTS/SSIS Import/Export Data, Bulk Insert, BCP, and DTS/SSIS packages.
  • Well-versed with all stages of the Software Development Life Cycle (SDLC) and Software Testing Life Cycle (STLC).
  • Professional experience in Integration, Functional, Regression, System Testing, Load Testing, UAT Testing, Black Box, and GUI testing.
  • Experience with web-based applications such as online banking and transaction processing applications.
  • Experience in SQL scripting.
  • Reviewed and analyzed mapping rules to test the functionality of the Ab-Initio graphs.
  • Sound Knowledge and experience in Metadata and Star schema/Snowflake schema.
  • Analyzed source systems, the staging area, and fact and dimension tables in the target data warehouse.
  • Tested the ETL Informatica mappings and other ETL Processes (Data Warehouse Testing).

Overview

9 years of professional experience

Work History

Consultant

Citibank
04.2023 - Current

• Designed and deployed scalable, secure cloud solutions using Azure architecture.
• Ensured compliance with cloud security and best practices, monitoring for violations and responding
to incidents.
• Collaborated with development teams to deploy and manage cloud-based applications.
• Leveraged Spark, Python, and Shell scripting for complex data processing and analysis.
• Successfully completed data migrations and transitions from on-premises to cloud infrastructure.
• Developed reports and visualizations using Power BI and other Microsoft Business Intelligence tools.
• Scripted automation in PowerShell and Python.
• Troubleshot and resolved cloud infrastructure issues.
• Loaded data from SFTP/on-premises sources through Azure Data Factory into Azure SQL Database and
automated pipeline schedules using event-based triggers in Azure Data Factory (ADF).
• Familiarity with Spark, Azure Databricks, Data Lakes, Data Warehouses, MDM, BI, Dashboards, AI, ML
• Developed and tested ETL components to high standards of data quality and performance.
• Developed Stored Procedures, User Defined Functions, Views, T-SQL Scripting for complex business
logic.
• Created dynamic configuration packages in SSIS, used event handlers for exception handling, and
implemented job scheduling and automation for SSIS packages.

• Generated reports on wealth information, account information, and research reports using ad-hoc
reporting to be used by the clients and advisors.
• Leveraged Apache Hudi for efficient data versioning and incremental processing.
• Experience in designing and building dimensions and cubes with star schemas using SQL Server
Analysis Services.
• Experience in report writing using SQL Server Reporting Services (SSRS) and creating various types of
reports like drill down, drill through, Parameterized, Cascading, Conditional, Table, Matrix, Chart, and
Sub Reports.
• Utilized PySpark for distributed data processing and analysis tasks.

Developer

Capgemini Technology Services India Limited
05.2022 - 02.2023
  • Provided business solutions to various Unilever clients
  • Analyzed large-scale input files, applied various transformation logic using Azure technologies, and loaded the data into Azure Data Lake
  • Delivered a single consolidated report to end users covering various Unilever products and product- and pack-level information
  • Tools: Azure Data Factory (ADF)
  • Client: Unilever
  • Created scalable microservices using Docker and Kubernetes, enhancing system reliability.
  • Utilized Jenkins to automate code analysis, improving continuous integration processes.
  • Managed and optimized data processing workflows on AWS EMR clusters.
  • Integrated PySpark applications with AWS EMR for scalable big data solutions.
  • Experience with AWS Glue for ETL jobs and data cataloging.
  • Guided team through Git workflows and CI/CD pipelines using GitHub and Jenkins, ensuring seamless integration.
  • Refined existing codebase, reducing load times and improving application performance.
  • Managed client relationships through regular check-ins and updates on project progress.
  • Enhanced communication among team members to foster collaborative and supportive work environment.
  • Collaborated with cross-functional teams to successfully deliver comprehensive solutions for clients.

Associate Software Engineer

Wells Fargo International Solution Private Limited
09.2018 - 04.2022

• Worked with Business Analysts, Development Groups and Project manager in analyzing Business Specifications.

• Analyzed the Stories to understand the Test Requirements in each Acceptance Criteria.

• Designed Test Cases and Test Scenarios, conducted functional, ad-hoc and exploratory testing for the variety of applications with expanded test coverage.

• Developed and Maintained Big Data (Hadoop)/ETL Migration Test Planning & Strategy, test script preparation, test execution & documentation.

• Performed input-field validations through database testing using data tables and flat files; created both positive and negative test data. Used the AWS (Amazon Web Services) environment for migration testing.

• Used CloudWatch and S3 buckets in AWS as required, and ran jobs using the AutoSys scheduler.

• Performed ETL testing on Informatica mappings using mapping documents and created test data as required for testing.

• Tested Big Data (Hadoop) datasets derived from MapReduce jobs.

• Participated in Snowflake environment migration testing from Non-Cloud to Cloud.

• Worked in the Spark computing framework, performed API testing, and validated data in Hadoop HDFS.

• Used HP ALM to store Test Plans, Test Cases, Test Scripts, and for bug tracking.

• Identified software problems, wrote bug reports, and logged them into the bug tracking tool in HP ALM.

• Used complex SQL queries to retrieve data for validation in the Oracle database.

• Wrote automation scripts using various checkpoints like standard, image, page, text, and database Checkpoints in UFT.

• Used Data Driven Test Scripts for the project using UFT.

• Developed automated Test Scripts to perform Functional Testing, Integration Testing, System Testing, and Regression testing of the application using UFT.

• Developed scripts to perform Regression Testing using VBScript in UFT.

• Involved in Hybrid Automation Framework using UFT.

• Created and maintained Shared Object Repositories to enable Unified Functional Testing to identify the various objects in the application.

• Involved in User Acceptance Testing and prepared UAT Test Scripts.

• Assisted the business & marketing team in the execution of UAT by providing data.

• Documented software defects using a bug-tracking system and reported them to software developers.

• Participated in various meetings to discuss enhancements and modification requests.

• Maintained meetings with the technical teams and management, developed proper documentation, and validated current production environment.

Test Analyst

Wipro Limited
05.2016 - 05.2018
  • Involved in studying/understanding the architecture of the system from the Performance testing perspective
  • Prepared Test Approach and Test Plan with Agile methodology
  • Involved in understanding Non-Functional Requirements and preparing the Workload model
  • Created LR scripts using Web (HTTP/HTML) and Web services Protocol
  • Involved in All major Enhancements like Correlation, Parameterization, and error checking
  • Reviewed test scripts for various integrated features in standalone mode
  • Creation/verification of test data
  • Executed Load and Soak tests, analyzed the results, and produced performance test reports
  • Involved in a walkthrough of the test runs along with the team
  • Analyzed results from server monitoring and produced performance test reports
  • Active participation in suggesting the tuning areas and retesting after tuning
  • Analyzed and compiled test results and generated performance metrics reports
  • Experienced in functional, integration, database, and regression testing
  • Expertise in preparing Test Strategies, Test Plans, Test Summary Reports, Test Cases, and Test Scripts for Automated and manual testing

Education

GED

JNTUH
Hyderabad

Skills

    Cloud Computing: Microsoft Azure (Azure Data Factory, Azure Databricks, Azure Synapse, Azure Key
    Vault, and Azure DevOps for effective data management and deployment), AWS (EC2, S3, EMR,
    Redshift, Glue, Snowflake)
    BI Tools: Tableau Server, Power BI, OBIEE, SSAS, SSIS, SSRS
    Data Engineering: ETL Development, Data Modeling, Data warehousing
    Programming Languages: Python, Spark, SQL, PowerShell
    Database Management: MySQL, PostgreSQL, MongoDB
    Methodologies: Agile (Scrum), Waterfall
    Database Optimization: DB and Query optimization, Performance Tuning
    Soft Skills: Excellent communication and problem-solving skills gained in enterprise environments

Timeline

Consultant

Citibank
04.2023 - Current

Developer

Capgemini Technology Services India Limited
05.2022 - 02.2023

Associate Software Engineer

Wells Fargo International Solution Private Limited
09.2018 - 04.2022

Test Analyst

Wipro Limited
05.2016 - 05.2018

GED

JNTUH