• Designed and deployed scalable, secure cloud solutions using Azure architecture.
• Ensured compliance with cloud security best practices, monitored for violations, and responded
to incidents.
• Collaborated with development teams to deploy and manage cloud-based applications.
• Leveraged Spark, Python, and Shell scripting for complex data processing and analysis.
• Successfully completed data migrations and transitions from on-premises to cloud infrastructure.
• Developed reports and visualizations using Power BI and other Microsoft Business Intelligence tools.
• Knowledge of cloud security and compliance best practices.
• Scripted in PowerShell and Python.
• Diagnosed and resolved cloud infrastructure issues.
• Loaded data from SFTP and on-premises sources into Azure SQL Database through Azure Data
Factory, and automated pipeline schedules using event-based triggers in Azure Data Factory (ADF).
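The event-based pipeline scheduling described above can be sketched as an ADF storage-event trigger definition; this is a hedged illustration, and every name, path, and ID below is a placeholder rather than a value from the original work.

```json
{
  "name": "OnNewSftpFileLanded",
  "properties": {
    "type": "BlobEventsTrigger",
    "typeProperties": {
      "blobPathBeginsWith": "/landing/blobs/",
      "blobPathEndsWith": ".csv",
      "ignoreEmptyBlobs": true,
      "scope": "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>",
      "events": ["Microsoft.Storage.BlobCreated"]
    },
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "CopyLandingToAzureSql",
          "type": "PipelineReference"
        }
      }
    ]
  }
}
```

The trigger fires the copy pipeline whenever a matching blob lands in the storage account, which is how a schedule-free, event-driven load from SFTP-staged files can be wired up in ADF.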
• Familiarity with Spark, Azure Databricks, Data Lakes, Data Warehouses, MDM, BI, Dashboards, AI, and ML.
• Developed and tested ETL components to high standards of data quality and performance.
• Developed stored procedures, user-defined functions, views, and T-SQL scripts for complex business
logic.
• Created dynamic configuration packages in SSIS, used event handlers for exception handling, and
automated SSIS package scheduling and deployment.
• Generated ad-hoc reports on wealth, account, and research information for use by clients and
advisors.
• Leveraged Apache Hudi for efficient data versioning and incremental processing.
• Designed and built dimensions and cubes with star schemas using SQL Server Analysis Services
(SSAS).
• Wrote reports using SQL Server Reporting Services (SSRS), including drill-down, drill-through,
parameterized, cascading, conditional, table, matrix, chart, and sub-reports.
• Utilized PySpark for distributed data processing and analysis tasks.
• Worked with Business Analysts, development groups, and Project Managers to analyze business specifications.
• Analyzed user stories to understand the test requirements in each acceptance criterion.
• Designed test cases and test scenarios; conducted functional, ad-hoc, and exploratory testing for a variety of applications, expanding test coverage.
• Developed and maintained Big Data (Hadoop)/ETL migration test planning and strategy, including test script preparation, test execution, and documentation.
• Performed input-field validation through database testing using data tables and flat files, creating both positive and negative test data. Used the AWS (Amazon Web Services) environment for migration testing.
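The positive/negative test-data approach above can be sketched in Python; this is a minimal illustration, and the field name, validation rule, and sample values are all hypothetical rather than taken from the original project.

```python
# Hedged sketch: paired positive/negative test data for an input-field
# validation, serialized to a flat file (CSV) for a test harness.
# The account-ID rule here is illustrative, not from the original work.
import csv
import io
import re

ACCOUNT_ID_PATTERN = re.compile(r"^[A-Z]{2}\d{6}$")  # e.g. "AB123456"

def is_valid_account_id(value: str) -> bool:
    """Validation rule under test: two uppercase letters, then six digits."""
    return bool(ACCOUNT_ID_PATTERN.match(value))

def build_test_rows():
    """Build a data table of positive (valid) and negative (invalid) cases."""
    positive = ["AB123456", "ZZ000001"]                  # should pass
    negative = ["ab123456", "AB12345", "", "12AB3456"]   # should fail
    rows = [("account_id", "expected")]
    rows += [(v, "valid") for v in positive]
    rows += [(v, "invalid") for v in negative]
    return rows

def write_flat_file(rows) -> str:
    """Serialize the data table to flat-file (CSV) text."""
    buf = io.StringIO()
    csv.writer(buf).writerows(rows)
    return buf.getvalue()

if __name__ == "__main__":
    # Run each case through the validator and compare with the expectation.
    for value, expected in build_test_rows()[1:]:
        actual = "valid" if is_valid_account_id(value) else "invalid"
        assert actual == expected, (value, expected, actual)
```

The same pattern scales to any field rule: one list of rows that must pass, one that must fail, and a harness that checks both directions so a regression in either acceptance or rejection is caught.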
• Used CloudWatch and S3 buckets in AWS as required; also ran jobs using the AutoSys scheduler as required.
• Performed ETL testing on Informatica mappings using mapping documents, and created test data as required for testing.
• Tested Big Data (Hadoop) datasets derived from MapReduce jobs.
• Participated in Snowflake environment migration testing from on-premises to the cloud.
• Worked in the Spark computing framework, performed API testing, and validated data in Hadoop HDFS.
• Used HP ALM to store Test Plans, Test Cases, Test Scripts, and for bug tracking.
• Identified software problems, wrote bug reports, and logged them in HP ALM.
• Wrote complex SQL queries to retrieve data for validation from the Oracle database.
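A SQL-based validation of the kind used in ETL testing can be sketched with Python's standard-library sqlite3 standing in for Oracle; the table and column names below are illustrative assumptions, not the original schema.

```python
# Hedged sketch: reconcile a source and target table with SQL, the core
# move of ETL data validation. sqlite3 stands in for Oracle here, and
# src_accounts/tgt_accounts are hypothetical names.
import sqlite3

def reconcile(conn: sqlite3.Connection) -> dict:
    """Compare row counts and a column checksum between source and target."""
    cur = conn.cursor()
    cur.execute("""
        SELECT
            (SELECT COUNT(*) FROM src_accounts) AS src_rows,
            (SELECT COUNT(*) FROM tgt_accounts) AS tgt_rows,
            (SELECT COALESCE(SUM(balance), 0) FROM src_accounts) AS src_sum,
            (SELECT COALESCE(SUM(balance), 0) FROM tgt_accounts) AS tgt_sum
    """)
    src_rows, tgt_rows, src_sum, tgt_sum = cur.fetchone()
    return {
        "rows_match": src_rows == tgt_rows,
        "sums_match": src_sum == tgt_sum,
    }

def demo() -> dict:
    """Load identical sample data into both tables and reconcile them."""
    conn = sqlite3.connect(":memory:")
    conn.executescript("""
        CREATE TABLE src_accounts (id INTEGER, balance REAL);
        CREATE TABLE tgt_accounts (id INTEGER, balance REAL);
        INSERT INTO src_accounts VALUES (1, 100.0), (2, 250.5);
        INSERT INTO tgt_accounts VALUES (1, 100.0), (2, 250.5);
    """)
    return reconcile(conn)
```

Row counts catch dropped or duplicated records, while a column checksum catches silently corrupted values; together they form a cheap first gate before row-by-row comparison.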
• Wrote automation scripts in UFT using standard, image, page, text, and database checkpoints.
• Used data-driven test scripts for the project in UFT.
• Developed automated test scripts in UFT to perform functional, integration, system, and regression testing of the application.
• Developed scripts to perform Regression Testing using VBScript in UFT.
• Contributed to a hybrid automation framework in UFT.
• Created and maintained Shared Object Repositories to enable Unified Functional Testing to identify the various objects in the application.
• Involved in User Acceptance Testing and prepared UAT Test Scripts.
• Assisted the business & marketing team in the execution of UAT by providing data.
• Documented software defects using a bug-tracking system and reported defects to software developers.
• Participated in various meetings and discussed enhancement and modification requests.
• Held regular meetings with technical teams and management, developed proper documentation, and validated the current production environment.
Cloud Computing: Microsoft Azure (Azure Data Factory, Azure Databricks, Azure Synapse, Azure Key
Vault, and Azure DevOps for effective data management and deployment), AWS (EC2, S3, EMR,
Redshift, Glue), Snowflake
BI Tools: Tableau Server, Power BI, OBIEE, SSAS, SSIS, SSRS
Data Engineering: ETL Development, Data Modeling, Data warehousing
Programming Languages: Python, Spark, SQL, PowerShell
Database Management: MySQL, PostgreSQL, MongoDB
Methodologies: Agile (Scrum), Waterfall
Database Optimization: DB and Query optimization, Performance Tuning
Excellent communication and problem-solving skills gained from enterprise experience.