Progressive and innovative software professional with over 6 years of experience as a DevOps Engineer, covering application configuration, code compilation, packaging, building, automation, and managing and releasing code from one environment to another.
Experience focusing on data warehousing, data modeling, data integration, data migration, ETL processes, and business intelligence.
Experience installing, configuring, and maintaining Jenkins for continuous integration and delivery (CI/CD), end-to-end automation of all builds and deployments, and creating Jenkins pipelines.
Experience in branching, merging, and maintaining versions using tools like Git and GitHub on Windows and Linux platforms.
Experience in creating Docker containers and Docker consoles for managing the application life cycle; orchestrated Docker container clusters using Kubernetes.
Proficient with various development, testing, and deployment tools – Git, Jenkins, Docker, Chef, Ansible – with a good understanding of key DevOps concepts.
Experience with project management and issue-tracking tools such as JIRA. Excellent knowledge of quality assurance process methodologies and the Software Development Life Cycle (SDLC).
Hands-on experience in developing scalable enterprise applications tailored to user needs, serving the Finance, Telecom, Insurance, and Sales domains.
Self-starter with strong interpersonal and excellent communication skills, quick to learn and adapt to new technology, and able to work individually as well as in a collaborative team environment.
Good written and verbal communication skills, strong organizational and analytical skills; a hard-working team player, well-practiced in answering business team queries.
Overview
17 years of professional experience
1 certification
Work History
Sr. DevOps Engineer
Sapiens
02.2018 - Current
Design, build, and maintain scalable and reliable infrastructure using tools like Kubernetes, Docker, and cloud services (AWS, Azure)
Implement infrastructure as code (IaC) using tools like Terraform, Ansible, or Puppet to ensure consistent and reproducible environments
Develop and maintain CI/CD pipelines to automate application deployment, testing, and monitoring
Integrate automated testing and quality assurance processes into CI/CD pipelines
Identify and automate manual processes to increase efficiency and reduce human error
Set up monitoring tools to track the health and performance of applications and infrastructure
Configure alerts and notifications to proactively address issues and minimize downtime
Collaborate with development, operations, and QA teams to improve the development process and ensure smooth deployments
Act as a bridge between development and operations teams to foster a DevOps culture
Optimize resource allocation to achieve cost-effectiveness
Analyze system performance metrics and plan for scaling infrastructure to meet growing demands
Debug and resolve issues in the development, test, and production environments
Manage version control systems to track changes in code and configuration
Maintain comprehensive documentation of infrastructure setup, configurations, and processes
Provide knowledge sharing and training to junior team members
Implement backup and recovery strategies to ensure data integrity and availability in case of failures
Identify bottlenecks and performance issues in applications and infrastructure and implement optimizations
Stay updated with industry trends and emerging technologies in DevOps and cloud computing
Created and implemented Chef cookbooks for deployment and used Chef recipes to deploy directly to Amazon EC2 instances
Used Jenkins to automate the build process
Created and maintained fully automated CI/CD pipelines for code deployment
Automated continuous integration and deployments using Jenkins and Docker
Responsible for compiling source code using Maven and packaging it in its distributable format
Managed Amazon Web Services such as EC2, S3, RDS, EBS, ELB, Auto Scaling, and IAM through the AWS console (a minimal boto3 sketch appears after this role's bullets)
Responsible for creating multi-region, multi-zone AWS cloud infrastructure
Expert in using build tools like Maven to build deployable artifacts such as WAR and JAR files from source code
Built and deployed Docker containers for implementing Microservice Architecture
Good understanding of infrastructure as code and how to achieve it using tools such as Chef and Ansible
Troubleshot critical production issues escalated by the DevOps teams, resulting in quick resolution of errors
Performed routine system administration activities such as adding/modifying/removing user access, installing upgrades and patches, maintaining backups and restores, running scheduled processes
Supported production systems outside of standard working hours by periodically volunteering to be on call 24/7 to help with issues
Worked closely with other team members to develop, test and deploy high quality software
Involved in DevOps migration/automation processes for build and deploy systems
Orchestrated Docker container clusters using Kubernetes
Utilized Kubernetes as the runtime environment of the CI/CD system to build, test, and deploy (a brief Kubernetes client sketch appears after this role's bullets)
Participated in weekly release meetings with technology stakeholders to identify and mitigate potential risks associated with the release
Involved in the analysis of the user requirements and identifying the sources
Involved in Design, analysis, Implementation, Testing and support of ETL processes for Stage, ODS and Mart
Prepared ETL standards, Naming conventions and wrote ETL flow documentation for Stage, ODS and Mart
Collect and link metadata from diverse sources, including relational databases and flat files
Designed and developed Informatica Mappings and Sessions based on business user requirements and business rules to load data from source flat files and oracle tables to target tables
Worked on various kinds of transformations like Expression, Aggregator, Stored Procedure, Java, Lookup, Filter, Joiner, Rank, Router and Update Strategy
Developed reusable Mapplets and Transformations
Used debugger to debug mappings to gain troubleshooting information about data and error conditions
Involved in monitoring the workflows and in optimizing the load times
Used Change Data Capture (CDC) to simplify ETL in data warehouse applications (an illustrative CDC-style extract sketch appears after this role's bullets)
Involved in writing procedures, functions in PL/SQL
Involved in extensive performance tuning by determining bottlenecks at various points like targets, sources, mappings, sessions or system
Prepared UNIX shell scripts and scheduled them in AutoSys for automatic execution at specific times
Prepared test scenarios and test cases in HP Quality Center and was involved in unit testing of mappings, system testing, and user acceptance testing
Environment: Informatica PowerCenter 8.6/8.1 (PowerCenter Designer, workflow manager, workflow monitor, Power Connect), Oracle Data integrator, MS Visio, ERWIN, IBM MQ series, BO XI R3, Cognos 8, SQL.
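The AWS management work described in this role (EC2, S3, IAM, and related services) was done through the AWS console; purely as an illustration, a minimal boto3 sketch of the same kind of tasks might look like the following. The region, bucket, and key names are hypothetical placeholders, not project values.

```python
# Minimal boto3 sketch of routine AWS resource management; assumes boto3 is
# installed and AWS credentials are configured. Bucket/key names are hypothetical.
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")
s3 = boto3.client("s3")

# List running EC2 instances and their types.
reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]
for reservation in reservations:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["InstanceType"])

# Upload a build artifact (e.g., a Maven-built WAR) to S3.
s3.upload_file("target/app.war", "example-artifact-bucket", "releases/app.war")
```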
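For the Kubernetes orchestration bullets above, a brief sketch using the official Kubernetes Python client shows the kind of cluster inspection involved; it assumes the `kubernetes` package is installed and a kubeconfig is available, and the namespace name is a hypothetical example.

```python
# Sketch using the official Kubernetes Python client to inspect deployments
# and pods; the "ci-cd" namespace is a hypothetical example, not a project value.
from kubernetes import client, config

config.load_kube_config()  # use local kubeconfig credentials

apps = client.AppsV1Api()
core = client.CoreV1Api()

# Report ready vs. desired replicas for each deployment in a namespace.
for dep in apps.list_namespaced_deployment(namespace="ci-cd").items:
    print(dep.metadata.name, dep.status.ready_replicas, "/", dep.spec.replicas)

# List pods cluster-wide to check workload health.
for pod in core.list_pod_for_all_namespaces(watch=False).items:
    print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)
```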
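The Change Data Capture bullet above was implemented with Informatica; the following is only a simplified, hypothetical Python illustration of the underlying idea (extract only rows changed since a stored watermark). Table and column names are assumptions.

```python
# Simplified timestamp-watermark illustration of change data capture (CDC).
# The real work used Informatica CDC; table/column names here are hypothetical.
import sqlite3

def extract_changes(conn, last_run: str):
    """Return only the rows changed since the previous successful load."""
    cur = conn.execute(
        "SELECT id, amount, updated_at FROM sales WHERE updated_at > ?",
        (last_run,),
    )
    return cur.fetchall()

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER, amount REAL, updated_at TEXT)")
conn.execute("INSERT INTO sales VALUES (1, 100.0, '2024-01-02T10:00:00')")
conn.execute("INSERT INTO sales VALUES (2, 250.0, '2023-12-15T09:30:00')")

# Only the row updated after the stored watermark is picked up for loading.
print(extract_changes(conn, last_run="2024-01-01T00:00:00"))
```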
Sr. ETL Developer / DBA
Zurich Financial
07.2007 - 09.2009
Interacted with business analysts and translated business requirements into technical specifications
Designed and developed end-to-end ETL process from various source systems to Staging area, from staging to Data Marts
Extensively worked on Informatica to extract data from Flat files, Excel files, and MS Access database to load the data into the target database
Implemented the Incremental loading of Dimension and Fact tables
Created Stored Procedures for data transformation purpose
Created Tasks, Workflows, Sessions to move the data at specific intervals on demand using Workflow Manager
Developed control files to load various sales data into the system via SQL*Loader
Developed Unix Shell scripts for maintaining Files
Designed and created an Autosys job plan to schedule our processes
Created PL/SQL Stored procedures and implemented them through the Stored Procedure transformation
Developed, tested and implemented break/fix change tickets for maintenance
Involved in writing test cases, assisting Testing team in testing
Developed the document of the complete ETL process
Experienced in database design activities such as choosing primary/secondary indexes, creating join indexes and partitioned primary indexes, comparing load utilities to determine the best scenario, and using automatic data protection with recovery journals and fallback protection
Experienced with Teradata Manager, used to create alerts and monitor the system
Familiar with Teradata Database Query Manager
Used Teradata PMON to monitor system performance under load
Designed database, users, tables and views structures
Responsible for COLLECT STATISTICS on all types of tables
Performed query performance tuning and index creation
Used Informatica features to implement Type II and Type III changes in slowly changing dimension (SCD) tables (an illustrative SCD Type II sketch appears after this role's bullets)
Designed archival jobs, considering archival resources such as the number of media servers, tapes, and various limiting factors on transmitting/receiving data to the media
Developed code for validation reports
Migrated projects into QA and Production environments
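The slowly changing dimension work above was done with Informatica transformations; as a purely illustrative sketch of the Type II pattern (expire the current row, insert a new version), a hypothetical Python version could look like this. The customer dimension columns are assumptions.

```python
# Hypothetical sketch of SCD Type II logic: expire the current dimension row
# and append a new version when a tracked attribute changes. Column names are
# illustrative; the actual implementation used Informatica transformations.
from datetime import date

def apply_scd_type2(dimension, incoming):
    """Expire the current row for a changed customer and append a new version."""
    for row in dimension:
        if row["customer_id"] == incoming["customer_id"] and row["is_current"]:
            if row["address"] == incoming["address"]:
                return  # no change detected, nothing to do
            row["is_current"] = False       # close out the old version
            row["end_date"] = date.today()
    dimension.append({
        "customer_id": incoming["customer_id"],
        "address": incoming["address"],
        "start_date": date.today(),
        "end_date": None,
        "is_current": True,
    })

dim = [{"customer_id": 1, "address": "old address",
        "start_date": date(2020, 1, 1), "end_date": None, "is_current": True}]
apply_scd_type2(dim, {"customer_id": 1, "address": "new address"})
print(dim)  # old row expired, new current row appended
```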