
Sai Chanakya Panikala

Mississauga, Canada

Summary

Skilled DevOps/Cloud Engineer with 7 years of experience building and managing scalable cloud infrastructure on AWS and Azure. Extensive knowledge of Infrastructure as Code (IaC) with Terraform and AWS CloudFormation, automating infrastructure setup to improve repeatability and scalability, along with containerization tools such as Docker and Kubernetes. Proficient in implementing CI/CD pipelines, monitoring systems, and automated deployments to enhance delivery speed, reliability, and security.

Overview

7 years of professional experience
2 Certifications

Work History

Azure DevOps Engineer

TD Bank
09.2024 - Current
  • Company Overview: TD Bank is one of Canada’s largest and most trusted financial institutions, offering personal and commercial banking, investment services, and wealth management.
  • Designed and implemented end-to-end Azure DevOps CI/CD pipelines for core banking and digital finance applications, improving delivery speed and system consistency.
  • Automated infrastructure provisioning using Terraform and ARM templates, enabling repeatable, auditable, and scalable environment setups for compliance-critical workloads.
  • Integrated Azure Data Factory and Synapse Analytics to manage secure data pipelines for real-time credit score calculations, transaction monitoring, and regulatory reporting.
  • Containerized .NET and Java-based financial apps using Docker, deploying to Azure Kubernetes Service (AKS) for fault tolerance and improved scalability.
  • Developed Python and PowerShell automation scripts for provisioning, log rotation, system updates, and data archiving across production and DR environments (an illustrative sketch follows this role).
  • Configured Azure API Management to securely expose RESTful services to internal teams and fintech partners while applying throttling and IP filtering rules.
  • Implemented Azure Monitor, ELK Stack, and Grafana dashboards to monitor API latency, system health, and security anomalies in real time.
  • Migrated legacy reporting systems to Azure Databricks and Azure SQL, reducing batch processing time for financial statements by 50%.
  • Used Azure Key Vault and Azure Active Directory (AAD) to enforce secure authentication and role-based access across pipeline and infrastructure resources.
  • Managed sprint planning, backlog grooming, and incident resolution through Azure Boards and JIRA, supporting agile delivery practices across cross-functional teams.
  • Environment: Azure DevOps, Terraform, ARM Templates, Azure Data Factory, Azure SQL, AKS, Docker, Synapse, Azure API Management, Python, PowerShell, ELK Stack, Grafana, Azure Monitor, Databricks, Azure Key Vault, AAD, JIRA
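
Illustrative sketch of the log rotation and archiving automation referenced above, as a minimal Python example; the directory paths and the 30-day retention window are assumptions, not production values.

    # log_archiver.py - minimal sketch; paths and retention window are assumptions
    import gzip
    import shutil
    import time
    from pathlib import Path

    LOG_DIR = Path("/var/log/app")          # hypothetical application log directory
    ARCHIVE_DIR = Path("/var/log/archive")  # hypothetical archive location
    RETENTION_DAYS = 30                     # assumed retention window

    def archive_old_logs() -> None:
        """Compress .log files older than RETENTION_DAYS and move them to the archive."""
        ARCHIVE_DIR.mkdir(parents=True, exist_ok=True)
        cutoff = time.time() - RETENTION_DAYS * 86400
        for log_file in LOG_DIR.glob("*.log"):
            if log_file.stat().st_mtime < cutoff:
                target = ARCHIVE_DIR / (log_file.name + ".gz")
                with log_file.open("rb") as src, gzip.open(target, "wb") as dst:
                    shutil.copyfileobj(src, dst)
                log_file.unlink()  # remove the original only after compression succeeds

    if __name__ == "__main__":
        archive_old_logs()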

AWS DevOps Engineer

GFL Environmental Inc
12.2022 - 08.2024
  • Company Overview: GFL Environmental Inc. is a diversified environmental solutions company offering a wide range of services in solid waste management, liquid waste management, infrastructure, and soil remediation.
  • Designed and implemented scalable CI/CD pipelines using Jenkins, GitLab CI, and AWS CodePipeline to automate deployment of waste management and remediation applications, improving release velocity and reducing manual errors across dev, staging, and prod environments.
  • Utilized Terraform and AWS CloudFormation to provision reusable infrastructure modules for GFL’s cloud-based platforms, enabling rapid deployment of microservices handling solid and liquid waste data analytics and reporting.
  • Managed containerized applications using Docker and orchestrated them using Amazon EKS and AWS Fargate, ensuring seamless deployment and auto-scaling of microservices supporting customer waste pickup scheduling.
  • Deployed and maintained AWS Lambda-based serverless applications for real-time soil testing and remediation tracking, integrated with API Gateway and AWS Step Functions for streamlined data processing and automated alerts.
  • Built monitoring dashboards with Amazon CloudWatch, Grafana, and Prometheus, enabling proactive performance monitoring and incident response across environmental IoT sensor systems and centralized waste collection platforms.
  • Managed secure object storage and lifecycle policies using Amazon S3 and Glacier, archiving compliance-related environmental records and daily backups of logistics and hazardous material handling data.
  • Configured Amazon RDS, DynamoDB, and Aurora for optimized database performance and high availability of customer service platforms and waste management scheduling tools across Canada and the U.S.
  • Implemented reverse proxy and load balancing using NGINX and Apache Tomcat, improving application responsiveness and managing traffic distribution for high-demand client portals and environmental service dashboards.
  • Developed infrastructure automation scripts using Python and Bash, automating backup checks, system patching, and waste analytics report generation across containerized environments and virtualized workloads (an illustrative sketch follows this role).
  • Configured secure TCP/IP networks, HTTPS, SSH, and DNS settings, ensuring encrypted transmission and secure remote access to GFL’s hybrid infrastructure handling sensitive environmental and customer data.
  • Automated deployment workflows using AWS CodeDeploy, Argo CD, and CircleCI, supporting agile development practices and faster rollout of updates to field inspection, compliance tracking, and remediation platforms.
  • Environment: CI/CD pipelines, Jenkins, GitLab CI, Terraform, AWS CloudFormation, Docker, Amazon EKS, AWS Fargate, AWS Lambda, Amazon CloudWatch, Grafana, Prometheus, Amazon S3, Glacier, Amazon RDS, DynamoDB, Aurora, AWS API Gateway, TCP/IP networks, HTTPS, SSH, DNS, Argo CD, CircleCI.
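
Illustrative sketch of the backup-check automation referenced above, as a minimal Python example using boto3; the bucket name, key prefix, and SNS topic ARN are placeholder assumptions.

    # backup_check.py - minimal sketch; bucket, prefix, and topic ARN are placeholders
    import datetime

    import boto3

    BUCKET = "example-backups"                                      # hypothetical bucket
    TOPIC_ARN = "arn:aws:sns:us-east-1:123456789012:backup-alerts"  # hypothetical topic

    def todays_backup_exists() -> bool:
        """Return True if at least one object exists under today's backup prefix."""
        s3 = boto3.client("s3")
        prefix = datetime.date.today().strftime("daily/%Y/%m/%d/")  # assumed key layout
        response = s3.list_objects_v2(Bucket=BUCKET, Prefix=prefix)
        return response.get("KeyCount", 0) > 0

    def alert_if_missing() -> None:
        """Publish an SNS alert when no backup object is found for today."""
        if not todays_backup_exists():
            boto3.client("sns").publish(
                TopicArn=TOPIC_ARN,
                Subject="Backup check failed",
                Message="No backup object found under today's prefix.",
            )

    if __name__ == "__main__":
        alert_if_missing()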

DevOps Engineer

Apotex Inc
10.2021 - 11.2022
  • Company Overview: Apotex Inc. is Canada's largest pharmaceutical company, manufacturing and distributing generic medicines.
  • Deployed and managed containerized pharma applications using Docker, Kubernetes, and OpenShift, ensuring 24/7 availability of critical laboratory and quality systems.
  • Automated infrastructure provisioning and software deployment using Bash, Python, and Terraform, significantly reducing manual effort in GxP-validated environments.
  • Integrated AWS cloud services such as EC2, S3, RDS, Lambda, and IAM to support research data lakes, drug batch traceability, and compliance workflows.
  • Implemented proactive monitoring and alerting using Nagios, Grafana, and the ELK Stack to support data integrity, anomaly detection, and operational health in manufacturing systems.
  • Designed and maintained RESTful APIs and managed secure integrations between LIMS, ERP, and MES platforms using API Gateway, ensuring seamless data flow across pharmaceutical value chains.
  • Maintained pharma-grade databases including MySQL, PostgreSQL, Oracle, and MongoDB to support drug formulation records, inventory, and lab data with high availability and backup automation.
  • Enforced data security policies using TCP/IP, VPN, firewalls, DNS, and SSH, and implemented Ansible and Puppet for patching, auditing, and configuration across regulated systems.
  • Built automated QA workflows using Selenium, JUnit, and Postman to validate application compliance with pharma regulatory standards like Health Canada, FDA 21 CFR Part 11, and GAMP 5.
  • Developed AWS Lambda serverless functions for batch record processing and integrated SQS, SNS, and CloudWatch for intelligent alerting and audit logging (an illustrative sketch follows this role).
  • Environment: Docker, Kubernetes, OpenShift, Bash, Python, Terraform, EC2, S3, RDS, Lambda, IAM, Nagios, Grafana, ELK Stack, REST APIs, API Gateway, MySQL, PostgreSQL, Oracle, MongoDB, TCP/IP, VPN, DNS, SSH, Ansible, Puppet, Selenium, JUnit, Postman
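
Illustrative sketch of the Lambda batch record handler referenced above, as a minimal Python example; the SQS-triggered event shape and the alert topic environment variable are assumptions.

    # batch_record_handler.py - minimal sketch; event wiring and topic variable are assumptions
    import json
    import logging
    import os

    import boto3

    logger = logging.getLogger()
    logger.setLevel(logging.INFO)                       # log output is captured by CloudWatch Logs

    sns = boto3.client("sns")
    TOPIC_ARN = os.environ.get("ALERT_TOPIC_ARN", "")   # hypothetical environment variable

    def handler(event, context):
        """Process records delivered by an SQS trigger and emit a summary notification."""
        processed = 0
        for record in event.get("Records", []):
            body = json.loads(record["body"])           # assumes JSON message bodies
            logger.info("Processing batch record %s", body.get("batch_id", "unknown"))
            processed += 1
        if TOPIC_ARN:
            sns.publish(
                TopicArn=TOPIC_ARN,
                Subject="Batch processing summary",
                Message=f"Processed {processed} batch record(s).",
            )
        return {"processed": processed}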

DevOps Engineer

Intact Financial Corporation
10.2019 - 09.2021
  • Company Overview: Intact Financial Corporation is Canada’s largest provider of property and casualty (P&C) insurance.
  • Utilized Terraform and Ansible to automate infrastructure provisioning and management, creating consistent and reproducible environments and making it easier to scale and manage infrastructure resources.
  • Utilized Docker and Kubernetes to containerize and orchestrate insurance claim processing applications, enhancing scalability, resilience, and support for high-volume policy transactions.
  • Optimized and maintained insurance data repositories using PostgreSQL, MongoDB, and MySQL, ensuring efficient policyholder data storage, retrieval, and business continuity.
  • Automated infrastructure setup using Terraform and CloudFormation, enabling reproducible cloud deployments for underwriting platforms with reduced manual configuration.
  • Developed and integrated RESTful APIs and managed routing through API Gateway to securely connect policy services, claims systems, and third-party verification modules.
  • Deployed Nagios, Grafana, and Datadog for real-time application monitoring, improving response time and ensuring operational integrity of critical insurance services.
  • Configured SSL/TLS encryption, VPN tunnels, and firewalls to secure inter-service communication and safeguard sensitive customer data in regulatory-compliant environments.
  • Centralized system logs using ELK Stack (Elasticsearch, Logstash, Kibana) to streamline debugging, gain real-time insights, and ensure compliance across policy management applications.
  • Environment: Docker, Kubernetes, PostgreSQL, MongoDB, MySQL, Terraform, CloudFormation, RESTful APIs, Nagios, Grafana, Datadog, SSL/TLS, VPN tunnels, firewalls, Agile, CI/CD, Elasticsearch, Logstash, Kibana.

Cloud Engineer

HDFC Life Insurance
05.2018 - 06.2019
  • Company Overview: HDFC Life Insurance Company is one of India's leading private-sector life insurance companies.
  • Built CI/CD pipelines using Jenkins, GitLab CI, and AWS CodePipeline to automate deployment for life insurance policy platforms, ensuring consistent integration and faster time-to-market across development cycles.
  • Deployed scalable applications using Docker and Kubernetes, enabling seamless orchestration of microservices for customer onboarding, premium processing, and claim management systems.
  • Configured secure API Gateways and implemented RESTful APIs to streamline integration between underwriting services, CRM platforms, and partner portals, improving data flow efficiency.
  • Strengthened platform security with SSL/TLS, VPNs, and firewall rules, and leveraged AWS Lambda for backend automation to support event-driven processing in insurance workflows.
  • Managed and optimized insurance data using PostgreSQL, MongoDB, MySQL, and Cassandra, ensuring high availability, scalability, and efficient processing for actuarial and customer data.
  • Automated cloud infrastructure provisioning using Terraform, CloudFormation, and Ansible, enabling consistent, secure, and scalable deployment of insurance platforms across multi-cloud environments.
  • Environment: CI/CD pipelines, Jenkins, GitLab CI, AWS CodePipeline, Docker, Kubernetes, API Gateways, RESTful APIs, Amazon CloudWatch, Datadog, Grafana, SSL/TLS, VPNs, firewall, AWS Lambda, PostgreSQL, MongoDB, MySQL, Cassandra

Education

Bachelor's - Information Technology

Kakatiya University

Skills

  • Languages & Frameworks: Java, Spring Boot, Spring MVC, Hibernate, JavaScript, TypeScript
  • Frontend: AngularJS, ReactJS, Vue.js, HTML5, CSS3, Bootstrap
  • Backend: Node.js, RESTful APIs, Microservices
  • CI/CD & DevOps: Jenkins, GitLab CI/CD, Docker, Kubernetes, Maven, AWS Lambda
  • Databases: SQL, PL/SQL, Oracle, MongoDB, MySQL, PostgreSQL
  • Testing: JUnit, Mockito, Selenium, Postman, Karma, Cypress, Jasmine, Jest
  • Cloud: AWS (EC2, S3, RDS), Azure, Google Cloud
  • Version Control: Git, Bitbucket
  • Monitoring & Logging: ELK Stack, Prometheus, Grafana
  • API Management: API Gateway (AWS), Apigee, Swagger
  • Data Serialization: JSON, XML, YAML
  • Project Management Tools: JIRA, Asana, Trello
  • Configuration Management: Ansible, Chef
  • Architecture: Microservices, MVC, RESTful services, SOAP, Monolithic Architecture
  • Other Tools: Confluence, IntelliJ IDEA, VSCode, Eclipse
  • Networking: Load Balancers, VPC, VPN, DNS, SSL/TLS, TCP/IP

Languages

English
Full Professional Proficiency

Certifications

  • Microsoft Certified Azure DevOps Engineer
  • AWS Certified DevOps Engineer
