
Sahil Bhanot

Brampton, ON

Summary

PROFESSIONAL SUMMARY: A results-driven Senior Cloud Engineer with 13 years of experience developing, upgrading, managing, supporting, and hosting microservice applications on the cloud using Java, Spring Boot, the Apache Camel framework, Docker, Kafka, RabbitMQ, and IBM MQ, and relational and NoSQL databases, with the Maven and Gradle build tools. Experienced in working on projects spanning Canada, Asia-Pacific, the Middle East, and the USA.

PROFESSIONAL OBJECTIVE: Aiming for a challenging career with opportunities to acquire knowledge, and a growth-oriented position that will utilize my creative skills and exceptional ability to adapt to a diverse array of software platforms and programming languages. A resourceful Senior Cloud Engineer experienced in evaluating and assessing client requirements and implementing infrastructure to solve identified problems; harnesses code and cloud-native technologies to create scalable, user-centric systems. A strong negotiator with excellent value-driven solutions.

Overview

13 years of professional experience

Work History

Senior Cloud Engineer

Royal Bank of Canada
02.2021 - Current
  • RBC Royal Bank wanted a one-stop shop for all data within the Payments domain: a system that could receive and store events per ISO standards without interfering with transaction processing, modify the structure per business needs, and expose payment data payloads to users (both internal and clients) through APIs
  • The solution is named PODS (Payments Operational Data Store)
  • Objective: Develop a solution to store all payment events asynchronously, in an organized manner and per ISO standards
  • Modify the structure of an event and store it per the business needs
  • Develop APIs to expose the payment data to users
  • Responsibilities:
  • Developing a cloud-based solution (event-driven microservices) to receive payment events asynchronously, in near real time, from various payment processing applications without interfering with their transaction processing, exposing APIs with full payment details to users, and following the Agile software development cycle
  • Receiving payment events from upstream applications asynchronously via Kafka
  • Storing received events in an easy-to-access form, in addition to the raw format, thereby supporting data-inquiry and audit needs
  • Drawing on emerging document-store technology to store ever-changing data structures in MongoDB
  • Publishing events to Kafka for downstream applications
  • Developing RESTful APIs using Apache Camel and Spring Boot with Gradle, utilized by various channels and other applications to access and expose full payloads to users
  • Developing microservices in Spring Boot using a service-discovery architecture
  • Hosting the developed applications on Pivotal Cloud Foundry via the CLI and organized Jenkins pipelines
  • Utilizing the services offered by Pivotal in PCF and binding them to the microservices hosted on the platform
  • Managing application autoscaling in PCF with App Autoscaler
  • Documenting the RESTful APIs with Swagger to provide a rich UI to users
  • Managing sensitive properties and environment variables in cloud-native environments using Spring Cloud Config Server, by binding the applications to the Config Server
  • Establishing the local development setup with Docker orchestrating the services
  • Maintaining and upgrading the applications per changing requirements
  • Resolving defects per the priority set by the business
  • Developing applications to upload and download events to an Amazon S3 bucket to reconcile with the stored payment events
  • Developing scheduled tasks for the applications per business requirements
  • Testing the developed code by writing JUnit test cases with the Mockito framework
  • Writing integration test cases using Camel Spring test support with JUnit 5
  • Developing code in accordance with the latest standards and maintaining code versions with Git.
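The "store raw plus store queryable" pattern described in the bullets above can be sketched in plain Java. The field names here (txnId, currency) are illustrative assumptions, not the actual ISO 20022 schema or PODS data model:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of wrapping a raw payment event with a few extracted,
// query-friendly fields before persisting it (e.g., to MongoDB). Field
// names are hypothetical, not the real PODS/ISO 20022 schema.
public class PaymentEventDocument {

    public static Map<String, Object> toDocument(String txnId, String currency,
                                                 String rawPayload) {
        Map<String, Object> doc = new LinkedHashMap<>();
        doc.put("txnId", txnId);        // lookup key for the inquiry APIs
        doc.put("currency", currency);  // common filter field
        doc.put("raw", rawPayload);     // untouched original, kept for audit
        return doc;
    }

    public static void main(String[] args) {
        Map<String, Object> doc = toDocument("TX-001", "CAD", "<raw ISO payload>");
        System.out.println(doc);
    }
}
```

Keeping the raw payload alongside the extracted fields is what lets the same record serve both the inquiry APIs and the audit use case.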

Senior Developer

Tata Consultancy Services, Shoppers Drug Mart
10.2019 - 02.2021
  • Shoppers Drug Mart (SDM) is a Canadian retail pharmacy chain based in Toronto, Ontario
  • It has more than 2,000 stores across 9 provinces
  • Healthwatch is the system used at the stores for billing and dispensing medicines per the prescriptions provided by physicians
  • Objective: Develop and support the CSS (Cloud Storage Solution) for the Healthwatch application and resolve existing defects in the system
  • Maintain and upgrade the system (Healthwatch application) by providing the best feasible solutions
  • Responsibilities:
  • Building a cloud-based platform for storing prescription images, resulting in faster retrieval and an optimal solution in both cost and efficiency
  • Developing RESTful APIs in Spring Boot and hosting them on Pivotal Cloud Foundry via the CLI and Jenkins pipelines
  • Managing sensitive properties and environment variables in cloud-native environments using Spring Cloud Config Server
  • Utilizing the services offered by Pivotal in PCF and binding them to the microservices hosted on the platform
  • Managing autoscaling with App Autoscaler
  • Documenting the RESTful APIs with Swagger, which provides a rich UI for testing the APIs
  • Storing prescription images in a Google Cloud bucket and their corresponding metadata in MongoDB
  • Writing JUnit test cases using the Mockito framework
  • Analyzing existing defects based on raised incidents, identifying the pattern of the underlying problem, and providing the solution
  • Developing code to resolve existing defects, followed by unit testing, and deploying the code to the DIT servers via automated CI/CD pipelines
  • Enhancing the system by introducing new functionality after analyzing and understanding users' pain points
  • Developing code per the latest standards.
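The storage split described above, image bytes in a cloud bucket and queryable metadata in MongoDB, can be sketched with in-memory maps standing in for both stores. Class and field names are illustrative assumptions, not the actual CSS codebase:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the storage split: prescription image bytes go to an object
// store (an in-memory map standing in for the Google Cloud bucket), while
// queryable metadata goes to a document database (a second map standing in
// for MongoDB). A shared key ties the two records together.
public class PrescriptionImageStore {

    private final Map<String, byte[]> bucket = new HashMap<>();
    private final Map<String, Map<String, String>> metadataDb = new HashMap<>();

    // Stores image bytes and metadata under one shared key so either side
    // can be located from the other.
    public String store(String imageId, byte[] imageBytes, String rxNumber) {
        bucket.put(imageId, imageBytes);
        Map<String, String> meta = new HashMap<>();
        meta.put("rxNumber", rxNumber);
        meta.put("sizeBytes", String.valueOf(imageBytes.length));
        metadataDb.put(imageId, meta);
        return imageId;
    }

    public Map<String, String> metadataFor(String imageId) {
        return metadataDb.get(imageId);
    }

    public static void main(String[] args) {
        PrescriptionImageStore store = new PrescriptionImageStore();
        store.store("img-1", new byte[]{1, 2, 3}, "RX-42");
        System.out.println(store.metadataFor("img-1"));
    }
}
```

Keeping metadata out of the blob store is what makes lookups fast: searches hit only the small metadata records, and the bucket is touched only when the image itself is needed.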

Technical Lead

Tata Consultancy Services, Ministry of Health
06.2017 - 10.2019
  • Domain: Healthcare
  • The MOH (Ministry of Health) wanted a centralized appointment system (CAS) for legal residents of KSA to book appointments at PHCs (Primary Healthcare Centers)
  • PHCs can further refer a patient to a particular specialized hospital through CAS
  • Requests flow from CAS to the HIS (Hospital Information System) via the CAS-Interface engine
  • The CAS-Interface engine converts CAS requests to FHIR requests and vice versa
  • Objective: Develop and support an integration engine between CAS and HIS
  • In addition, onboard multiple HIS vendors with CAS
  • Responsibilities:
  • Building the CAS-Interface engine on the Apache Camel framework integrated with Spring Boot to connect CAS with multiple HIS using FHIR
  • Developing an Interface Specification Document to serve as the standard for integrating CAS with multiple HIS, incorporating all message structures and protocols used during information exchange
  • Creating multiple APIs as REST web services implemented in a microservice architecture
  • Developing and deploying the microservice applications on the Pivotal Cloud Foundry (PaaS) platform via the CF CLI
  • Managing application autoscaling by defining the scaling rules
  • Modifying the CAS-Interface engine to support dynamically changing business requirements
  • Proposing solutions and processes, mutually agreed upon at both ends (CAS, HIS), for information exchange and for covering specific scenarios
  • Onboarding multiple vendors/hospitals integrating with CAS, from phase-1 unit testing through phase-3 E2E integration testing
  • Coordinating and defining the process for the CAS, HIS, and MOH teams to ensure their readiness for integration
  • Preparing KPI reports per the customer's dynamic requirements
  • Migrating data (appointments, schedules, and patients) from CAS to the newly deployed HIS
  • Developing, deploying, and managing microservices on the Predix cloud
  • Tracking HIS go-lives with CAS and reporting on a weekly basis
  • Providing support and resolving encountered issues.
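The CAS-to-FHIR translation the interface engine performed can be sketched as a simple field mapping. The CAS-side parameter names and the minimal FHIR Appointment subset shown here are illustrative assumptions, not the real interface specification:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal sketch of mapping a CAS booking request onto a FHIR
// Appointment-shaped structure, the kind of translation the CAS-Interface
// engine performed in both directions. The CAS field names are
// hypothetical, and the FHIR structure is heavily simplified.
public class CasToFhirMapper {

    public static Map<String, Object> toFhirAppointment(String patientId,
                                                        String startIso,
                                                        String phcId) {
        Map<String, Object> appt = new LinkedHashMap<>();
        appt.put("resourceType", "Appointment"); // FHIR resource type
        appt.put("status", "proposed");          // new bookings start as proposals
        appt.put("start", startIso);             // appointment start time (ISO 8601)
        Map<String, Object> participant = new LinkedHashMap<>();
        participant.put("patient", "Patient/" + patientId);   // FHIR-style reference
        participant.put("location", "Location/" + phcId);
        appt.put("participant", participant);
        return appt;
    }

    public static void main(String[] args) {
        System.out.println(toFhirAppointment("123", "2019-05-01T09:00:00Z", "PHC-7"));
    }
}
```

The value of such a mapping layer is that each HIS vendor only has to speak FHIR; CAS-specific quirks stay on the CAS side of the engine.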

Developer

Tata Consultancy Services, GE Healthcare
11.2016 - 05.2017
  • Enovia is a PLM (Product Lifecycle Management) tool used for the logical creation of entities and of the processes followed during entity creation, and as a repository for project-related documents and parts (entities) used in the organization
  • A stored entity (part/document) has different states in the system
  • The stored entity must pass through the different stages of the process defined in the system to reach the final stage, or stage of completion
  • Objective: Thoroughly test every method in Java using the TestNG framework by providing inputs and verifying the results
  • Responsibilities:
  • Gathering requirements from the client side
  • Analyzing the developed methods and writing unit test cases that provide appropriate inputs and verify the desired outputs
  • Coordinating with other teams to check that the application gives correct results
  • Documenting all the test cases used for testing the developed methods, in line with the standard practices and policies

Developer

Tata Consultancy Services, GE Healthcare
01.2016 - 10.2016
  • DOC360 is an intranet-based search application built on the Enovia platform and used within the organization
  • The application searches parts as well as all documents available in the system (present from the start or imported from other systems), which helps optimize time and enhances reusability
  • Objective: Thoroughly test each module, then test all the combined modules as a whole application from the user's perspective, writing all the test cases
  • Responsibilities:
  • Studying the developed modules and writing unit test cases to check every module
  • Integrating the different modules and writing the integration test plans
  • Coordinating with other teams to check that the application gives correct results
  • Post production support
  • Documentation of all necessary documents of the application (User Manual, Release Notes, Troubleshooting Guide).

Tata Consultancy Services, GE Healthcare
03.2014 - 12.2015
  • Product Controllership (PC) is the system that connects all systems (eDHR/non-eDHR sites) with one another and provides a seamless flow of information from customer inquiry to customer shipment
  • The eDHR interface is developed on the Mule integration engine
  • Objective: Enhance a system that interacts with other systems (eDHR/non-eDHR), providing a smooth flow of information and helping complete transactions
  • Responsibilities:
  • Studying the requirements provided by the client and designing the forms in FTPC using Pnuts scripting
  • Developing the eDHR-PeopleSoft and PeopleSoft-eDHR interfaces
  • Coordinating with multiple teams for smooth transaction flow
  • Assembling all the modules and providing integration test support.

Support Analyst

Infosys Technologies Limited, Eli Lilly
03.2011 - 01.2014
  • JIVE is an enterprise application used for employee connectivity and knowledge sharing in a secure communication hub within an enterprise
  • Objective: Set up a monitoring environment (Hyperic HQ) and provide continuous end-to-end support for the enterprise application (JIVE) hosted on the private cloud (Linux servers)
  • Responsibilities:
  • Ensuring the application runs properly without any hindrances, and providing continuous support for it
  • Resolving encountered issues within the defined SLA in multiple ways, e.g., inspecting each layer (web, application) of the application, traversing log files, and analyzing configuration settings at the OS/application level
  • Updating stakeholders at regular intervals about the situation whenever a P1 issue is encountered
  • Generating daily and weekly statistics for stakeholders
  • Creating custom scripts and scheduling them via cron jobs to optimize the generation of the daily and weekly reports
  • Setting up the monitoring tool (Hyperic HQ) to monitor every parameter of the servers hosting the application, ensuring high performance, availability, and continuous monitoring
  • Enhancing application performance through better observability via monitoring tools and automating the processes involved
  • Creating custom shell scripts, triggered automatically from the monitoring tool, to collect server data/parameters
  • Creating custom XML plugins that register the custom scripts in the tool
  • Adding every server to the tool to enable monitoring
  • Integrating the tool with the SMTP server (via the Postfix service) to email alerts to the whole support group whenever a parameter crosses its defined threshold or a particular kind of exception appears in the logs
  • Ensuring monthly patch activities on the servers are performed correctly and the application works as expected
  • Developing code in Java using the JSF framework for POC activities for multiple clients.
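The threshold-based alerting rule described above can be sketched in plain Java. The metric names and threshold values are illustrative; the real checks ran inside Hyperic HQ against custom shell-script collectors:

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

// Sketch of the alerting rule: compare each collected server metric
// against its configured threshold and report the ones that breach, which
// would trigger the alert email. Metric names and thresholds are
// illustrative assumptions.
public class ThresholdMonitor {

    public static List<String> breaches(Map<String, Double> metrics,
                                        Map<String, Double> thresholds) {
        List<String> alerts = new ArrayList<>();
        for (Map.Entry<String, Double> e : metrics.entrySet()) {
            Double limit = thresholds.get(e.getKey());
            if (limit != null && e.getValue() > limit) {
                alerts.add(e.getKey()); // this metric would trigger an alert
            }
        }
        return alerts;
    }

    public static void main(String[] args) {
        Map<String, Double> metrics = new LinkedHashMap<>();
        metrics.put("cpuPercent", 95.0);
        metrics.put("diskPercent", 40.0);
        Map<String, Double> thresholds = new LinkedHashMap<>();
        thresholds.put("cpuPercent", 90.0);
        thresholds.put("diskPercent", 85.0);
        System.out.println(breaches(metrics, thresholds)); // only cpuPercent breaches
    }
}
```

In the real setup the comparison lived in the monitoring tool's alert definitions; the point of the sketch is only the shape of the rule: collect, compare against a per-metric threshold, alert on breach.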

Education

Bachelor of Engineering - Computer Science

Maharishi Dayanand University
2010

High School Diploma (XII)

Jankidas Kapur Public School, Central Board of Secondary Education
2006

Jankidas Kapur Public School, Central Board of Secondary Education
2004

Skills

  • KEY COMPETENCIES
  • Technologies/Frameworks: Spring Boot, Apache Camel, Java, RESTful web services, Microservices, Docker, Kafka, RabbitMQ, IBM MQ, JPA, MongoDB, PostgreSQL, SLF4J, API Gateway, Mockito, JUnit 5, TestNG, Jenkins, Git, FHIR, Linux, J2EE, JSF
  • Tools/Servers: IntelliJ, Eclipse, Gradle, Maven, PuTTY, Tomcat, FileZilla, WinSCP
  • Cloud Platforms: Pivotal Cloud Foundry, GE Predix
  • Continuous Deployment
  • Configuration and Management
  • Cloud Implementation
  • Application Development
  • Development Methodology Selection
  • Project Management
  • Design Patterns and Principles
  • Code Review Management
  • Project Specifications
  • Debugging and Troubleshooting
  • SDLC Processes
  • RESTful Web Services

Accomplishments

  • Certifications: Pivotal Cloud Foundry Developer, Oracle Certified Professional Java SE 6 Programmer, Oracle Certified Professional Java Web Component Developer
