Contractor
NITYO INFOTECH
- Designed and developed data ingestion pipelines by extracting data from SQL databases and AWS S3, processing it in Snowflake, and applying transformations before loading it into production schemas
- Performed initial validations on ingested data and tuned pipeline performance for cost and efficiency
- Automated job scheduling and monitoring using Control-M, ensuring smooth data processing workflows
- Led enhancements and fixes for existing datasets based on customer-reported issues
- Identified and optimized long-running jobs to improve performance and reduce costs
- Worked in an Agile environment with two-week sprints, using JIRA for project tracking and Bitbucket for version control
- Maintained and enhanced the Python-based framework supporting the data pipelines
- Explored Generative AI applications for automating manual tasks and improving operational efficiency
- Trained non-technical users and fielded technical support questions
- Created SQL Server stored procedures to automate periodic tasks
- Developed and implemented data models, database designs, data access layers, and table maintenance scripts
- Improved performance of existing queries by creating indexes on tables
- Monitored data systems performance, identifying bottlenecks and implementing solutions to maintain system efficiency
- Tuned SQL queries and database schemas to speed up data retrieval operations