Experience
2022 — Now
Seattle, Washington, United States
At SoFi, I played a pivotal role in API development, leveraging technologies such as Python, AWS, and Docker. My contributions included leading the BillPay Switch project and creating essential APIs like the Token Service and Events API. I also enhanced backend services for Wire Transfer and ACH Transaction Simulation, focusing on security improvements.
2020 — 2022
United States
Used Azure cloud to build solutions for customers.
Completed the Azure certification (DP-200) and the Power BI certification
Interacted with clients to understand their needs and provide useful solutions and insights
2020 — 2020
San Francisco Bay Area
Fellow, Insight Data Science, San Francisco
Created a DevOps pipeline for databases using EKS, ECR, CircleCI, SQLAlchemy, and PostgreSQL to manage databases from development to QA and from QA to production
Implemented database migrations with traceability of database changes and the ability to roll back, by creating tests in the dev, QA, and production environments
2020 — 2020
Phoenix, Arizona
Responsible for regular data analytics and report generation for key performance metrics, assisting project teams with one-off tasks when available, and supporting Continuous Delivery
2019 — 2019
East Palo Alto
Amazon Web Services, East Palo Alto: Software Development Engineer Intern (TensorFlow team), May-August 2019
1. Worked on SageMaker Debugger (GitHub: https://github.com/awslabs/sagemaker-debugger), which helps analyze the tensors generated during neural-network training jobs.
2. Created an Index Writer class for the project so that tensors saved for analysis can be fetched directly from the exact location where they are stored. Also created utility functions to fetch files for an exact step and integrated the Index Writer with the Index Reader. Tensor-fetch speed improved dramatically as a result of index writing.
3. Fixed bugs in the project, such as detecting the end of a training job, raising a runtime error when the same directory structure is reused, and updating tests for more generalized usage.
4. Created a Continuous Integration/Continuous Deployment system that runs all tests for each pull request and uploads reports and wheels for the tool to S3. The system runs integration and unit tests for each pull request, publishes the results back to the corresponding pull request, creates pip wheel packages from the alpha and master branches, and publishes those packages to the corresponding S3 locations when tests succeed. Each build sends a Chime notification, via a Lambda function, to the deep-engine group with the warnings, errors, and info generated during the build, along with a link to the build logs.
5. Learned a great deal about TensorFlow and AWS tools such as CloudWatch, Lambda, CodeBuild, CodePipeline, and S3.
Education
Arizona State University
Master of Science - MS
Vellore Institute of Technology