At Google, my team and I enhanced a data processing pipeline serving over 10 million users by adopting Pub/Sub, significantly accelerating deployment times and improving team productivity.
Experience
2022 — Now
Sunnyvale, California, United States
• Designed and implemented a highly scalable pipeline leveraging Pub/Sub to process data for over 10 million customers
• Automatically tested and deployed DLP and Rules binaries to production, with a rollout plan in place to manage the scale of data processing
• Improved efficiency within the DLP team by updating documentation and eliminating manual testing, leading to a significant reduction in operating expenses
• Streamlined operations and reduced code deployment time from 2 weeks to 4 days, ensuring fast and reliable delivery of updates
• Reduced DLP on-duty work time by over 50%, enhancing team productivity and efficiency
• Addressed and minimized flakiness in WebDriver tests, achieving a consistent testing process with nearly zero inconsistencies
• Created comprehensive Shift Left Guidance and DLP test strategy documents to guide future projects towards enhanced automation and testing standards
• Eliminated the reliance on manual testing, paving the way for more efficient and effective testing strategies in upcoming projects
• Led the re-architecture of the BigQuery export pipeline for activity metrics, enabling streaming-based exports for customers
• Created a streaming pipeline processing around 1 million QPS of audit data, reducing audit event delays from 48–72 hours to p99 < 10 minutes
2018 — 2022
Bangalore
• Managed a team of four engineers responsible for three key products: the Master Rules List, Alert Center, and the BigQuery export project
• Launched an extractor pipeline processing over 2 petabytes of data and exporting billions of rows daily
• Transitioned ownership of the BQ export project and reduced weekly dashboard pages from 60 to 1–2
• Managed the Alert Center for G Suite, launching new features such as the account suspension appeal action
• Led the Master Rules List project, enabling rules icon for all SKUs and significantly increasing weekly page views
• Designed and implemented an RPC service framework for the Customer Compliance Center, integrating with FileComp to render certificates
• Conducted data quality analysis for the OAuth Token pipeline and implemented frontend and backend enhancements to the OAuth Token dashboard
2018 — 2018
Bangalore
• Developed an analytic system for tracking task sequences
• Created a hybrid filing feature allowing clients to file federal and state tax returns via paper filing or e-filing
• Implemented an API for sending serialized data using Django REST Framework, and wrote integration and unit tests to verify functionality
• Utilized React and Redux for frontend development, and Python for backend tasks
2016 — 2018
Bangalore
• Designed and managed an application to perform risk reviews based on ad-hoc queries
• Developed APIs in Slang to fetch risk data from the SecDB database and perform risk analysis
• Developed a user interface to perform risk review checkout, using the SUIT datacube viewer to represent data
• Wrote comprehensive regression tests to verify functionality
2015 — 2015
Bengaluru Area, India
• Worked on the Stork team, which creates and maintains channels for sending and receiving messages
• Developed a new REST API channel in Java to send messages to mobile devices for LinkedIn users in China, using the WeChat app as the supported channel for sending and receiving messages; also worked on sending notifications to Chrome devices
Education
Indian Institute of Technology, Guwahati
Bachelor’s Degree
2012 — 2016