Experience
2018 — 2020
San Francisco Bay Area
• Created and designed data pipelines to enrich the product catalog.
• Led the design and implementation of real-time shelf-space optimization for the product catalog, enabling fine-grained control over catalog contents in Elasticsearch.
• Improved cache lookup time by implementing sketching data structures.
• Experimented with and evaluated various machine learning models for product categorization.
• Built tools to monitor and evaluate categorization performance.
• Created visualizations and published metrics for different categorization models using Grafana, StatsD, and Graphite.
• Supported efforts to create and onboard new revenue stream channels.
• Designed and built automated Slack alerts for the revenue-stream pipeline, integrated with Grafana dashboards.
• Designed a proof of concept for a related-search recommendation engine based on word-vector embeddings.
• Worked on the design and re-architecture of the machine learning pipelines.
• Experimented with and evaluated various machine learning models for catalog attribute classification, laying the groundwork for a guided (bubble) search user experience.
2017 — 2018
San Mateo County, California, United States
• Created APIs to serve requests for co-viewed, co-bought, and top-K results (based on the dimensions).
• Designed, tested, and deployed real-time feed-ingestion data pipelines that power everything from the machine learning models to Grafana dashboards.
• Built Flink jobs for real-time extract, transform, and load (ETL) operations consumed by downstream systems.
• Created Ansible scripts for seamless deployment via Jenkins.
• Profiled the Java application for memory leaks and optimized the APIs for better throughput.
2015 — 2017
New York, United States
• Built applications using big data technologies including Hadoop, Hive, HBase, and Solr.
• Created and applied algorithmic patches and fixes for both the real-time and batch components of the AMEX real-time offer and merchant recommendation platforms.
• Designed a real-time Shop Small merchant typeahead with name and keyword search for both secure and non-secure user experiences.
• Implemented a peer-scoring feature for the real-time component of the collaborative filtering algorithm to remedy the cold-start problem.
• Created ETL jobs for the batch component of the US merchant-scoring application.
• Built control-cell logic to assign scores to merchants based on popularity-based and random scoring algorithms.
• Automated interest, commerce, and preference graph computations; designed a Python script to extract data from the Cornerstone 2.0 repository using a headless browser.
• Redesigned the similarity matrix to incorporate algorithmic enhancements.
• Designed RESTful web applications for the Shop Small maps migration and built location-aware search functionality for the maps.
• Created scripts to automate continuous deployment in the developer environment, improving the team's developer experience.
• As part of the Portfolio Engineering group within the Digital Offer Ecosystems (DOE) team, introduced several best programming practices to the US scoring team.
• Designed data pipelines handling multiple terabytes of data for the personalization and LET re-engineering datasets.
• Designed JMeter performance-testing scripts for stress testing to identify possible code and JVM parameter optimizations in the Java applications. Wrote scripts to scan GC logs and aggregate statistics to identify performance bottlenecks.
Education
University of Florida