Experienced Software Engineer with a demonstrated history of working in the Security and Retail industries. Skilled in Java, Kafka, Spring Boot, Kubernetes, Docker, CI/CD, Redis, Lucene, Solr/Elasticsearch, Web Applications, Amazon Web Services (AWS), Maven, and JavaScript.
Experience
2024 — Now
San Francisco Bay Area
PayPal Complete Payments, Merchant–Partner Integrations, 3DS.
Java, Spring Boot, REST APIs, Kafka, Postgres, AWS Lambda, Docker, Kubernetes, Helm, DynamoDB, and Secrets Manager.
2022 — 2024
Led an initiative to optimize inventory positioning based on speed, cost, and distance to the customer, using Java, Spring,
Kafka, AWS, and Gurobi optimization. Sales data, inventory availability, and forecast data informed the process, significantly
improving efficiency and customer satisfaction.
Collaborated with the Data Science team to replace distance-based optimization with a speed–cost optimizer, improving
inventory positioning accuracy.
Set up an experimentation platform using Optimizely to dynamically route requests to different models, enhancing decision-
making and enabling historical testing to validate positioning results.
Led the implementation of reverse logistics for Marketplace, projected to generate up to 20% more revenue for the company,
using Java, Spring Boot, AWS, REST APIs, Kafka, and Postgres RDS.
Designed and implemented a high-performance CDC solution using Debezium, Kafka Streams, and Kafka state stores to publish real-
time change events from Postgres RDS to Kafka as Avro events, enabling a faster inventory availability view.
2020 — 2022
Seattle, Washington, United States
· Worked on the Inventory Management team enriching all inbound events; created a unified approach to receiving messages by consolidating disparate channels, reducing business losses through correct inventory sync, using Spring Boot, Java, Kafka, Postgres Aurora, DynamoDB, GitLab CI/CD, Kubernetes, Helm, Docker, and Redis.
· Onboarded Stock Ledger transactions to support the migration from Retail Accounting to Cost Accounting.
· Set up a Redis cluster on Kubernetes using Helm 3 to cache in-flight events and reduce data loss.
· Implemented Kafka producer partitioning strategies to streamline outflow and let consumers select the data they need, reducing business lag by 30%.
· Implemented tokenization and redemption for PCI-compliant handling of customer and vendor data.
· Built a service to track payloads and their statuses, ensuring they complete the end-to-end lifecycle.
· Streamlined the org's CI/CD process by properly leveraging GitLab's branching strategy, reducing the manual effort involved; also authored MR guidelines useful to both developers and reviewers.
· Actively participated in release cycles and improved the on-call rotation through recommendations on logging, New Relic alerts, Splunk dashboards, and automatic-retry logic in place of manual intervention.
2016 — 2020
Dallas/Fort Worth Area
· Developed and deployed security analytics software that transforms streaming log data into actionable
security intelligence for threat detection, using Spark Streaming, Kafka, and Java, with Solr for real-time indexing, Redis as
an in-memory cache, YARN as the resource manager, and HDFS as secondary storage.
· Designed and developed a data pipeline that publishes real-time data to Kafka and stores it in Redis; the data is normalized,
enriched, and stored in Solr, enabling indexing and full-text search.
· Worked on Securonix Spotter, a natural-language search engine built on Solr and HDFS, implementing new
commands and operators that let investigators query data to understand who was doing what, when, and why, with
all the relevant contextual information needed to be effective.
· Improved Solr scalability through tuning parameters and indexing via document routing/implicit
sharding, and implemented failure-recovery methods to support high availability. Also designed a system to index data
to multiple SolrCloud clusters simultaneously, achieving an indexing throughput of 500K EPS while supporting multi-tenancy.
· Designed and developed a centralized count-monitoring framework that reports record counts ingested into
the various data pipelines (Hadoop, Redis, Solr) via pub-sub messaging.
· Developed responsive, cross-browser-friendly interfaces using HTML5, JavaScript, jQuery, and Bootstrap.
2015 — 2016
Greater Los Angeles Area
• Maintained the elections web page; planned, executed, and marketed elections according to the committee's code.
Education
Cal State LA College of ECST
MS
2014 — 2016
Gujarat Technological University (GTU)
Bachelor of Engineering (B.E.)
2008 — 2013