San Francisco, California, United States
• Designed an automated Cadence workflow that periodically deactivates offer bids based on rule-engine execution. The main technical challenges were coordinating and synchronizing data across entities in the freight microservices landscape, and implementing the rule execution with high performance along with its corresponding testing hierarchy. The workflow then asynchronously sent deactivation emails to all subscribers. Tech stack spanned standard Uber tooling: Go, MySQL, gRPC, Cadence, ITEA integration tests, the Glue dependency-injection framework, Git, etc.
• To support data-analytics work, partnered with data scientists to improve the event-emission contract by adopting Uber’s latest data-pipeline toolsets, enabling fine-grained, seamless data emission, ingestion, and analysis. Tech stack covered core data-pipeline technologies: Kafka, Avro schemas, Apache Hive, SQL.
• Completed several side quests beyond my main back-end tasks: volunteered to add interactive modules to our web application and carrier app to surface my back-end changes, drafted cross-team design proposals, and addressed tech debt by improving unit, integration, and E2E tests and the delivery process.