People sometimes say that the purpose of source code is first to be read by humans. I agree; I've always thought that writing code is like writing an essay.
Experience
2016 — Present
Greater New York City Area
I'm currently taking a step back from leading engineering teams at Conductor in order to focus on cross-cutting, difficult-to-staff product and architectural issues. In the past year, I've built targeted features that helped our sales team close several strategic accounts, and I've ported our customer-facing application off of a legacy database and onto a reporting framework based on Reactive Streams, saving us many hundreds of thousands of dollars in COGS and scaling our reporting capabilities up by an order of magnitude.
Right now, I'm developing architecture and application prototypes to help guide the team through a large-scale migration to microservices.
2012 — 2015
Greater New York City Area
During my tenure as an Engineering Manager at Conductor, I established three different back-end / data engineering teams, supervising up to 11 direct reports. I defined a software development lifecycle process for a part of the engineering organization previously operating without one, and acted as an ad-hoc technical product manager to define architecture and requirements for back-end software services and data products. Over the three years I was in the role, I doubled the size of the team in a very competitive hiring market, ran a successful summer internship program, edited the engineering blog, and developed training materials for new hires on distributed computing technologies.
As a technical manager, I guided my teams through the design and implementation of some really exciting projects:
• Transitioning our data collection pipeline to an actor model based on Apache Kafka, which allowed us to dynamically scale the number of CPUs in our cluster to adapt to changes in data volume
• Creating a distributed rate-limiting system with locking based on Redis to control the behavior of our crawlers and maximize our utilization of constrained proxy resources
• Evolving our traffic analytics ETL services, via Hadoop Map/Reduce and Hive, to increase their throughput by an order of magnitude and to make them highly available and auditable, while extracting an SPI that has since been used to integrate three additional analytics vendors
• Scaling the performance of our report publishing infrastructure by a factor of 10 by transitioning to a new report generation framework based on Amazon Elastic Map/Reduce and S3, and safely migrating many terabytes of historical customer data to our new cloud data warehouse without interruption to service or degradation in data quality
...and lots of other stuff!
2009 — 2012
Greater New York City Area
Conductor makes Searchlight, a web presence management application for marketers that helps them promote content via unpaid channels on the web. In my engineering role at Conductor, I worked across the stack, building out the data access layer for new features in Searchlight, and developing data visualizations for the application's front-end.
After moving to the platform engineering team, I was a key contributor on a project to transition the product's information extraction, recommendation generation, and ETL systems to Hadoop Map/Reduce and HBase, increasing the capacity of our report generation pipeline from the order of gigabytes to terabytes. I led the design and development efforts to integrate terabytes of traffic analytics data into the product via an automated collection and ETL pipeline and a sharded data warehouse.
In addition, I introduced continuous integration and static analysis to the build and deployment workflow, and established quality standards for source code that are still in place today.
2007 — 2009
Greater New York City Area
I joined Rebel Monkey to lead the server development for a cooperative, massively multiplayer online game - kind of a dream project for me. We failed to build an audience for the game, but its technology platform was arguably a success. The game world ran inside a specialized middleware container created by Sun Microsystems (right before the takeover by Oracle) that executed microtransactional fragments of game code with strong guarantees for durability and consistency.
As a smaller project, I helped build a paper-doll animation system in ActionScript to allow players to customize the appearances of their avatars with different outfits and hairstyles.
2003 — 2007
Greater New York City Area
In the days before Apache Hadoop, DataSynapse (now TIBCO) built a distributed computing platform to enable low-latency processing of compute-intensive workloads across clusters of commodity hardware, for clients in financial services and scientific research.
I contributed to the Java and C++ client libraries and task execution engines, and led the engineering effort to build a .NET version of the same. This was a great job, and it taught me that in a reliable distributed computing framework, the devil is in the details of fault tolerance, failover, and the quirks of the local operating system.
Education
Yale University
B.S.
2001 — 2003
Wesleyan University
Computer Science
1999 — 2001