The good thing about computers is they do exactly what you tell them to do. The bad thing about computers is they do exactly what you tell them to do.
Experience
2023 — Now
New York, New York, United States
Navigating the tension between advising and building, drawn toward transformation via demonstration.
2018 — 2023
New York City Metropolitan Area
Having fun lately achieving ridiculous time-to-value on AWS via the Serverless framework.
• Emphasis in recent years on leading initiatives that iteratively build and deploy event-driven cloud solutions, achieving fast time-to-value at scale and accommodating end-user feedback early in the development lifecycle. Most recently leveraged Serverless-on-AWS and AWS-native deployment automation to deliver a full-stack production pilot (data store and pipelines, REST-based services layer, responsive user interface, non-functional aspects) in 90 days for a multi-billion-dollar company's greenfield "real-time" inventory and asset tracking system.
• Automating data processing and model inference (machine learning) workflows in Databricks and AWS Glue.
• Led development of the multi-tenant web application tier of a commercial financial forecasting software product. Designed and constructed the data layer and constituent objects (MySQL) for forecasting configuration and results. Established the services layer and API data contract with the UI in coordination with UI developers to meet end-user requirements. Contributed to the design of the overall product (e.g., interaction points, orchestration) and its deployment to production (Kubernetes via OpenShift).
• In the financial sector, encapsulated models into a Python analytics tier and constructed elements for that layer to interact with the data store (Oracle RDBMS) and user interface (Node/Angular via Flask REST API). Primary contributor to technical documentation and hand-over (training sessions, etc.) of such solutions to the client.
2014 — 2018
Cleveland/Akron, Ohio Area
• Translate customer business needs into functional and non-functional requirements and technical specifications.
• Build (in Lua) integrations of central medical applications (HL7 v2/v3: orders, results, demographics, billing, prescriptions) with multiple outside systems (clinics, labs, etc.), often leveraging REST/SOAP-based APIs.
• Construct HL7 C-CDA (Meaningful Use Stage 2) documents from source data, transport (socket/database/file-based) between facilities and systems.
• Accommodate clinical workflows by leveraging application triggers to drive data operations.
• Serve as technical point of contact for production issues.
2015 — 2017
Greater Cleveland
• Leveraged Amazon Web Services (AWS) cloud infrastructure to collect (Python API) and store (DynamoDB) objects from multiple platforms (Google Analytics, social media, automated marketing).
• Designed structure and routines (Python: NumPy/pandas) to correlate, aggregate, and maintain data to provide a single "view" across disparate sources.
• Prototyped visualizations with the D3 (JavaScript) library.
• Deployed web services (Python) to expose visualization and operational (application) metrics.
• Streamlined the construction of web deliverables (HTML) by automating (Python) convergence of content (copy, imagery, code) originating from multiple departments.
• Prepared source data and developed dashboard (Tableau) for customer-facing presentation of information related to marketing campaigns.
• Acted as interim technical project manager to guide web development projects through delivery.
• Developed software development best practices for production of digital marketing content.
• Served as on-site system administrator (e.g., deployed/maintained WordPress network installation).
2012 — 2014
Greater Cleveland
Designed integrated Streams ("real-time" computing platform) solutions to meet industry/company-specific (emphasis: telecommunications, a.k.a. telco) functional and non-functional requirements:
• Comprehensive (voluminous) cross-source (voice, data, IPTV, etc.) correlation of customer-generated events into unified “view” of customer behavior.
• Development of integration patterns between InfoSphere Streams and other products in the IBM Big Data portfolio: Operational Decision Manager (rules-driven analysis); Netezza/PureData (analytics appliance); SPSS (advanced analytics); WebSphere MQ (message queuing); Cognos (visualization).
• Integration with legacy (non-IBM) client products and systems (data warehouses, external APIs, etc.).
• Continuously informing the client of how the proposed solution will satisfy specific business needs; serving as principal technical contact for the client during development.
• Design in-memory data aggregation and management patterns.
• Predictive (based on historical data) customer-product affinity.
• Rules-driven detection and management (aggregation, ticketing, etc.) of fraudulent customer behavior.
• Analysis and visualization of cellular network activity and quality of service.
• Accommodate massive data volumes through utilization of scalable (i.e., parallel processing paths) architecture.
• Preservation and persistence of current application state for recovery following system or application outages.
• Partnering with various adjacent development teams to define underlying infrastructure such as data models, distributed file systems, application hand-offs.
• Developing, reviewing, and testing custom SPL (Streams Processing Language) functions.
Education
Ohio University
Bachelor of Science (B.S.), Civil Engineering
1995 — 2001