• Enabled the Spark NLP library to optimize and run deep-learning inference using the OpenVINO Runtime, and showcased benchmarks demonstrating up to 40% improvement in inference speed over TensorFlow.
• Wrote Java Native Interface (JNI) bindings to expose the OpenVINO Runtime C++ API in Java, enabling models such as BERT and T5 to seamlessly leverage the integration through the Scala and Python APIs.
• Developed nlp-benchspark: an extensible tool for benchmarking NLP inference with PySpark.
• Published technical blog posts in collaboration with Intel.
• Selected as one of ~40 projects (out of ~172) to present in the Contributor Lightning Talk Series.
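The JNI work described above can be sketched as follows. This is a minimal illustration of the general JNI binding pattern, not the actual Spark NLP / OpenVINO code: the class name, method signatures, and model path are all hypothetical. A Java class declares `native` methods whose implementations would live in a C++ shared library that calls the OpenVINO Runtime C++ API (e.g. `ov::Core`, `ov::CompiledModel`, `ov::InferRequest`).

```java
// Hypothetical sketch of a JNI binding layer (names are illustrative only).
public class OpenVINOBinding {
    // In a real binding, the native shim would be loaded at class-init time:
    //   static { System.loadLibrary("openvino_jni"); }
    // It is deliberately omitted here so the sketch compiles and runs standalone.

    // Hypothetical native entry points; each corresponds to a C++ function
    // whose JNI signature is generated with `javac -h`, e.g.:
    //   JNIEXPORT jlong JNICALL
    //   Java_OpenVINOBinding_compileModel(JNIEnv*, jobject, jstring, jstring);
    public native long compileModel(String modelPath, String device);
    public native float[] infer(long compiledModelHandle, float[] input);

    // Pure-Java helper demonstrating the failure mode when the native
    // library is absent: invoking a native method without a loaded shim
    // throws UnsatisfiedLinkError.
    public static String tryCompile() {
        try {
            new OpenVINOBinding().compileModel("bert-base.xml", "CPU");
            return "ok";
        } catch (UnsatisfiedLinkError e) {
            return "native shim not linked";
        }
    }

    public static void main(String[] args) {
        System.out.println(tryCompile());
    }
}
```

In practice the C++ side would convert `jstring`/`jfloatArray` arguments to their C++ equivalents, hold the `ov::CompiledModel` behind the opaque `long` handle, and return results back through JNI; the Scala and Python layers then call these Java methods rather than touching native code directly.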