Session Name: Exploring Cloud Native MLOps for GenAI Vector Search
ChatGPT has taken center stage since early this year. We will first take a look into the exciting new subfield of Generative AI, understand what LLMs and NLP are, and examine the challenges they present. We will also highlight the importance of Vector Search, and the role a Vector DB plays in storing embeddings and enabling fast index-based pattern matches and searches.

While all of this is exciting, we also need to ensure that building, integration, and continuous deployment are handled in the most efficient way. By leveraging a cloud native environment with Kubernetes, we will examine how the process can be optimized through the serverless and event-driven nature of a typical cloud native platform. MLOps—machine learning operations, or DevOps for machine learning—is the intersection of people, process, and platform for gaining business value from machine learning. It streamlines development and deployment via monitoring, validation, and governance of machine learning models. With the rapid rise in popularity of GenAI, we will explore how the operational side of things will be impacted and how MLOps differs from DevOps.
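To make the vector search idea above concrete, here is a minimal, illustrative sketch (not from the session itself): toy document "embeddings" are compared to a query vector with cosine similarity, and the closest matches are returned. The vectors, document IDs, and `search` helper are all hypothetical; a real deployment would use an embedding model plus a vector database with approximate-nearest-neighbor indexes rather than a brute-force scan.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def search(query_vec, index, top_k=2):
    """Rank stored (doc_id, vector) pairs by similarity to the query."""
    scored = [(doc_id, cosine_similarity(query_vec, vec))
              for doc_id, vec in index]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:top_k]

# Toy "embeddings" -- in practice these come from an embedding model.
index = [
    ("doc-a", [0.9, 0.1, 0.0]),
    ("doc-b", [0.1, 0.9, 0.1]),
    ("doc-c", [0.8, 0.2, 0.1]),
]
results = search([1.0, 0.0, 0.0], index)
```

Here `doc-a` ranks first because its vector points in nearly the same direction as the query; this is the same similarity notion a vector database accelerates with specialized indexes.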
Mary is a Java Champion and a passionate Senior Developer Advocate at DataStax, a leading data management company that champions Open Source software and specializes in Big Data, DB-as-a-service, Streaming, and Cloud-Native systems. She spent 3.5 years as a highly effective advocate at IBM, focusing on Java, Jakarta EE, OpenJ9, Open Source, Cloud, and Distributed Systems. She transitioned from Unix/C to Java around 2000 and has never looked back. She considers herself a polyglot and loves to keep learning new and better ways to solve real-life problems. Outside of her day job, she is an active tech community builder and currently the President of the Chicago Java Users Group (CJUG).