This webinar provides an introduction to a typical workflow you may encounter when building a user- or customer-facing application with LLMs. Because their responses are inherently probabilistic, LLMs are often not ready to be deployed directly in customer-facing applications: they require deterministic guardrails and edge-case handling in order to be used judiciously. This webinar explores some of the broader ideas behind building such a system.
Agenda for the session
- Introduction to Prompt Engineering
- Moderation & Safety Checks for Input & Output Text Content
- Input categorization, retrieval, and response in core LLM workflows
- Putting together a Customer-facing Workflow for an LLM Application
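The agenda items above can be sketched as one end-to-end pipeline: moderate the input, categorize it, retrieve relevant context, generate a response, and moderate the output before it reaches the user. The sketch below is purely illustrative; every function (the blocklist-based `moderate`, the keyword `categorize`, the dictionary-backed `retrieve`, and the stubbed `generate`) is a hypothetical placeholder, since a production system would call a real moderation endpoint and an actual LLM.

```python
# Minimal sketch of a guarded, customer-facing LLM workflow.
# All functions are illustrative stand-ins, not a real LLM or moderation API.

BLOCKLIST = {"password", "ssn"}  # hypothetical deterministic safety rule


def moderate(text: str) -> bool:
    """Return True if the text passes the (toy) safety check."""
    return not any(term in text.lower() for term in BLOCKLIST)


def categorize(text: str) -> str:
    """Toy input categorization: route billing questions separately."""
    return "billing" if "invoice" in text.lower() else "general"


def retrieve(category: str) -> str:
    """Stand-in for retrieval: fetch context for the detected category."""
    docs = {
        "billing": "Invoices are emailed monthly.",
        "general": "See our help center for common questions.",
    }
    return docs[category]


def generate(query: str, context: str) -> str:
    """Stand-in for the LLM call."""
    return f"Answer based on: {context}"


def handle_request(user_input: str) -> str:
    # Deterministic guardrail on the input before it reaches the model.
    if not moderate(user_input):
        return "Sorry, I can't help with that request."
    category = categorize(user_input)
    context = retrieve(category)
    response = generate(user_input, context)
    # Guardrail on the output before it reaches the user.
    return response if moderate(response) else "Sorry, something went wrong."


print(handle_request("Where is my invoice?"))
```

The key design point is that the probabilistic model sits between two deterministic checks, so unsafe inputs never reach it and unsafe outputs never reach the customer.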
About the Speaker

Mr. Vinicio De Sola
Senior AI Engineer - Newmark
- Specializes in machine learning techniques and redeveloping models to include automation
- Studied cryptocurrencies and smart beta factors in the space
- Before switching to finance, was an entrepreneur in diverse areas such as e-learning, e-commerce, and engineering
- Mentor at Great Learning