Problem
- There are two major limitations of LLMs that we will address in this project.
- Problem #1: Chatbots are cool, but their training data only goes up to the model’s cutoff date. Additionally, getting your own data into the model helps tailor responses to your use case.
- This could be a travel agency AI that uses travel data to help you book flights
- Or a personal assistant that reads your code and to-do list and helps you figure out what to work on next
- Problem #2: When users are interacting with a chatbot, how does the chatbot know when it needs to get more information?
- For example, if the user says “Hi, how are you?” vs. “How can I reduce my debt?”, the chatbot should behave differently.
- If our LLMs can basically “think”, why don’t we ask the model to figure out which function it should call?
- We’ll allow our chatbot to call any function it thinks it needs to get the job done, which will save us from coding complicated workflows (see the sketch below for a minimal example).
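To make that concrete, here is a minimal sketch of function calling, assuming the OpenAI Python SDK and a tool-capable chat model; the `search_flights` tool, the model name, and the example messages are placeholders for illustration, not this project’s actual code.

```python
# Minimal function-calling sketch (assumptions: OpenAI SDK, made-up search_flights tool).
import json
from openai import OpenAI

client = OpenAI()

def search_flights(destination: str) -> str:
    # Hypothetical helper; a real app would call a flights API here.
    return f"3 flights found to {destination}"

tools = [{
    "type": "function",
    "function": {
        "name": "search_flights",
        "description": "Search available flights to a destination city.",
        "parameters": {
            "type": "object",
            "properties": {"destination": {"type": "string"}},
            "required": ["destination"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumption: any tool-capable chat model works
    messages=[{"role": "user", "content": "Find me a flight to Oslo"}],
    tools=tools,
)

message = response.choices[0].message
if message.tool_calls:
    # The model decided it needs more information and asked for a function call.
    call = message.tool_calls[0]
    args = json.loads(call.function.arguments)
    print(search_flights(**args))
else:
    # Small talk like "Hi, how are you?" usually comes back as a plain reply.
    print(message.content)
```

The key point is that the model, not our code, decides when a function is needed, which is what saves us from hand-coding complicated workflows.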
Real World Applications
What are we building?
- At the end of the project we’ll know how to do two things:
- How to ingest ANY data source (code, text, video) and interact with it
- How to use function calling to make your chatbot adaptive
- And this will help us learn about the following concepts:
- Embeddings and vector databases - the backbone behind ChatGPT and modern AI apps (a minimal sketch follows this list)
- Function calling - the next evolution of our AI, giving it access to our data AND our functions so it can return far better responses
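Here is a minimal sketch of the embeddings piece, assuming the OpenAI embeddings endpoint and numpy; the sample documents, query, and model name are made up, and a real project would store the vectors in a vector database rather than an in-memory list.

```python
# Minimal embedding + similarity-search sketch (assumptions: OpenAI embeddings, numpy).
import numpy as np
from openai import OpenAI

client = OpenAI()

documents = [
    "Refunds are processed within 5 business days.",
    "Our travel desk books flights and hotels for employees.",
    "The to-do API lives in todo_service.py.",
]

def embed(texts: list[str]) -> np.ndarray:
    # Turn each text into a vector of floats.
    resp = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in resp.data])

doc_vectors = embed(documents)
query_vector = embed(["How do I book a flight?"])[0]

# Cosine similarity: higher means the document is closer to the query.
scores = doc_vectors @ query_vector / (
    np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(query_vector)
)
best = documents[int(np.argmax(scores))]
print(best)  # the retrieved chunk we would paste into the chat prompt
```

This retrieval step is what lets the chatbot answer from data newer than the model’s cutoff: the top-scoring chunks get added to the prompt so the model can ground its response in them.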
What we will not build
- No chat history
- No audio transcriptions
Where are the solution files?
- See the solutions folder:
- mimir_chat_solution.py
- mimir_loaders_solution.py
- mimir_embeddings_solution.py
Prerequisites
- Please see the previous video for how to clone the repository
- Set up the frontend/backend just like before