So I have this feature where I provide the LLM a set of conditions; let's say I give it a body description.
Using that body description, the LLM analyzes it and suggests suitable workouts.
But the catch is that I have a list of 800 workouts, and I want the generated response to come only from that list.
One option is to send the whole document to the LLM with every request, but doing that every time consumes a lot of tokens.
Are there cheaper alternative ways to do this?
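To make the cost concrete, here's a rough sketch of what "sending the document every time" looks like. The workout strings, prompt wording, and the ~4-characters-per-token heuristic are all made up for illustration, not from my actual code:

```python
# Naive approach: stuff the full 800-workout catalog into every prompt.
# Placeholder data; a real catalog would have actual descriptions.
workouts = [f"Workout {i}: description of exercise {i}" for i in range(800)]

def build_prompt(body_description: str) -> str:
    """Build a prompt that includes the entire workout catalog."""
    catalog = "\n".join(workouts)
    return (
        "Given this body description, suggest suitable workouts "
        "ONLY from the list below.\n\n"
        f"Body description: {body_description}\n\n"
        f"Workout list:\n{catalog}"
    )

prompt = build_prompt("tall, lean, beginner, limited mobility")

# Very rough token estimate using a ~4 chars/token heuristic
# (a real tokenizer would give the exact count).
approx_tokens = len(prompt) // 4
print(f"~{approx_tokens} tokens per request")
```

So every single request pays thousands of tokens just to re-send a catalog that never changes, which is exactly the cost I'm trying to avoid.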