How to get structured output from an LLM

Hey everyone,

I'm working on a practical LLM-based app and have been struggling to get clean JSON output. I know I'm not alone in this.

To solve this I found the Instructor library. It's solid for getting structured, validated data out of any LLM.
I put together a cookbook showing how I use it, adding interoperability and observability across 100+ LLMs: https://git.new/PortkeyInstructor

Here's the link to the Instructor docs: https://python.useinstructor.com/
Let me know your thoughts!
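For anyone who hasn't seen it, the core pattern is: declare the shape you want as a Pydantic model, and Instructor makes the LLM response validate against it (retrying on failure). The sketch below simulates the LLM side with a hard-coded JSON string so it runs without an API key; the commented-out section shows roughly how the Instructor client is wired up, with a placeholder model name and prompt.

```python
from pydantic import BaseModel


# The schema you want the LLM's answer to conform to.
class UserInfo(BaseModel):
    name: str
    age: int


# With Instructor you'd patch an OpenAI client and pass the schema
# as response_model (needs an API key, so it's commented out here):
#
#   import instructor
#   from openai import OpenAI
#
#   client = instructor.from_openai(OpenAI())
#   user = client.chat.completions.create(
#       model="gpt-4o-mini",  # placeholder model name
#       response_model=UserInfo,
#       messages=[{"role": "user", "content": "John is 25 years old."}],
#   )

# Simulate the validation step Instructor performs: parse raw LLM
# output against the schema, raising ValidationError if it doesn't fit.
raw_llm_output = '{"name": "John", "age": 25}'
user = UserInfo.model_validate_json(raw_llm_output)
print(user.name, user.age)  # -> John 25
```

The nice part is that instead of prompt-begging for JSON and regex-cleaning the result, a failed validation can be fed back to the model automatically for a retry.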

submitted by /u/Pitiful_Yak_390