artificial: Seven methods to secure LLM apps from prompt injections and jailbreaks
submitted by /u/koryoislie, January 27, 2024
artificial: 12 techniques to reduce your LLM API bill and launch blazingly fast products
submitted by /u/koryoislie, January 13, 2024