I've been trying to research this but can't find any definitive answers.
I tried MS Copilot for the first time today and its voice option is really impressive. I've read that there are AI text-to-speech readers, but I was hoping something exists like MS Copilot, where the LLM itself has the voice.
With Claude, for example, I get frustrated by the limits. I have a paid account and still hit usage caps in chats. It's really frustrating.
I'm a hardware engineer, so I don't have a good grasp on how all this works, but I know that some generative AI models can be downloaded and run locally (e.g. Stable Diffusion). Is this the case with LLMs? Is there a way to run something like MS Copilot locally so I don't keep hitting limits?
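For what it's worth, yes, local LLMs exist. One common route is a runner like Ollama, which serves a downloaded model over a local HTTP API. Below is a minimal sketch of querying it from Python, assuming Ollama is installed and running on its default port; the model name "llama3" is just an example of one you might have pulled, not a recommendation.

```python
# Minimal sketch: query a locally running LLM via Ollama's HTTP API.
# Assumes an Ollama server (default port 11434) with a model pulled,
# e.g. via `ollama pull llama3`. Degrades gracefully if no server is up.
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(prompt, model="llama3"):
    """Build the JSON payload Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}

def ask(prompt, model="llama3"):
    """Send the prompt to the local server; return the model's reply text,
    or None if no server is reachable."""
    data = json.dumps(build_request(prompt, model)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=data, headers={"Content-Type": "application/json"}
    )
    try:
        with urllib.request.urlopen(req, timeout=120) as resp:
            return json.loads(resp.read())["response"]
    except OSError:
        return None  # Ollama not installed or not running

if __name__ == "__main__":
    reply = ask("Explain impedance matching in one sentence.")
    print(reply if reply is not None else "No local Ollama server detected.")
```

Because everything runs on your own hardware, there are no usage caps; the trade-off is that local models need a decent GPU (or lots of RAM) and are generally weaker than the hosted ones.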
There were also a lot of frustrating limitations with MS Copilot, where I would get a response like "I can't talk about that" even though I was asking fairly straightforward technical questions.
I could really see getting a lot of use out of an Alexa- or Siri-like platform that just listens to what I say and speaks back to me. Does that exist yet, and/or is that something I can set up locally?
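Conceptually, such an assistant is a three-stage loop: speech-to-text, LLM, text-to-speech. A minimal sketch of that loop is below, with all three stages stubbed out; in a real local setup you'd plug in an STT engine (e.g. whisper.cpp), a local LLM, and a TTS engine (e.g. Piper) — those names are examples of common local tools, not a tested recipe.

```python
# Sketch of the listen -> think -> speak loop a voice assistant runs.
# All three stages are stubs here so the structure is clear; each would
# be replaced by a real local engine in practice.

def transcribe(audio: bytes) -> str:
    """Stub STT: a real version would run e.g. whisper.cpp on the audio."""
    return "what is ohm's law"  # canned transcript for the sketch

def think(text: str) -> str:
    """Stub LLM: a real version would query a local model here."""
    return f"You asked: {text}"

def speak(text: str) -> None:
    """Stub TTS: a real version would synthesize and play audio, e.g. Piper."""
    print(text)

def assistant_turn(audio: bytes) -> str:
    """One full turn: transcribe the audio, generate a reply, speak it."""
    reply = think(transcribe(audio))
    speak(reply)
    return reply

if __name__ == "__main__":
    assistant_turn(b"")  # would be microphone audio in a real loop
```

The nice property of this structure is that each stage is swappable independently, so you can upgrade the STT, LLM, or TTS component without touching the others.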