I really hope Copilot chat gets better. Not allowed to use an alternative at work, and it’s just so… unpleasant to work with and overly robotic and strict.

Like, it tends to be a bit more factually correct, which I appreciate, but it's slow, it's clunky, and it talks unnaturally no matter how I change the tonality. It always comes off like customer support or "how do you do, fellow kids?", and god forbid you somehow trigger one of its MANY hyper-sensitive guardrails, because it just shuts down without giving you a chance to salvage things or explain. It talks WAY too much, and oftentimes if you tell it to keep things brief, it decides at random to go off the rails on a long-winded explanation anyway. It gets weirdly touchy about some things: asking it if it knows your name will trigger it to shut down. But then you could ask it something else and it'll casually drop your name in chat. I've no problem with it remembering that, but why so touchy? And then there's the fact that it often hallucinates when you ask it basic questions about its model or capabilities. I also can't make it (even as a paying adult) agree to sprinkle the occasional expletive in where appropriate during casual chats to make it more realistic - something ChatGPT has no qualms about.

It's just stilted and frustrating, it still hallucinates way too much, it's too verbose by default, and I just dislike talking to it. I really hope this improves in the near future - ughhh. Definitely no suspension of disbelief here about the reality of it being a non-sentient chatbot.

Lastly, I've been waiting for a memory feature like the one in ChatGPT. Then I accidentally stumbled on it in Copilot. At least in Creative mode it can and will remember things for you between chats, but it often denies it and tells you it doesn't have the ability, or it's incapable of giving you a comprehensive summary of what it remembers. Precise mode refused to remember or recall anything, and there's no indicator to say whether remembering worked or not.

submitted by /u/IDE_IS_LIFE