Do you think edge AI ends up mattering more for autonomy, robotics, or local private inference?

It feels like most AI discussion is still cloud-first, but some of the most interesting shifts seem to be happening at the edge.

A few areas that seem especially important:

- autonomy and robotics

- low-power always-on vision systems

- private local LLMs and on-device inference

- bandwidth-constrained industrial deployments

Curious how people here see it:

Over the next few years, where do you think edge AI matters most, and which hardware/software stacks actually win in practice?

submitted by /u/rgc4444