Local agents will win (twitter.com)
26 points by eddynosaur 2 days ago | 9 comments
  • embedding-shape 2 days ago

    But wait, the architecture diagram says "Claude", and later it references an MCP package from/for Anthropic/Claude Code. Is this ultimately just running CC in that VM? I'm not sure this is "local" as people typically understand it. Are they calling it local because the agent harness doesn't run "in the cloud"?

  • fragmede 2 days ago

    > The agent runs locally.

    Oh, the agent runs locally, not the LLM itself. We'll see. The amount of prompting I do with Claude Code via the app, and not at my desk, is way more than I ever would have thought. Flash of inspiration for a thing while I'm on the bus? I open the app on my phone and start a session to check when I next get to a computer.

  • betterer 2 days ago

    The title is technically correct; eventually models will run on local machines. We're just in another cycle where terminals aren't yet powerful enough and need to connect to a server.

    • mobiuscog a day ago | parent

      No big tech company or government wants capable models to run locally, even if they could.

      There will likely be some local devices, but the majority of power will be gated behind money and control.

  • xnx 2 days ago

    Distributed agents may win because they're much harder to block (in the same way "residential proxies" are hard to block).

  • 2 days ago
    [deleted]
  • an0malous 2 days ago

    Post looks a little AI-generated

    • jkhdigital 2 days ago | parent

      more than a little… at least there’s no gratuitous emojis

  • throwaway314155 2 days ago

    Guy clearly hasn't used a coding agent, e.g. Claude Code.