NYT to start searching deleted ChatGPT logs after beating OpenAI in court (arstechnica.com)
50 points by miles 2 days ago | 14 comments
  • bn-l a day ago

    > Instead, only a small sample of the data will likely be accessed, based on keywords that OpenAI and news plaintiffs agree on. That data will remain on OpenAI's servers, where it will be anonymized, and it will likely never be directly produced to plaintiffs.

    I’m glad about this detail.

  • sandspar 2 days ago

    What is the NYT's stance here? Is it pure spite? I guess their lawyers told them this is the winning move, and perhaps it is. But it just seems so blatantly wrong.

    If you look at Reddit's r/ChatGPT, you'll quickly notice that the median use of ChatGPT is for therapy.

    Is the NYT really ok with combing through people's therapy logs?

    • tashoecraft a day ago | parent

      I find it interesting that you're blaming the NYT for this and not OpenAI for keeping these logs in the first place. If OpenAI didn't keep logs, there would be nothing to search, and a more malicious actor couldn't do something far more nefarious with them. If confidential information in the logs is a reason they shouldn't be accessed, it's also a reason they shouldn't be kept at all.

      • nrds a day ago | parent

        As explained in literally the first paragraph of TFA, the court ordered OpenAI to start keeping these logs. They didn't do it by choice.

      • ycombinatrix a day ago | parent

        Did you consider reading the article before writing out this comment? Please do next time.

    • goatlover 2 days ago | parent

      Is there an expectation of privacy using ChatGPT? Do users think nobody is ever going to be looking at their logs?

      • conception 2 days ago | parent

        If you are a paying member and are not sharing prompts, yes?

        • noman-land 2 days ago | parent

          Stop having this expectation. It's factually incorrect.

          • edgineer a day ago | parent

            When my expectation of privacy is violated, I'll have learned a little more about such violations, but I won't drop my expectation not to be violated.

            • edgineer a day ago | parent

              What I mean is there are two meanings to "expectation of privacy": the Bayesian prior, and the legal stance. I have an expectation of privacy in my home but I still close the shades.

    • jaimex2 2 days ago | parent

      They don't care. This is purely for a business upper hand.

      OpenAI should probably encrypt the chats and lock itself out going forward, collecting whatever metrics it needs on the fly before locking.
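
      A minimal sketch of what "encrypt and lock itself out" could look like, assuming a throwaway per-conversation key that is discarded once the metrics are extracted (the function and metric names below are hypothetical, not anything OpenAI actually runs):

      ```python
      # Sketch: compute aggregate metrics from the plaintext once, seal the
      # transcript with a one-off key, then drop the key so the stored blob
      # is unreadable even to the operator.

      from cryptography.fernet import Fernet  # pip install cryptography


      def archive_chat(transcript: str) -> tuple[dict, bytes]:
          # Metrics are collected on the fly, before the text is sealed.
          metrics = {
              "chars": len(transcript),
              "approx_words": len(transcript.split()),
          }

          key = Fernet.generate_key()                  # per-conversation key
          ciphertext = Fernet(key).encrypt(transcript.encode())
          del key                                      # key is never persisted,
                                                       # so nothing can decrypt later
          return metrics, ciphertext


      metrics, blob = archive_chat("user: hello\nassistant: hi there")
      print(metrics)  # the aggregates survive; `blob` is retained only as ciphertext
      ```

      Whether a discarded-key scheme like this would satisfy a court's preservation order is a separate question, of course.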

      • heavyset_go 2 days ago | parent

        OpenAI would never lock themselves out of free training data.

  • vintermann 2 days ago

    Now is the time to go have a chat with ChatGPT about how much NYT sucks. Maybe it can help come up with insulting things to call their lawyers too.