A shocking story was promoted on the “front page” or main feed of Elon Musk’s X on Thursday:

“Iran Strikes Tel Aviv with Heavy Missiles,” read the headline.

This would certainly be a worrying development in world news. Earlier that week, Israel had conducted an airstrike on Iran’s embassy in Syria, killing two generals as well as other officers. Retaliation from Iran seemed plausible.

But, there was one major problem: Iran did not attack Israel. The headline was fake.

Even more concerning, the fake headline was apparently generated by X’s own official AI chatbot, Grok, and then promoted by X’s trending news product, Explore, on the very first day of an updated version of the feature.

  • Deceptichum · 30 points · 6 months ago

    It’s pretty simple: trending is based on . . . what’s trending by users.

    Or, as the article explains for those who can’t comprehend what trending means:

    Based on our observations, it appears that the topic started trending because of a sudden uptick of blue checkmark accounts (users who pay a monthly subscription to X for Premium features including the verification badge) spamming the same copy-and-paste misinformation about Iran attacking Israel. The curated posts provided by X were full of these verified accounts spreading this fake news alongside an unverified video depicting explosions.

      • Deceptichum · 12 points · 6 months ago

        It does say it’s likely hyperbole, so they probably just tazed and arrested the earthquake.

        Also I’m impressed by the 50,000 to 1,000,000 range for officers deployed. It leaves little room for error.

        • PopShark · 2 points · 6 months ago

          I wonder if the wide margin is the AI trying to formulate logic and numbers in the story. It realizes it doesn’t know how many officers would be needed to shoot the earthquake, since that would logically depend on the magnitude of the earthquake, which the AI doesn’t know. So it figures, well, alright, tectonic plates are rather resistant to firearms discharge and other potential law enforcement tactics, so it starts high at 50,000 but decides 1,000,000 is a reasonable cap, as there just can’t be more than that many officers present in the state or country.

    • aceshigh · 7 points · 6 months ago

      Wow. What a world we live in.