• Fargeol@lemmy.world · 2 days ago

    By training an algorithm that will have an impact on said statistics. Not only can the algorithm cheat (see Goodhart’s law), but it can also repeat the biases that produced those statistics (like the law-enforcement algorithms that turned out racist).
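
    A toy sketch of that feedback loop (all numbers made up): two districts with identical true crime rates, patrols allocated in proportion to past recorded arrests, and arrests only recorded where patrols actually go. The initial skew in the stats never washes out.

    ```python
    import random

    # Hypothetical numbers: both districts have the SAME true crime
    # rate, but district A starts with slightly more recorded arrests.
    true_rate = 0.05
    recorded = {"A": 55, "B": 45}   # small initial skew

    for year in range(10):
        total = sum(recorded.values())
        for district in recorded:
            # patrols are allocated according to past statistics...
            patrols = int(1000 * recorded[district] / total)
            # ...and crime is only recorded where patrols go
            arrests = sum(random.random() < true_rate for _ in range(patrols))
            recorded[district] += arrests
        print(year, recorded)

    # District A keeps getting more patrols, so it keeps producing more
    # recorded arrests, even though the underlying rates were identical
    # all along: the stats "confirm" the allocation that produced them.
    ```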

    • Apytele@sh.itjust.works (OP) · 2 days ago

      My favorite was the one they were training to detect cancer in imaging scans, but they forgot to edit out the info stamp in the corner, so it just started flagging all the scans from the cancer center!
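
      The usual fix is embarrassingly simple, a minimal sketch assuming the stamp sits in a known corner (the sizes here are hypothetical):

      ```python
      import numpy as np

      def mask_info_stamp(scan: np.ndarray,
                          corner_h: int = 64,
                          corner_w: int = 192) -> np.ndarray:
          """Zero out the corner where the acquisition info is burned
          into the pixels, so the model can't just read which site
          (e.g. the cancer center) the scan came from. Measure the
          stamp dimensions on your own data."""
          cleaned = scan.copy()
          cleaned[:corner_h, :corner_w] = 0
          return cleaned

      # Apply to every image BEFORE training and evaluation, so the
      # stamp can't leak the label through either split.
      ```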

        • Ideonek@lemm.ee · 2 days ago

          A similar case involved scans from “mobile scanners”. Since those are used on patients too sick to be transported, their cases were disproportionately malignant. The model was effectively optimized to detect whether the scanner was stationary or mobile.
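
          You can catch that kind of confound before training with a trivial audit: check how well the acquisition metadata alone predicts the label. A minimal sketch (the field names are hypothetical placeholders):

          ```python
          from collections import Counter

          def confound_report(records):
              """records: dicts like {"scanner": "mobile", "malignant": True}."""
              counts, positives = Counter(), Counter()
              for r in records:
                  counts[r["scanner"]] += 1
                  positives[r["scanner"]] += r["malignant"]
              for scanner in counts:
                  rate = positives[scanner] / counts[scanner]
                  print(f"{scanner}: {counts[scanner]} scans, {rate:.0%} malignant")
              # If the rates differ wildly between scanner types, an image
              # model can score well just by recognizing the scanner
              # (bed rails, portable-unit artifacts) instead of the tumor.
          ```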

    • Ideonek@lemm.ee · 2 days ago

      You are describing Google Ads right now. The algorithms are getting better and better at reaching people who are already on the purchase path. It’s like handing restaurant flyers to people who are waiting for a waiter to show them to a table.

      Aren’t our ads amazing? Look, almost everyone who saw them made the purchase!

      Analytics that ignore Goodhart’s law ruin everything: movies, HR, marketing (not much left to ruin there, but you get the point), performance reviews, recommendations…
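
      The restaurant-flyer effect is easy to demonstrate with a toy simulation (all numbers made up): target only people already on the purchase path and the headline conversion metric looks spectacular, while a randomized holdout shows the ad barely did anything.

      ```python
      import random

      random.seed(0)

      # Hypothetical rates: people "on the purchase path" buy at 80%
      # with or without the ad; the ad's true incremental lift is tiny.
      TRUE_LIFT = 0.002

      def buys(on_path: bool, saw_ad: bool) -> bool:
          base = 0.80 if on_path else 0.01
          return random.random() < base + (TRUE_LIFT if saw_ad else 0.0)

      # The targeter reaches people already about to buy, so the
      # headline metric looks amazing:
      targeted = [buys(on_path=True, saw_ad=True) for _ in range(10_000)]
      print(f"conversion among targeted: {sum(targeted) / len(targeted):.1%}")

      # A randomized holdout among the SAME audience reveals the
      # ad's actual contribution:
      control = [buys(on_path=True, saw_ad=False) for _ in range(10_000)]
      lift = sum(targeted) / len(targeted) - sum(control) / len(control)
      print(f"incremental lift from the ad: {lift:+.2%}")
      ```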