• elshandra@lemmy.world
    5 months ago

    Well, AI “fact” in this use has always been made up of a combination of human fact and fiction. Nobody has been smart enough to make an AI that can reliably separate the two, to my knowledge.

    • FlaminGoku@reddthat.com
      5 months ago

      It’s all about cleaning datasets. For forecasting models, you occasionally need to remove certain historical data points to increase accuracy.

      The same could work here, but it’s obviously at a significantly larger scale and crosses into every interest and discipline.
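      To make the cleaning step above concrete, here is a minimal sketch of one common approach, outlier removal by z-score. The function name, the threshold, and the sample data are all invented for illustration; real forecasting pipelines use more careful methods.

      ```python
      # Hypothetical sketch: drop anomalous historical points before fitting
      # a forecasting model. Names and thresholds are made up for illustration.

      def clean_series(values, max_z=2.0):
          """Remove points more than max_z standard deviations from the mean."""
          mean = sum(values) / len(values)
          var = sum((v - mean) ** 2 for v in values) / len(values)
          std = var ** 0.5
          if std == 0:
              return list(values)
          return [v for v in values if abs(v - mean) / std <= max_z]

      history = [10, 11, 9, 10, 12, 500, 11, 10]  # 500 is a bad data point
      print(clean_series(history))  # the outlier 500 is dropped
      ```

      Note that with small samples a loose threshold (e.g. 3σ) can fail to flag even extreme outliers, which is why the sketch uses 2σ here.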

      I believe the solution is curated datasets, with the top members of the applicable field determining validity, or a Stack Overflow-style moderation model.

      We should basically have a “clean” copy of the internet that is always 3–6 months behind, since it only grows as quality data is added.
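      The lagged “clean copy” idea could be sketched roughly like this: a snapshot only admits documents that are old enough and have enough curator sign-offs. Every name, field, and threshold below is an invented assumption, not a real system.

      ```python
      # Hypothetical sketch of a lagged, curated corpus: admit a document only
      # if it is at least `lag_days` old and has enough reviewer approvals.
      import datetime as dt
      from dataclasses import dataclass

      @dataclass
      class Document:
          url: str
          published: dt.date
          approvals: int  # sign-offs from trusted reviewers in the field

      def snapshot(docs, today, lag_days=120, min_approvals=2):
          """Return the documents eligible for the curated corpus."""
          cutoff = today - dt.timedelta(days=lag_days)
          return [d for d in docs
                  if d.published <= cutoff and d.approvals >= min_approvals]

      docs = [
          Document("a.example/post", dt.date(2024, 1, 5), approvals=3),
          Document("b.example/new", dt.date(2024, 6, 1), approvals=5),   # too recent
          Document("c.example/spam", dt.date(2023, 12, 1), approvals=0), # unreviewed
      ]
      eligible = snapshot(docs, today=dt.date(2024, 6, 10))
      print([d.url for d in eligible])  # only the aged, reviewed document survives
      ```

      The 120-day lag plays the role of the “3–6 months behind” window: recent material sits in quarantine until reviewers in the relevant field have had time to validate it.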