Hi all!

As many of you have noticed, a number of Lemmy.World communities have introduced a bot: @MediaBiasFactChecker@lemmy.world. We introduced it because modding can be pretty tough work at times and we are all just volunteers with regular lives. It has been helpful, and we would like to keep it around in one form or another.

The !news@lemmy.world mods want to give the community a chance to voice their thoughts on some potential changes to the MBFC bot. We have heard concerns that tend to fall into a few buckets. The most common is that the bot’s comment is too long. To address this, we’ve implemented a spoiler tag so that users need to click to see the full report. We’ve also cut the wording about donations that people argued made the bot feel like an ad.
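For anyone curious how the collapsing works: Lemmy’s markdown supports a `::: spoiler` container that renders as a click-to-expand block. Here is a minimal sketch (purely illustrative; the function name and sample text are not the bot’s actual code) of how a long bot comment could be wrapped so only a one-line summary shows by default:

```python
# Purely illustrative: wrap the bot's detailed report in Lemmy's
# click-to-expand spoiler container so the visible comment stays short.
def wrap_in_spoiler(summary: str, details: str) -> str:
    # Lemmy markdown: "::: spoiler <title>" opens a collapsible block, ":::" closes it
    return f"{summary}\n\n::: spoiler Full report\n{details}\n:::"

print(wrap_in_spoiler(
    "Media Bias Fact Check: Example Outlet",
    "Bias: Left-Center\nFactual reporting: High\n(example text, not real bot output)",
))
```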

Another common concern is with MBFC’s definitions of “left” and “right,” which tend to be influenced by the American Overton window. Similarly, some feel that MBFC’s process of rating reliability and credibility is opaque and/or subjective. To address this, we have discussed creating our own open source system for scoring news sources. We would essentially start with third-party ratings, including MBFC’s, and create an aggregate rating. We could also open a path for users to vote, so that ratings would reflect our instance’s opinion of each source. We would love to hear your thoughts on this, as well as suggestions for sources that rate news outlets’ bias, reliability, and/or credibility. Feel free to use this thread to share other constructive criticism about the bot too.
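To make the aggregate-rating idea a bit more concrete, here is a rough sketch of how scores from several third-party raters could be averaged and then blended with community votes. Everything here (the class, function, weights, and sample numbers) is hypothetical, just one possible shape for an open source scoring system:

```python
# A rough sketch of an aggregate rating, assuming each third-party rater
# (MBFC among them) can be normalized to a bias score in [-1, 1] and a
# reliability score in [0, 1]. All names and weights here are hypothetical.
from dataclasses import dataclass
from statistics import mean

@dataclass
class ThirdPartyRating:
    rater: str           # e.g. "MBFC" or another rating service
    bias: float          # -1.0 (left) .. 0.0 (center) .. +1.0 (right)
    reliability: float   # 0.0 (unreliable) .. 1.0 (highly reliable)

def aggregate(ratings: list[ThirdPartyRating],
              user_votes: list[float] | None = None,
              vote_weight: float = 0.25) -> dict[str, float]:
    """Average the third-party scores, then blend reliability with
    community votes (each vote also on a 0..1 scale)."""
    bias = mean(r.bias for r in ratings)
    reliability = mean(r.reliability for r in ratings)
    if user_votes:
        reliability = (1 - vote_weight) * reliability + vote_weight * mean(user_votes)
    return {"bias": round(bias, 2), "reliability": round(reliability, 2)}

# Example: two raters plus a handful of instance votes on reliability
print(aggregate(
    [ThirdPartyRating("MBFC", -0.2, 0.8), ThirdPartyRating("OtherRater", -0.1, 0.7)],
    user_votes=[0.9, 0.6, 0.8],
))
```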

  • ZombiFrancis@sh.itjust.works · +36 / -8 · 4 months ago

    I’m gonna be Left-Center on this with reliable credibility that the bot is useless at best.

    It reports on the source, not the content, of what is posted, which is already going to be a problem for discourse.

    If there are media sources that are known or proven to be a problem, I would find it preferable for the bot to just flag that and ignore everything else.

    • jeffw@lemmy.world (OP, mod) · +8 / -18 · 4 months ago

      I appreciate the joke lol. But on a serious note, it sounds like you’re saying it’s not actually 100% useless, just that it’s being deployed too widely. Any specific suggestions on what the bot should say about those questionable sources?

      • ByteOnBikes@slrpnk.net · +12 / -1 · edited · 4 months ago

        My main issue is that it doesn’t provide any real value.

        If I see a Guardian/BBC news article about international events, I’ll give it a lot of trust. But when it’s covering England, my eyebrows go up. Calling the outlet left/right/center doesn’t help a reader understand that.

        Worse is hot garbage like The Daily Mail. They don’t fact-check or do real journalism. It means nothing to me which way they lean.

        Then the bottom of the barrel is some random news site that was spun up a month ago, like Freedom Patriot News. Of course we know where it lands on the political spectrum. But it’s extreme propaganda.

        The challenge here is that trust has become subjective. Conservatives don’t trust CNN. Democrats don’t trust Fox News. It becomes difficult to rate the quality of the organization in a binary way.

      • ZombiFrancis@sh.itjust.works · +11 / -2 · 4 months ago

        Current ownership and governance of the media outlet, generally speaking. Noting whether an outlet is state-owned, publicly traded, etc. might help.

        Does the bot even tell the difference between an opinion piece and investigative journalism?

        If a source is a proven misinformation generator, then note the proof with direct links to evidence, cases, rulings, etc. However, those sources tend to disappear quickly and new ones are constantly popping up. It’s whack-a-mole and produces an endlessly outdated list.

        The problem is that this likely isn’t information a bot can just scoop up and relay; it requires research and human effort.

        • jeffw@lemmy.world (OP, mod) · +5 / -10 · 4 months ago

          MBFC does link to articles that are examples of misinformation. And no, the bot cannot tell if something is an opinion piece or not.

          Interesting suggestion about state-owned media; I hadn’t heard that one before. Thanks for that.