• Electricd@lemmybefree.net
    2 days ago

    the harder it is for AI developing companies to improve on previous models.

    They all use each other’s data to improve. That’s federated learning!

    In a way, it’s good because it fosters more competition

    • BlackRoseAmongThorns@slrpnk.net
      2 days ago

      I was talking about AI training on AI output. AI requires genuine data; a feedback loop makes models regress. See how AI makes yellow-tinted pictures because of the Ghibli trend
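      That feedback loop can be illustrated with a toy simulation (a minimal sketch, not any real model: a Gaussian fit stands in for a generative model, and each generation trains only on samples from the previous generation's output):

```python
import numpy as np

rng = np.random.default_rng(0)

# Generation 0 sees genuine data once; every later generation
# is fit only to samples drawn from the previous fitted model.
n_samples = 20
samples = rng.normal(loc=0.0, scale=1.0, size=n_samples)

stds = []
for _ in range(1000):
    mu, sigma = samples.mean(), samples.std()
    stds.append(sigma)
    # "Train" the next generation purely on the previous model's output.
    samples = rng.normal(loc=mu, scale=sigma, size=n_samples)

# With no fresh real data, the fitted spread collapses toward zero.
print(stds[0], stds[-1])
```

      The estimated spread drifts downward every generation because each fit only ever sees its predecessor's output, never the original distribution.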

      • Electricd@lemmybefree.net
        2 days ago

        Sure, but that mainly applies when it’s the same model training on its own output. If a model trains on a different model’s output, it might pick up some of that model’s good features, but the bad ones as well

          • Electricd@lemmybefree.net
            2 days ago

            Even if they weren’t trained on the same data, the result ends up similar

            Training an inferior model on a superior model’s output can narrow the gap between the two. It won’t be optimal by any means, and you might mess up its future learning, but it works to an extent

            The data you feed it should be good quality, though
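            That setup can be sketched in a few lines (a hypothetical example, not any real system: the "teacher" is a fixed logistic model whose weights the student never sees, and the student trains only on the teacher's soft outputs):

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# "Teacher": a fixed logistic model; the student only sees its outputs.
w_teacher = np.array([2.0, -1.0])

# Unlabeled inputs plus the teacher's soft predictions on them.
X = rng.normal(size=(500, 2))
soft_labels = sigmoid(X @ w_teacher)

# "Student": same architecture, trained by gradient descent on
# cross-entropy against the teacher's soft labels.
w_student = np.zeros(2)
lr = 0.5
for _ in range(2000):
    p = sigmoid(X @ w_student)
    grad = X.T @ (p - soft_labels) / len(X)
    w_student -= lr * grad

# The student's predictions end up close to the teacher's on new inputs.
X_test = rng.normal(size=(100, 2))
gap = np.abs(sigmoid(X_test @ w_teacher) - sigmoid(X_test @ w_student)).max()
print(gap)
```

            The student narrows the gap only as far as the teacher's outputs are accurate; any of the teacher's biases transfer along with its skill, which is the "bad sides as well" point above.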