• LarmyOfLone@lemm.ee
    4 days ago

    At least it’s open source and you can run it locally, and it’s more energy efficient. Give credit where credit is due. A truly independent AI group could simply fork and review it. And, contrary to fascist talking points, not all Chinese researchers bow to the CCP.

    • Deceptichum@quokk.au
      4 days ago

      You can run Llama models locally for free; pretty much every AI player releases stuff for everyone to use at no cost. The field is surprisingly “open” like that.

      And you cannot run the real R1 locally; that still requires a huge hardware investment. You can run reduced quants, but they are barely different from the other current public offerings.
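As a back-of-envelope check on what “reduced quants” buy you, weight memory scales roughly as parameter count times bits per weight. A minimal sketch (the ~671B figure is R1’s reported total parameter count; the footprints are approximations that ignore activations and KV cache):

```python
def quant_size_gb(n_params: float, bits_per_weight: float) -> float:
    """Approximate weight footprint in decimal GB: params * bits / 8 bytes."""
    return n_params * bits_per_weight / 8 / 1e9

# Rough weight-only footprints for a ~671B-parameter model at common quant levels.
for bits in (16, 8, 4, 2):
    print(f"{bits:>2}-bit: ~{quant_size_gb(671e9, bits):,.0f} GB")
```

Even at aggressive 2-bit quantization the weights alone run into the hundreds of gigabytes, which is why running the full model cheaply typically relies on memory-mapping weights from disk rather than holding them all in RAM.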

      • LarmyOfLone@lemm.ee
        4 days ago

        It seems you can run the full R1 on a gaming rig with 96 GB of RAM, even without a GPU. I have little practical experience because of my old PC, but it seems both efficiency improvements for running it locally (at least for a group of people) and research into bias or poisoning are being done.