It’s all made from our data, anyway, so it should be ours to use as we want

  • Bronzebeard@lemm.ee · 1 day ago

    Making it open source doesn’t change how it works. It doesn’t need the training data after it’s been trained. Most of these AIs are just applying the patterns they learned to whatever new data they come across.
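
    A minimal sketch of that point (using scikit-learn purely as a stand-in, not any particular product): once the model is fit, the training data can be deleted and it still scores inputs it has never seen.

    ```python
    # Sketch: a trained model keeps only its learned parameters,
    # not the data it was trained on.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    X_train = np.random.rand(100, 4)                 # stand-in training data
    y_train = (X_train[:, 0] > 0.5).astype(int)

    model = LogisticRegression().fit(X_train, y_train)
    del X_train, y_train                             # training data is gone

    new_input = np.random.rand(1, 4)                 # data it has never seen
    print(model.predict(new_input))                  # still works: only the learned weights remain
    ```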

    • NoForwardslashS@sopuli.xyz · 1 day ago

      So you’re saying the data wouldn’t exist anywhere in the source code, but it would still be able to answer questions based on the data it has previously seen?

      • Bronzebeard@lemm.ee · 8 hours ago

        Most AIs are not built to answer questions. They’re designed to act as some kind of detection/filter heuristic, identifying specific things about an input that lead to a desired output.

        • NoForwardslashS@sopuli.xyz · 24 hours ago

          So then why, if it were all open sourced, including the weights, would the AI be worthless? Surely an identical but open source version would strip the profitability from the original paid product.

          • Bronzebeard@lemm.ee · 21 hours ago

            It wouldn’t be worthless. It would still work. It just wouldn’t be exclusively available to the group that created it, so any competitive advantage is lost.
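
            A rough sketch of that, assuming the weights are just a file that gets published (the file name and framework here are illustrative, not anyone’s actual setup): whoever holds the file can load it and get identical behaviour, so the only thing lost is exclusivity.

            ```python
            # Sketch: once trained weights are written out, any party holding the
            # file can run the model identically; exclusivity, not function, is lost.
            import pickle
            import numpy as np
            from sklearn.linear_model import LogisticRegression

            # Party A trains the model and publishes the weights file.
            X = np.random.rand(200, 3)
            y = (X.sum(axis=1) > 1.5).astype(int)
            with open("released_weights.pkl", "wb") as f:
                pickle.dump(LogisticRegression().fit(X, y), f)

            # Party B, who never had the training data, loads and uses it.
            with open("released_weights.pkl", "rb") as f:
                copied_model = pickle.load(f)
            print(copied_model.predict(np.random.rand(5, 3)))
            ```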

            But all of this ignores the real issue - you’re not really punishing the use of unauthorized data. Those who owned that data are still harmed by this.

            • stephen01king@lemmy.zip · 21 hours ago

              It does discourage the use of unauthorised data. If stealing doesn’t give you a competitive advantage, it’s not really worth the risk and cost of stealing in the first place.

              • Bronzebeard@lemm.ee · 8 hours ago

                If you can still use it after you stole it, as opposed to not being able to use it at all, then it does give you an incentive.

                • stephen01king@lemmy.zip · 7 hours ago

                  If you did all the work and the potentially criminal data collection, but everyone else gets the benefit as well, that is not an incentive. You underestimate how selfish corporations can be.

                  OpenAI wouldn’t stay at the forefront of LLMs if every competitor got to use the model they spent money training.