• FlexibleToast@lemmy.world
      25 points · 1 year ago

      From my understanding, Sam Altman is the one pursuing profits, and the non-profit board is the one overseeing that it’s done “safely.” If that’s the case, it’s the non-profit board that people should be rallying behind.

      • MojoMcJojo@lemmy.world
        1 up / 3 down · 1 year ago

        It’s like GMO crops: we currently haven’t figured out better methods to feed the planet at scale. If anyone knows how to get the billions of dollars needed to build, run, maintain, and constantly improve such a massive supercomputer for the world to use, today, by all means let us know. To be clear, I agree with you, but without something like international funding (the LHC, CERN, the ISS), a project this big just isn’t going to happen. As far as I know, for-profit is currently the most effective way to funnel resources into a project.

        • FlexibleToast@lemmy.world
          14 points · 1 year ago

          But that’s what this is. OpenAI is both for-profit and non-profit. It has a for-profit arm that made the huge deal with Microsoft and keeps the research funded, but the non-profit board oversees it to make sure it’s done “safely.” If the non-profit board gets immediately dismantled the moment it makes a correction, then it was all for show and the for-profit side is effectively unchecked.

          • MojoMcJojo@lemmy.world
            1 point · 1 year ago

            Great point. I don’t yet understand what happened to make all of this implode so quickly, but something this important is bound to hit every emotional fault line we have as humans. I think this showed how weak the board was, for something like this to happen so fast. A governing board is supposed to slow down rash decisions, not generate them. Either something drastic happened, or someone acted rashly. This should be a deliberate and rational endeavor. The people building AGI should never be surprised by the decisions being made, if not for the benefit of humanity, then at least for the billions of dollars on the line. WTF OpenAI, get your shit together.

            • FlexibleToast@lemmy.world
              2 points · 1 year ago

              A governing board is supposed to slow down rash decisions, not generate them.

              Yes, this is the weird part. The board seemed to do what it was designed to do but did it in the worst way possible.