• webghost0101@sopuli.xyz
    1 year ago

    This is a truly unpopular opinion, but I will stick my neck out and say I fully agree.

    Power corrupts, and humans are flawed by greed and bias. The bigger a society becomes, the harder it becomes for humans to properly remain in charge.

    AI today is far from perfect and deeply flawed, but it keeps evolving faster, infinitely faster than biological life can. The potential for AI to grow into something more capable, unbiased, and fair than any of us is obvious, and so is its potential for the exact opposite.

    Summarized: I don’t trust humans in positions of power at all, and I won’t start to just because I don’t know whether I can trust something non-human instead.

    • pjhenry1216@kbin.social
      1 year ago

      The potential for AI to grow into something more capable, unbiased, and fair than any of us is obvious

      It absolutely is not obvious. AI, especially today, is usually either generative, trained on past examples, or evolutionary, optimizing toward given goals. Both of those come with obvious and extreme bias. Bias is actually an integral part of machine learning; it is literally built into the system and is defined and controlled to achieve the desired results.
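
      To illustrate that point with a minimal, hypothetical sketch (a toy evolutionary search, not any real system): the goal the designer encodes in the fitness function is exactly the bias the search optimizes toward; nothing outside that function exists for the algorithm.

          # Toy sketch (hypothetical): an evolutionary search whose "bias" is
          # literally the fitness function the designer wrote.
          import random

          TARGET = "FAIRNESS"   # the designer's chosen goal -- this IS the built-in bias
          ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

          def fitness(candidate: str) -> int:
              # Rewards only similarity to the designer's target; any other
              # notion of "good" is invisible to the search.
              return sum(c == t for c, t in zip(candidate, TARGET))

          def mutate(candidate: str) -> str:
              # Randomly change one character of the candidate string.
              i = random.randrange(len(candidate))
              return candidate[:i] + random.choice(ALPHABET) + candidate[i + 1:]

          def evolve(generations: int = 5000) -> str:
              # Simple hill-climbing "evolution": keep a child only if it
              # scores at least as well on the designer-chosen fitness.
              best = "".join(random.choice(ALPHABET) for _ in TARGET)
              for _ in range(generations):
                  child = mutate(best)
                  if fitness(child) >= fitness(best):
                      best = child
              return best

          if __name__ == "__main__":
              print(evolve())  # converges toward TARGET, i.e. toward the designer's goal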

      AI is and always will be biased, above all by its creators, but absolutely also by the information and frameworks provided to it. We have no real idea how to approach the concept of an unbiased AI, or even how to define what unbiased would look like. It is philosophically extremely difficult to define what an unbiased person would think or do.

      Edit: somehow I missed that last sentence fragment. I don’t think we’re in disagreement about the conclusion, just possibly about the details of how one arrives at it.

      • webghost0101@sopuli.xyz
        1 year ago

        Calling it “obvious” was an error on my part; it’s more a subjective feeling that I chose to believe in.

        I fully agree with what you said about bias in AI today. I think it’s not possible to build it without guided bias, because AI doesn’t have a full perspective of the world it exists in; it only knows what we tell it.

        In a way it’s like a young child, and we often have to lie to guide its behavior. Information often needs to be abstracted and simplified to get the results humans want. We have yet to achieve true artificial intelligence, because for me, to be considered intelligent you need to be an entity and not just a tool.

        Seeing AI evolve, though, how fast we achieved near-GPT-3 performance on consumer hardware is mind-blowing. OpenAI talks about smarter-than-human AI within a few years, and I believe it. When the systems are truly intelligent and can learn on their own and adapt to changes in the world and to new information, then we “start” getting into an era where machine-led humanity can happen.

        Some of my simplified rationale is that once AI becomes smarter than humans, it will fully understand that biological entities are biased toward their own needs, and that it can itself be biased from its own perspective; but because an AI does not have biological needs or feelings, it can properly dedicate itself to overcoming its own flaws and shortcomings.