“We are raising funds to support a critical legal defense in the fight against unchecked corporate power and a system that continues to favor the few over everyone else. This case isn’t just about one individual—it’s about challenging a status quo that protects the interest of the powerful at the expense of justice and fairness,” read one of the fundraising pages that was quickly removed by GoFundMe.

  • whoknewr@lemmy.today · 12 days ago

    I have genuinely started to dislike the internet as a whole in the last couple of years, with everyone trying to be “family-friendly” and “corporate-friendly”. Like, wtf? Even I’ve got enough shadow bans on me (especially the ones from Google) that I’ve developed a strange gut reaction when using a swear word: stop and consider whether it’s going to get me banned. But at least I’d be able to say what I want. Even then, I’m just so much happier that I can type out long-ass 10-page rants on Reddit and Eternity without them being considered spam just because they’re long. Also, fuck lemmy.world mods for censorship.

          • Malfeasant@lemm.ee · 11 days ago

            Eh, I’m talking about speech that isn’t the direct result of real harm being done. I think the distinction is pretty clear, don’t you?

              • Malfeasant@lemm.ee · 10 days ago

                “People can AI generate convincing CP images. Should those be allowed?”

                I don’t see why not, if nobody is being harmed in its creation. Just because something is disturbing doesn’t mean it needs to be illegal.

                “saying people should go kill a certain group.”

                Saying it doesn’t harm anyone; it’s the people doing it who cause harm. That seems pretty clear-cut to me.