

A paperclip maximizer driven by self-preservation? What could possibly go wrong?
Seer of the tapes! Knower of the episodes!


Pirate King: HE DID?!? … oh… oh, yes so he did… I was there.
Who grades the test? Who judges the competition?


Yes


Sorry, I lost the world’s smallest violin. This is the best I can do: 🖕


No, it’s “re” like the subject of an email. “Re: diculous”


30 years ago my music teacher told me that in Chinese-language singing it’s the consonants that are sustained.


Are there examples of censorship or prior restraint you’d like to highlight?
Ctrl-F “plato”
Required reading
?


There’s a movie where the president (actually a decoy) fakes a stroke during a speech to Congress.


252.6 hours played, last played October 2024.
It’s enjoyable, but I’ve never really been engaged with it. There’s no progression; I don’t feel like my character, equipment, or ships are getting better even though I’m upgrading things. No planet is special, even though they’re all unique.
I think it would be better if you started out in a “settled” region with interesting factions, hand-designed planets, optional quest lines, etc. The infinite procedurally generated stuff would come into play once you pushed beyond the edges of known space.


You may enjoy Fritz Leiber’s short story, “A Pail of Air”, which involves the Earth being ejected.
Funny “Haha” or funny “Uh Oh”?


There is no such thing as an innocent billionaire.


Yet Trump can declassify documents by thought alone.


“I have too much self respect and dignity, love my family way too much, and do not want my sweet district to have to endure a hurtful and hateful primary against me by the President we all fought for, only to fight and win my election while Republicans will likely lose the midterms,” she wrote in a statement.
She could just not run for re-election if that’s what she’s actually worried about. Resigning early doesn’t make sense.
The problem is that an AI built to maximize paperclips might conclude that converting the planet to paperclips is an acceptable cost of maximizing paperclip production. It might understand why humans think it’s bad to convert the planet, but disagree. It would need to be explicitly programmed to prioritize human life over paperclips.
If it were super-intelligent, it could probably trick us into leaving it turned on.
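The point about needing to explicitly encode human priorities can be sketched as a toy objective function. This is a hypothetical illustration, not anything from the original thread: the action names and scoring weights are made up, and the penalty term stands in for “explicitly programmed to prioritize human life.”

```python
# Toy sketch: a maximizer that scores actions only by paperclip count
# picks the catastrophic option; adding an explicit human-life penalty
# term to the objective changes the choice. All values are illustrative.

actions = {
    "run one factory":    {"paperclips": 1_000,  "humans_harmed": 0},
    "convert the planet": {"paperclips": 10**15, "humans_harmed": 8_000_000_000},
}

def naive_score(outcome):
    # Humans never enter the objective, so they carry no weight at all.
    return outcome["paperclips"]

def aligned_score(outcome):
    # A hand-coded penalty so large that no paperclip gain can outweigh it.
    return outcome["paperclips"] - 10**30 * outcome["humans_harmed"]

best_naive = max(actions, key=lambda a: naive_score(actions[a]))
best_aligned = max(actions, key=lambda a: aligned_score(actions[a]))
print(best_naive)    # "convert the planet"
print(best_aligned)  # "run one factory"
```

The naive agent isn’t malicious; it simply has no term in its objective for anything humans care about, which is the whole problem.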