It’s not always easy to distinguish between existentialism and a bad mood.

  • 21 Posts
  • 704 Comments
Joined 3 years ago
Cake day: July 2nd, 2023



  • (1) A lot of the stuff Netflix axed as soon as it hit three seasons rather than pay the guild-mandated raises, like Santa Clarita Diet. Also on the subject of Netflix, the cancellation of Dark Crystal: Age of Resistance after one season was completely unforgivable; a really high-quality labor of love that didn’t deserve to be done in like that.

    (2) Recency bias, but Redliners by David Drake, I guess; military sci-fi should fit the style. It’s the story of a mid-alien-war settler expedition in non-enemy territory that, on the dl, was supposed to double as a more active means of reintegrating its guard detachment of PTSD’ed-out veterans (the titular redliners) back into society, and that goes immediately, terribly awry.

    (3) The Matrix and Oldboy for perfecting various aspects of modern film-making into cultural milestonehood, but realistically any aspirant would probably be far better served by binging MST3K and Garth Marenghi’s Darkplace.

    (4) Star Trek, probably. Its parodies already seem more in the spirit of the original than the current series anyway.





  • I checked it out because I was curious whether CEV was some international relations initialism I’d never heard of; turns out it’s just My Guess About What He Wants in rationalese.

    Excerpt from the definition of Coherent Extrapolated Volition, or how to damage your optic nerve from too much eye-rolling:

    Extrapolated volition is the metaethical theory that when we ask “What is right?”, then insofar as we’re asking something meaningful, we’re asking “What would a counterfactual idealized version of myself want* if it knew all the facts, had considered all the arguments, and had perfect self-knowledge and self-control?” (As a metaethical theory, this would make “What is right?” a mixed logical and empirical question, a function over possible states of the world.)

    A very simple example of extrapolated volition might be to consider somebody who asks you to bring them orange juice from the refrigerator. You open the refrigerator and see no orange juice, but there’s lemonade. You imagine that your friend would want you to bring them lemonade if they knew everything you knew about the refrigerator, so you bring them lemonade instead. On an abstract level, we can say that you “extrapolated” your friend’s “volition”; in other words, you took your model of their mind and decision process, or your model of their “volition”, and you imagined a counterfactual version of their mind that had better information about the contents of your refrigerator, thereby “extrapolating” this volition.









  • Luckily we should be getting trickle-down free will, since all universes are (of course) able to develop the technology to perfectly simulate universes of lesser complexity, which seems to imply the existence of a special universe of ultimate complexity from which all others emanate, possibly in line with Ain Soph or some equivalent mystical concept.

    I don’t know how that squares with the blabbing about the Tegmarkian multiverse that supposedly posits that mathematically simple universes “exist ‘more’”, which Siskind probably just included to reinforce his consciousness-as-a-non-physical-mathematical-object premise.