Encultured AI was started in 2022 by regulars of AI doomsday site LessWrong to work out new safety benchmarks for existential risk — whether the AI would turn us all into paperclips. [LessWrong, 20…
Why so general? The multi-agent dynamical systems theory needed to heal internal conflicts such as auto-immune disorders may not be so different from those needed to heal external conflicts as well, including breakdowns in social and political systems.
This isn’t an answer to the question “why so general?” This is aspirational philosophical goo.
“multi-agent dynamical systems theory” => you mean any theory that takes a composite view of a larger system? Like Chemistry? Biology? Physics? Sociology? Economics? “Why so general” may as well be “why so uncommitted?”
I feel Bayesian rationalism has basically missed the point of inference and immediately fallen into the regression-to-the-mean trap of “the general answer to any question shouldn’t say anything in particular at all.”
Yeah, that’s a good callout. I do feel the “meta is good” obsession is borderline definitely cultish.
There’s a big difference between a committed scientist doing empirical work on specific mechanisms saying something like “wow, isn’t it cool how taking a broader perspective shows how unrelated parts work together to create this newly discovered set of specifics?” and someone committedly anti-institutional saying “see how, by me taking your money and offering vague promises of immortality, we are all enriched?”
Saying anything in particular makes you open to fact-based criticism. I mean, it is object-level and bad, instead of meta-level and good.
Don’t bother me with facts, I’m trying to think!