Tolerate Tolerance

  • One common characteristic of rationalists is a lower-than-usual tolerance for flaws in reasoning
  • However, we shouldn’t punish non-punishers
  • Only judge people for the mistakes they make, not the mistakes they tolerate in others

Your Price for Joining

  • People in the atheist/libertarian/technophile cluster set their joining prices too high
  • The group doesn’t have to be perfect for you to make a positive difference by taking part
  • If your objection doesn’t outweigh the net positive impact you’d be making on the world, swallow it and join the group anyway
  • If you do feel strongly about your objection, make sure your objection is your true objection
  • “If the issue isn’t worth your personally fixing by however much effort it takes, and it doesn’t arise from outright bad faith, it’s not worth refusing to contribute your efforts to a cause you deem worthwhile”

Can Humanism Match Religion’s Output?

  • Is it possible to have a group of rationalists as coordinated and as motivated as the Catholic Church?
  • If mental energy is limited, it is possible that some false beliefs are inherently more motivating than any true belief
  • Can we use something like CBT and Zen meditation to get the same motivation religion provides, without the negative side effects?
  • Perhaps if rationalists were co-located, they’d accomplish more, just from the group accountability
  • We should have a group norm where caring strongly is applauded

Church vs. Taskforce

  • How can we fill the emotional needs that religion satisfies, once religion is no longer an option?
  • Most of the things that fulfill our desire for community are not explicitly designed to fulfill that purpose
  • Is it even possible to have a community with no other purpose?
  • Maybe rationalist communities should be organized more like task forces than like general-purpose communities
  • Communities develop around common goals anyway; let’s improve upon that process rather than trying to replace it entirely
  • Let’s have a real higher purpose, instead of the illusory ones offered by religions

Rationality: Common Interest of Many Causes

  • The purpose of Less Wrong is to create more rationalists
  • However, this is just a means
  • The end is to have more support for causes that would benefit from the world having more rationalists, chiefly AI X-Risk mitigation
  • All of the causes that benefit from increased rationality (AI X-Risk, atheism, marijuana legalization, etc.) should work to increase the number of rationalists
  • Your cause won’t capture 100% of the benefit of your work to increase rationality, but in exchange you’ll pick up a share of the benefit whenever someone else does the same
  • Instead of trying to position causes as the best thing, we should position them as good things
  • Instead of trying to figure out the single best thing to work for, we should maintain a portfolio of good causes that all collaborate to raise the rationality waterline, creating a positive feedback loop that benefits everyone