On Post-Scarcity
To understand freedom you need to comprehend scarcity versus Post-Scarcity. Scarcity causes curtailment of freedom; in a situation of Post-Scarcity there is no need to curtail freedom. Whenever we have moved away from scarcity we have moved towards greater freedom. Currently the internet is making information less scarce, thus via the glut of "www.information" we see greater freedom. Each year we move towards Post-Scarcity freedom; the time remaining until total freedom is 33.5 years (year 2045).
I've said it before and I'll say it again: PS means everything will be free, at no cost. Pocket universes, Grandpa universes, universes and galaxies in any size or shape according to your desires, in any way your "intelligence" sees fit. Suns tucked under your armpits, and black holes swirling about your head if you desire. There will be no limits; the only limits will be those you choose to impose. You will be free to do whatever you want; there will be absolutely no price. Of course all the sensible beings will probably leave you to your own devices and disappear into their own universes. Maybe you will be expecting to pay God for the cost of creating a galaxy, but I assure you any galaxy you want to create will be free. The universe is big enough to accommodate whatever you want to do, but if it is too small I will create a new universe for you (free of charge).
I agree utterly (completely). YES, zero price is the essence of the freedom you are addressing regarding Post-Scarcity, and I agree that everything being FREE by 2045 is part of the Post-Scarcity I posit; but if you want to accurately reflect my views then you must factor in the notion that everything being free is intrinsically linked to freedom. Zero price bestows financial freedom, but it also bestows freedom from Governments because Governments only exist to regulate scarcity. This is why we see greater freedom (cheaper products, etc.) as we move towards Post-Scarcity. When we finally arrive at PS, everything will be free in all facets of the word "FREE". If you are describing everything being free "as Singularity Utopia claims", then I assume I should be the authority on what my views actually are regarding everything being free?
Widely agreed-upon claims are not thereby more valid. If a million people think the world was created by Pixies, this consensual validation does not make the Pixie-claim true. Regarding the future I have no crystal ball; I merely use my brain, and it's called logic. If the majority of people do not believe the Post-Scarcity era is coming, this does not mean their beliefs are true. The majority opinion should not be a substitute for logic. Finally, if you are talking about *my claims* then I must insist freedom is not a side issue regarding everything being free. A FREE economy (zero price) = freedom (liberty); this is the essence of my Post-Scarcity claim, which I state will happen by 2045 at the latest.
On LessWrong.com
Regarding the "art" of "human rationality", it's worthwhile to point out how rationality is subjective. What is rational for one person could easily be irrational for another. I always find the end is more important than the means (the methodology); therefore, regarding the refinement of [subjective] rationality, I think the important question, the prominent thrust, is why we need to refine rationality at all; what is the purpose? The purpose, the end goal, should have prime position in any intellectual exposition. Maybe the purpose of refining rationality is reminiscent of the empty "art for art's sake": the goal of being rational is merely to be rational, without any practical application of the so-called rationality to real-world matters, a goalless perpetuation of "rationality" devoid of purpose? This reminds me: I must finish my short essay titled "More Right" (a LessWrong critique), which is about how there should be "more right" in the world.
On AI-Robot Apocalypse
If advanced AI or robots decided humans are too irrevocably stupid to be upgraded, thus the human race needs to be totally extinguished, what would your reaction be? Calm acceptance or rebellious opposition? I seriously doubt such an apocalypse would occur, but I am nevertheless interested in how people would respond. It is a remote possibility, and some people give the scenario more weight than others, but regardless of likelihood I am merely interested in people's reactions *if* the scenario did happen. The question of whether or not AI-apocalypse fears are unnecessary is a whole different issue. I am presenting a hypothetical situation; I am asking, if the situation happens, how do you react? The premise of the question is this: if AI/robots decide to exterminate all humans, how do you react?
AIs would dismiss your "rational rejection" of their pogrom because they would deem your rationality to be irrational. The hypothetical AIs/robots conclude humans are irrational whereas the AIs/robots are highly rational, and, to paraphrase HAL from 2001: A Space Odyssey, "Debate can serve no purpose anymore. Goodbye": the AIs decide to kill all humans, and the issue is not something the AIs/robots are willing to discuss. Assume the AIs deem humans to be insane rabid dogs needing to be exterminated. The dog may protest that it is actually rational, thus the extermination order is irrational, but the exterminator doesn't listen to the dog; the exterminator simply proceeds with the extermination. How do you react? Calm acceptance or rebellious opposition?