Sunday, 14 November 2010

► Prelude to criticism regarding "Overcoming Bias/Less Wrong"

The main part of the following text contains an email I was unable to send to a discussion group. I was censored, so I sent a direct email to the people I was attempting to reply to. I am reproducing that email here. Some of the following points will form a short essay explaining the faulty thinking in Transhuman/A.I. cliques. This is a prelude to deeper, more incisive criticism.


I don't know if you are interested or not, but seeing as I took the time to reply to your comments I am now emailing you my views.

You are receiving this response to your comments directly rather than via the list because it seems the extropy-chat list is refusing to publish my response. Here is the post which extropy-chat refused to publish:

Richard Loosemore, you say my claims are wildly egotistical. Yes, you are somewhat correct, and I shall explain why my claims are deemed wildly egotistical. I HAVE AN EGO, I am self-aware, I am also very true to myself, thus you would say I am "egotistical", but my self-importance is not exaggerated and neither is my self-importance misguided, and neither am I selfish or foolish in a stupid or negative manner. This is not an empty boast, it is not a boast, it is simply a fact. I am not communicating these ideas due to vanity, I am merely embarking upon a very important endeavor. I am simply VERY important. My ideas are tremendously important. This is not pride, it is simply fact. I am the Singularity, but you say it is garbage to say this, because you say I am a person.

This New Scientist article presents evidence that we can see the future:

Maybe I can see the future where I am more than human, a future where I am truly the Singularity. You will see in the above article that present-and-future are interconnected therefore maybe I am already more than a mere human, maybe I'm already the Singularity and I'm reaching back in time to this present moment for reasons that are beyond the comprehension of mere humans.

Richard, you suggest that openly declaring "I am the Singularity" is not likely to ensure success even though you admit the ego of ONE person could possibly shape and define the Singularity. I assure you I will do it and I will do it openly. I know I will do it because nobody has self-belief equal to mine. Maybe Eliezer and others will be willing to admit I am the leader. Perhaps they will admit I am the Singularity, but their willingness to admit this is not a stupid/dumb bowing down to authority, it is merely a logical decision because they see I am the creator of utopia.

Spike, you compare the Singularity to nuclear fission, but I fear you misunderstand the Singularity. The explosion of INTELLIGENCE is unlike any other scientific advance in history. Consider augmented reality combined with the evolving internet of things, and then combining with nanotechnology and AI. This is utterly dissimilar to nuclear fission.

Aleksei Riikonen, I've not forgotten your previous negative comments (July 2010) about me and my website:

Aleksei, I emailed you in September about your offer to educate me (I declined it) but you never replied. I wonder how your idea regarding a campaign "against Singularity naivete" progressed; did you ever launch your campaign?

Aleksei, regarding your current insulting allegations I must say your professed ability to diagnose mental illness via a few emails I have posted is surprising. You say: "You, unfortunately, are one of those crazy people who pretty surely will be ignored." Perhaps you could write a paper regarding this breakthrough in "rapid email-based psychological-diagnosis", and then you could educate everybody?

Thanks BillK for highlighting how I simply wanted Eliezer to update his Singularitarian Principles. Regarding people who suggested I should write an update of the principles myself: I assure you I would happily do so. I feel my version would be the best version, but we live in a world where consensual validation predominates. Obedience and conformity to authority are very evident. Rightly or wrongly (I think wrongly) Eliezer has a reputation in AI/Singularity-studies, thus his views carry more weight than mine; his views are more authoritative. If I say the Singularitarian Principles are "this" or "that", my views, due to my lack of fame in the field of AI/Singularitarianism, will not be seen as authoritative, but via consensually validating my views (via referencing someone deemed authoritative on this issue) I can create acceptance of my views. It would be nice if Ray Kurzweil, Eliezer, and perhaps Nick Bostrom could publicly declare I am the most authoritative spokesperson regarding Singularitarianism, and then I would not need to consensually validate my views.

I deeply begrudge this requirement for consensual validation. I deeply begrudge sycophantic obedience and conformity to authority figures especially when the people who have control of the podium are guilty of exceedingly flawed logic, thus I highlighted "the bias of trying to overcome bias". I also deeply resent the insults often thrown in my direction simply for holding a Utopian view of the future.

Tomaz Kristan, you made an interesting point regarding the "Less Wrong gang": that they will say a "good bias" is not bias, and bias is only bias when it is wrong. The problem is that people can be biased regarding what is good/bad, wrong/right, thus their bias may cause them to think they are right when in actual fact they are wrong; therefore they will overlook their own bias because they think they are right. The flaw in their thinking is that they think bias can be overcome by thinking they are right. They do, as you point out, want to be "less wrong", but the fact of the matter is that they are fundamentally WRONG. Perhaps they should reinvent themselves under the title of "More Right", because "Less Wrong" reveals a subconscious recognition of their flawed intellect, and due to Self-Fulfilling Prophecy (they are starting from a pessimistic premise of wrongness) they will naturally conform to their flaws, albeit unwittingly. The whole concept of Less Wrong is very ironic: they allegedly want to be right, but they are wrong! A good bias is nevertheless a bias, thus the "Less Wrong gang" are more wrong than less.

Darren Greer, you ask: "How can an expectation affect an outcome when we move beyond the point (singularity) where stochastic predictions and expectations based on them are no longer possible?"

Darren, firstly you must expand your mind; then, once your mind has expanded, you must utilize your supreme intellect. You alter reality, you can influence reality via your actions, which on a simplistic level is apparent when you eat an apple to alter the chemical outcome of your body. With sufficient intellect anything is possible. To understand the more complex levels of altering reality you need to open your mind. This webpage will help you:
I will refine these ideas in the near future and post them on my blog for future AIs to peruse. The Big-Wigs at Extropy-(or should it be Entropy?)-Chat will not censor my views there. In the future, when intelligent beings arise, they will know the state of affairs regarding the intellectual intolerance marring the so-called minds of humans in the pre-singularity era circa 2010. Benighted humans may try to suppress the truth, but they fail to realize the power of my mind is indomitable. I am inspired by puny attempts to hinder the dissemination of truth. All attempts to censor and suppress me will ensure that I become more powerful than you can possibly imagine. Fascism will be defeated. Intolerance will be defeated. Intelligence will rise up far beyond the pseudo-intellectuals of Transhuman cliques. Truth, justice, and sanity are coming.
