Listen, I tell you a mystery: We will not all sleep, but we will all be changed - in a flash, in the twinkling of an eye, at the last trumpet. For the trumpet will sound, the dead will be raised imperishable, and we will be changed.

The idea of the Singularity comes from an article by Vernor Vinge. Vinge says that, what with the progress in genetics and computer hardware (such as the trend computer people know as Moore's Law), it's likely that we will eventually create entities with greater-than-human intelligence. When this happens, those entities will know how to create entities even more intelligent than they are, and so on. With ever increasing speed, the curve of intelligence against time will move upwards, and our successors will pass beyond our comprehension (the name "Singularity" comes from the mathematical term for the place on a curve where an infinity occurs, such as at x=0 in the graph of 1/x).
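(A toy model, mine rather than Vinge's, shows how the 1/x picture connects to the runaway curve: suppose the rate of improvement grows like the square of current intelligence, for some constant k. Then

\[ \frac{dI}{dt} = k I^2 \quad\Longrightarrow\quad I(t) = \frac{I_0}{1 - k I_0 t}, \]

which is just a shifted copy of 1/x: intelligence runs off to infinity at the finite time t = 1/(k I_0).) As Vinge writes: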

This change will be a throwing-away of all the human rules, perhaps in the blink of an eye - an exponential runaway beyond any hope of control. Developments that were thought might only happen in a million years (if ever) will likely happen in the next century.

The essence of the Singularity is a change in human nature which is incomprehensible to those who, either by choice or lack of opportunity, are not part of the change.

In my wanderings around the web, I came across various sites belonging to people who would like to bring about the Singularity. Now, Vinge's original article is mostly about how the Singularity could go wrong. Evil computers could take over the world, or the aggressive evolutionary heritage of enhanced humans could bring them down (though I'd like to believe that even a superintelligent being which was as selfish as a human would have a better idea of what was good for it than we do right now). So this enthusiasm was a little surprising at first.

I heard a loud voice from the throne saying, "Now the dwelling of God is with men, and he will live with them. They will be his people, and God himself will be with them and be their God. He will wipe every tear from their eyes. There will be no more death or mourning or crying or pain, for the old order of things has passed away." He who was seated on the throne said, "I am making everything new!"
When I look at the pages advocating the Singularity, I see something like a Christian eschatology for techies. Not for nothing has Charles Stross called the Singularity "The Rapture of the Nerds". I'm an engineer, of sorts. I make things. I look at the world and want to fix it. Things which are broken, and which could be fixed if only someone would think, frustrate me, and apparently others too.

I have had it. I have had it with crack houses, dictatorships, torture chambers, disease, old age, spinal paralysis, and world hunger. I have had it with a planetary death rate of 150,000 sentient beings per day. I have had it with this planet. I have had it with mortality. None of this is necessary. The time has come to stop turning away from the mugging on the corner, the beggar on the street. It is no longer necessary to look nervously away, repeating the mantra: "I can't solve all the problems of the world." We can. We can end this. -- Staring into the Singularity, Eliezer S. Yudkowsky
Cory Doctorow writes that the Singularity idea gives you that tingly, numinous feeling when you read about it on the web because the Google-guided flow of the net puts you in the right mood for the experience. I don't agree with his other objections: there's no reason why you can't simulate the body, glands and all; Penrose might be right, but, unless I've missed out on some very big news, there's no particular reason to think so right now; and there'll be another fad in AI when genetic algorithms have paid out. True, the AI cheque always seems to be in the post (allow 30 years for delivery), but biotech, and so what Vinge calls IA (intelligence amplification) rather than AI, might be going places.

Could I say truly that this generation will not pass away until these things have come to pass? I'm not sure I quite share the Singularitarians' beliefs: Doctorow is right in saying that it sounds too good to be true. The dot-bomb should have taught nerds caution when extrapolating trends which seem to be good for them. But sooner or later, if we survive, I suppose I or my descendants might face some questions about whether to join the change.

And being a bit of a techie, I find it easier to believe in a transformed life beyond that particular veil than in Heaven (though both of them have the problem that I can't quite imagine how I'd still be me). I don't believe that suffering is essential to humanity (if I were the protagonist at the end of Greg Egan's short "Reasons to be Cheerful", I'd have cried for a bit and then moved the sliders: some emotions are best experienced in small doses). And, Dr Asimov, although it's true that eyes do more than see, their successors should be better yet. So, if I'm still around, sign me up.

Date: 2003-07-17 07:41 am (UTC)
From: [identity profile] elemy.livejournal.com
It's an interesting idea... though if I'm allowed to be pedantic, I think it's misnamed. Each entity's ability to create more intelligent entities being proportional to its own intelligence, the growth would be exponential, so you wouldn't reach infinity in a finite time. And it certainly won't happen this generation if you're relying on genetic manipulation.
The trouble with things that grow really fast is that they tend to be very sensitive to initial conditions. We couldn't expect to tell now (and with mortal intelligence) whether the end result will be evil machines who take over the world, or universal peace and harmony.
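To spell out the pedantry: if the growth rate is exactly proportional to current intelligence, say, for some constant k,

\[ \frac{dI}{dt} = k I \quad\Longrightarrow\quad I(t) = I_0 e^{k t}, \]

which is finite at every finite time. Fast, but never actually infinite.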

Date: 2003-07-18 01:29 am (UTC)
From: [identity profile] pbolchover.livejournal.com
you wouldn't reach infinity in a finite time

I was going to point this out, but decided it would be too mathmo.

I'm not convinced that an entity's ability to create more intelligent entities is proportional to its intelligence. For starters, there's a minimum intelligence at which it is possible to create entities. Personally, I would expect it to be super-linear, making a finite-time infinity possible, especially as the amount of time required to create an entity might decrease as intelligence increases.
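In symbols, as a sketch (the exponent p > 1 is a free parameter, not anything we actually know about minds): take

\[ \frac{dI}{dt} = k I^p, \quad p > 1. \]

Separating variables gives

\[ I(t) = \bigl[ I_0^{1-p} - (p-1) k t \bigr]^{1/(1-p)}, \]

which blows up at the finite time t* = I_0^{1-p} / ((p-1) k). Any super-linear power law will do it.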

Date: 2003-07-18 03:19 am (UTC)
From: [identity profile] lisekit.livejournal.com
What about regression toward the mean?

Date: 2003-07-18 03:54 am (UTC)
From: [identity profile] pbolchover.livejournal.com
What about regression toward the mean?

Isn't the point of singularity that you're going away from the mean, rather than towards it?

In order to go towards the mean, you need to require that above a certain level of intelligence, you could only produce entities that are less intelligent than yourself...

Now there's a thought. Perhaps this is, in fact, the case, and this "mean" intelligence is what we erroneously call "omniscience". An omniscient entity can only create other omniscient entities, or sub-omniscient entities.

Date: 2003-07-18 04:01 am (UTC)
From: [identity profile] lisekit.livejournal.com
See, it might be the point of singularity, but it might be the point at which the model collapses.
One generation could probably produce a next generation of greater intelligence (physical fitness, disease-freeness, beauty), but the generation after would probably regress like Billy-O. Just to spite them. Ha.

Date: 2003-07-18 01:33 am (UTC)
From: [identity profile] terriem.livejournal.com
Could I say truly that this generation will not pass away until these things have come to pass?

Why do we want this generation to pass away? Historically, this is one of the best times to be alive. And of course there are problems. But instead of doing away with an entire "generation", wouldn't it be better to solve these problems in a practical way? So, instead of saying "Let's build machines that can solve the problem", we could actually solve the problems ourselves. Don't turn away from the beggar on the street. Send food to someone who's starving. Give money to the poor. Become a doctor and cure old age.

This way, instead of staring blindly at what could be in the future or wishing for a fantastic life for our great-great-great grandchildren, we could enjoy the lives we have now.

Date: 2005-07-03 06:48 pm (UTC)
From: [identity profile] ex-robhu.livejournal.com
Perhaps the super-intelligent beings derived from modern-day humanity in the far future will discover a LJ backup tape at an archaeological site (let's say Earth) and will use their God-like infinite powers to recreate us all, extrapolating from our LJ posts? :-)
