Origins of the Singularity
"The AI does not hate you, nor does it love you, but you are made out of atoms which it can use for something else." - Eliezer Yudkowsky
"Since civilizations always face problems that require continuously greater activity, it is likely that supercivilizations will undertake activities and construct structures on a very large scale. Properties and means of detection of such superstructures and activities are discussed." - Nikolai Kardashev, the Soviet astrophysicist who proposed this scale of civilizations in 1964

According to this model we are a Type 0 civilization, not even Type 1 yet. Once the technological singularity arrives, we would quickly achieve Type 1 status.

The idea of an "intelligence explosion" originates with the British mathematician Irving John Good, who wrote in 1965,
Let an ultraintelligent machine be defined as a machine that can far surpass all the intellectual activities of any man however clever. Since the design of machines is one of these intellectual activities, an ultraintelligent machine could design even better machines; there would then unquestionably be an 'intelligence explosion,' and the intelligence of man would be left far behind. Thus the first ultraintelligent machine is the last invention that man need ever make.
But there are earlier exponents of this view. Robert Thornton, a religious writer in Michigan, published a newspaper, The Primitive Expounder, which, like many others of the time in America, discussed modern developments and interpreted them through the Bible. He wrote concerning the newly invented mechanical calculator,
...such machines, by which the scholar may, by turning a crank, grind out the solution of a problem without the fatigue of mental application, would by its introduction into schools, do incalculable injury. But who knows that such machines when brought to greater perfection, may not think of a plan to remedy all their own defects and then grind out ideas beyond the ken of mortal mind!
One conversation centered on the ever accelerating progress of technology and changes in the mode of human life, which gives the appearance of approaching some essential singularity in the history of the race beyond which human affairs, as we know them, could not continue. - Stanislaw Ulam, recalling a conversation with John von Neumann

The idea was further developed by Vernor Vinge, who, in a 1983 issue of Omni Magazine, actually used the term singularity in regard to intelligent machines when he wrote,
We will soon create intelligences greater than our own. When this happens, human history will have reached a kind of singularity, an intellectual transition as impenetrable as the knotted space-time at the center of a black hole, and the world will pass far beyond our understanding. This singularity, I believe, already haunts a number of science-fiction writers. It makes realistic extrapolation to an interstellar future impossible. To write a story set more than a century hence, one needs a nuclear war in between ... so that the world remains intelligible.

Vinge went on to write an article in 1993 entitled "The Coming Technological Singularity," in which he concluded,
Which is the valid viewpoint? In fact, I think the new era is simply too different to fit into the classical frame of good and evil. That frame is based on the idea of isolated, immutable minds connected by tenuous, low-bandwidth links. But the post-Singularity world _does_ fit with the larger tradition of change and cooperation that started long ago (perhaps even before the rise of biological life). I think there _are_ notions of ethics that would apply in such an era. Research into IA and high-bandwidth communications should improve this understanding. I see just the glimmerings of this now, in Good's Meta-Golden Rule, perhaps in rules for distinguishing self from others on the basis of bandwidth of connection. And while mind and self will be vastly more labile than in the past, much of what we value (knowledge, memory, thought) need never be lost. I think Freeman Dyson has it right when he says: "God is what mind becomes when it has passed beyond the scale of our comprehension."

What is the Future for Humanity?
This is the BIG question, and there are two basic camps on it: the optimists and the pessimists. So far, everything that mankind creates seems to have a dual effect, positive and negative. Why should we believe that supercomputers, intelligent independently of us, would follow any different pattern? If the technological singularity does arrive, it will inevitably lead to conflict between humans and these artificial superintelligences.
Critics of Technological Singularity
Steven Pinker, a well-known author and thinker as well as a professor of psychology at Harvard University, is skeptical, stating,
There is not the slightest reason to believe in a coming singularity. The fact that you can visualize a future in your imagination is not evidence that it is likely or even possible. Look at domed cities, jet-pack commuting, underwater cities, mile-high buildings, and nuclear-powered automobiles--all staples of futuristic fantasies when I was a child that have never arrived. Sheer processing power is not a pixie dust that magically solves all your problems.
I am a skeptic. I don't believe this kind of thing is likely to happen, at least for a long time. And I don't know why I feel that way. The development of humans, what evolution has come up with, involves a lot more than just the intellectual capability. You can manipulate your fingers and other parts of your body. I don't see how machines are going to overcome that overall gap, to reach that level of complexity, even if we get them so they're intellectually more capable than humans.
It might happen someday, but I think life and intelligence are far more complex than the current singularitarians seem to believe, so I doubt it will happen in the next couple of centuries. [The ramifications] will be enormous, since the highest form of sentient beings on the planet will no longer be human. Perhaps these machines--our 'children'--will be vaguely like us and will have culture similar to ours, but most likely not. In that case, we humans may well go the way of the dinosaurs.
The singularity I'm interested in will come from biology rather than machines. We won't be building things; we'll be growing and cultivating them, and then they will grow on their own.
As the next level of organization kicks in, the current level is incapable of perceiving the new level, because that perception must take place at the new level. From within our emerging global culture, the coming phase shift to another level is real, but it will be imperceptible to us during the transition. Sure, things will speed up, but that will only hide the real change, which is a change in the rules of the game. Therefore we can expect in the next hundred years that life will appear to be ordinary and not discontinuous, certainly not cataclysmic, all the while something new gathers, until slowly we recognize that we have acquired the tools to perceive that new tools are present - and have been for a while.

He continues about the coming singularity,
In a thousand years from now, all the 11-dimensional charts at that time will show that "the singularity is near." Immortal beings and global consciousness and everything else we hope for in the future may be real and present but still, a linear-log curve in 3006 will show that a singularity approaches. The singularity is not a discrete event. It’s a continuum woven into the very warp of extropic systems. It is a traveling mirage that moves along with us, as life and the technium accelerate their evolution.
List of Singularity Leaders
Hugo de Garis
Raymond Kurzweil, American author, inventor and futurist