Yavuz Demirci
The singularity describes the merging of human and computer intelligence and the resulting rise of super-intelligence. Proponents posit the singularity as the next step in human progression: humans will cease to exist as currently constituted and will instead transcend our given form to become a hybrid race, part computer and part human. The singularity has been portrayed in popular culture in several movies, most famously the Terminator and Matrix franchises.
History of discussion about singularity
Vernor Vinge, a science fiction writer, first wrote about the vision of a technological singularity and coined the term in 1993. He wrote, "Within thirty years, we will have the technological means to create superhuman intelligence. Shortly after, the human era will be ended."
Ray Kurzweil, inventor and futurist, is a fervent proponent of the technological singularity. Kurzweil predicts the timeline of the singularity as follows:
- By 2019, a $1,000 PC will have the computing power of the human brain, capable of performing 20 million billion calculations per second.
- By 2029, a $1,000 PC will be a thousand times more powerful than the human brain; the human brain itself will have been successfully reverse engineered.
- 2045 is the singularity: machines will have surpassed humans in intelligence and will have created next-generation robots even smarter than themselves. We should either merge with our creations or step out of their way. Immortality!
- By 2055, $1,000 of computing power will equal the processing power of all the humans on the planet.
 
In 2011, Ray Kurzweil sponsored a documentary about the singularity, titled "Transcendent Man," which has been screened in five major cities in the U.S., as well as London. In December 2012, Kurzweil was hired by Google as a director of engineering to "work on new projects involving machine learning and language processing."
In 2000, Bill Joy, a well-known computer scientist, a primary figure behind the BSD operating system (on which macOS is built), and a key contributor to the widely used Java programming language, joined the discussion. In a Wired magazine article, "Why the future doesn't need us," Joy declared, in what some have described as a "neo-Luddite" position, that he was convinced that growing advances in genetic engineering and nanotechnology would pose severe risks to humanity.
Arguments and counterarguments about the feasibility of singularity
Proponents of the singularity often cite Moore's law to support their claim. Moore's law states, crudely, that the capacity of computer chips doubles every two years; that is, the speed and capability of computers grow exponentially. Such exponential growth is a powerful enabler. Consider the doubling series 1, 2, 4, 8, 16, 32, ... The small increments at the beginning may be misleading about the overall speed of the series' growth: after twenty doublings the value already exceeds one million, and the 266th element is about 10^80, comparable to the estimated number of atoms in the observable universe.
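The doubling arithmetic can be checked directly. A minimal sketch, assuming the series is indexed from element 1 = 2^0:

```python
def element(n: int) -> int:
    """n-th element of the doubling series 1, 2, 4, 8, ... (element 1 is 2**0)."""
    return 2 ** (n - 1)

print(element(21))             # 1048576 -- the series passes one million here
print(len(str(element(266))))  # 80 digits, i.e. on the order of 10**80
```

Twenty doublings give a factor of about a million; 265 doublings give a number with 80 digits, which is where the comparison to the atoms in the universe comes from.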
Proponents of the singularity argue that, thanks to this exponential growth, the processing power of computers will reach such high levels in the next few decades that it will be possible to simulate the human brain in high fidelity. The workings of each neuron in the brain will be simulated in real time, achieving a full simulation of the brain. At that point, the computer will essentially have the equivalent of human intelligence. In the succeeding years, with further increases in capacity, computer intelligence will pull several times ahead of human intelligence.
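To get a feel for the scale such a simulation implies, here is a back-of-envelope estimate. The figures below are commonly cited rough estimates and an assumed update rate, not numbers from this text:

```python
# Rough cost of a real-time, neuron-level brain simulation (all figures are
# assumptions: ~8.6e10 neurons, ~1e4 synapses per neuron, and ~1e3 synaptic
# updates per second of simulated time).
NEURONS = 8.6e10
SYNAPSES_PER_NEURON = 1e4
UPDATES_PER_SECOND = 1e3

ops_per_second = NEURONS * SYNAPSES_PER_NEURON * UPDATES_PER_SECOND
print(f"{ops_per_second:.1e} synaptic updates per second")  # 8.6e+17
```

Under these assumptions a real-time simulation needs on the order of 10^17 to 10^18 operations per second, which is why proponents tie the feasibility of brain simulation to continued exponential growth in computing.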
Opponents of the feasibility of the singularity counter that exponential growth is hard to sustain. Exponential growth is seen at the beginning of a series, but due to limitations and adversities, most such series level off and stay constant. An example is a population of rabbits: initially the increase is exponential, but due to scarcity of food sources and an abundance of predators, the population stabilizes around a constant. Opponents therefore argue that the exponential progress of computer processing speeds will similarly hit a brick wall. At the chip level, physical issues such as heat dissipation will make exponential speedup unsustainable. At the cluster level, latency, consistency, and scalability issues will also prevent exponential growth.
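The rabbit analogy is the classic logistic growth model: growth looks exponential while resources are plentiful, then levels off at a carrying capacity. A minimal sketch, with purely illustrative parameter values:

```python
# Discrete logistic growth: p' = p + r*p*(1 - p/K). While p << K the growth
# factor is roughly (1 + r), i.e. exponential; as p approaches the carrying
# capacity K, growth stalls. All parameter values below are illustrative.
def logistic_steps(p0: float, r: float, K: float, steps: int) -> list[float]:
    pop = [p0]
    for _ in range(steps):
        p = pop[-1]
        pop.append(p + r * p * (1 - p / K))
    return pop

pops = logistic_steps(p0=10, r=0.5, K=1000, steps=30)
# Early ratios are close to 1.5 (exponential); later values crowd up against K.
print(pops[1] / pops[0], pops[-1])
```

The same curve shape is what opponents predict for chip performance: an early exponential phase followed by a plateau imposed by physical limits.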
Underlying all of Kurzweil's ideas regarding the progress of technology and the singularity is the Law of Accelerating Returns. This law states that technological progress occurs exponentially rather than linearly: each new advancement enables several further advancements instead of just one, so every year brings more useful inventions and discoveries than the year before. First-generation artificial intelligence (AI) approaches failed, but simulating a human brain may work if we learn the workings of the brain in excruciating detail. As a promising development, "deep learning" and "deep neural network" technologies have recently achieved great success in image and speech recognition tasks.
However, opponents of the singularity like to point out that the workings of the brain as a whole are still a big mystery. We have information about the rough mechanism of how a neuron works: an excited neuron can transmit a signal to a neighboring neuron through its synapses. But there is no clear explanation of how thought arises from this process. Brain-scanning techniques are improving, since they build on the same advances in computing, but the brain may throw us more complex surprises as we learn more about it.
In fact, much of the brain's power comes from organic materials and the very low-level analog physical interactions among them. These physical phenomena could be close to impossible to model or simulate in a digital environment. Henry Markram, lead researcher of the "Blue Brain Project" for simulating mammalian brains at the molecular level, has stated that "it is not [their] goal to build an intelligent neural network." He claimed, "[That would] be very difficult because, in the brain, every molecule is a powerful computer and we would need to simulate the structure and function of trillions upon trillions of molecules as well as all the rules that govern how they interact. You would literally need computers that are trillions of times bigger and faster than anything existing today."
Another relevant question is whether we can develop the parallel processing architectures needed to support the kind of massively parallel processing that goes on in the brain, which is far more parallel than most classical computing designs.
Even if a computer successfully simulates the human brain, whether such a design will be "scalable" to two, ten, or even one hundred times the brain's normal power is unknown, for the human brain's computational power may be inherently unscalable. Moreover, if a computer models the human brain, human emotions would also be modeled. Would the resulting computer be stable? As it scales up, would it become existentially despairing and suicidal, or perhaps an arrogant killer?
The aftermath of singularity
Several questions arise about the aftermath of the singularity. Can a downloaded personality replace the spirit? How does this equate to living forever? The singularity's promise is similar to claiming that you can live forever by cloning yourself: one copy dies, but another, digital copy survives. But it is clear that the copies are different entities.
And it is also clear that this is not true immortality. If we stretch the singularity's approach to immortality a little further, we can argue that humans already achieve immortality through their work or art. To this idea Woody Allen provided the best response: "I don't want to achieve immortality through my work. I want to achieve it by not dying."
References
- Vernor Vinge, "The Coming Technological Singularity: How to Survive in the Post-Human Era," 1993. Available from https://www-rohan.sdsu.edu/faculty/vinge/misc/singularity.html
- Ray Kurzweil, "The Singularity Is Near: When Humans Transcend Biology," Penguin Books, 2005.
- Bill Joy, "Why the Future Doesn't Need Us," Wired 8(04), 2000.
- Henry Markram, "The Blue Brain Project," Nature Reviews Neuroscience 7(2), 153-160, 2006.
 
 
