John von Neumann designed the basic architecture of all modern personal computers back in the 1940s and 1950s. Is it not time to move on?
Through engineering breakthroughs, this architecture has been enhanced and coaxed into producing faster and faster results, a testament to the genius of generations of electrical engineers. But there is a wall that blocks further order-of-magnitude speed increases. To get past it we will need a cognitive computer - one that can think like the human brain.
Who was John von Neumann? Any mathematician knows him well, but members of the general public may not. For them, we include a short video on his life made by an admirer. If you cannot see the embedded video, here is the link: http://youtu.be/kklb1J6Ij_U.
John von Neumann and his collaborators in the 1940s are responsible for what the computer is today: they laid out its basic structural design. A very concise explanation of the limits of the von Neumann architecture appears on the SyNAPSE project's own website:
The vast majority of current-generation computing devices are based on the Von Neumann architecture. This core architecture is wonderfully generic and multi-purpose, attributes which enabled the information age. The Von Neumann architecture comes with a deep, fundamental limit, however. A Von Neumann processor can execute an arbitrary sequence of instructions on arbitrary data, enabling reprogrammability, but the instructions and data must flow over a limited capacity bus connecting the processor and main memory. Thus, the processor cannot execute a program faster than it can fetch instructions and data from memory. This limit is known as the “Von Neumann bottleneck.”
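The bottleneck described above can be made concrete with a toy sketch. The machine below is hypothetical (our own invention for illustration, not any real instruction set): instructions and data share a single memory, and every fetch of either one crosses the same channel. Counting those transfers shows why the processor can never outrun its bus.

```python
# A toy stored-program machine (a hypothetical sketch, not a real ISA).
# Instructions and data live in ONE memory; every access, whether an
# instruction fetch or a data fetch, crosses the same "bus". The transfer
# counter makes the Von Neumann bottleneck visible: even a single
# addition pays one bus trip per instruction word and per data word.

memory = {
    # program: (opcode, operand_address)
    0: ("LOAD", 100),   # acc <- memory[100]
    1: ("ADD", 101),    # acc <- acc + memory[101]
    2: ("STORE", 102),  # memory[102] <- acc
    3: ("HALT", None),
    # data, stored in the same memory as the program
    100: 2,
    101: 40,
}

bus_transfers = 0

def fetch(address):
    """Every read, instruction or data, goes over the same bus."""
    global bus_transfers
    bus_transfers += 1
    return memory[address]

def store(address, value):
    global bus_transfers
    bus_transfers += 1
    memory[address] = value

pc, acc = 0, 0
while True:
    opcode, operand = fetch(pc)   # instruction fetch: one bus transfer
    pc += 1
    if opcode == "LOAD":
        acc = fetch(operand)      # data fetch: another bus transfer
    elif opcode == "ADD":
        acc += fetch(operand)
    elif opcode == "STORE":
        store(operand, acc)
    elif opcode == "HALT":
        break

print(memory[102])    # the sum, 42
print(bus_transfers)  # 7 bus transfers just to add two numbers
```

No matter how fast the processor loop runs, the result cannot arrive faster than those seven trips across the bus permit.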
Surely there must be a less primitive way of making big changes in the store than by pushing vast numbers of words back and forth through the von Neumann bottleneck. Not only is this tube a literal bottleneck for the data traffic of a problem, but, more importantly, it is an intellectual bottleneck that has kept us tied to word-at-a-time thinking instead of encouraging us to think in terms of the larger conceptual units of the task at hand. (John Backus, quoted in Wikipedia)

The classic paper that produced our modern computer design was published in 1946 by von Neumann, Arthur Burks and Herman Goldstine, titled Preliminary Discussion of the Logical Design of an Electronic Computing Instrument. We do not wish to get too technical, but if the reader will bear with us, we shall describe the basic architecture of a von Neumann computer. An excellent summary is found in a 1987 essay by H. Norton Riley of the Computer Science Department, California State Polytechnic University, titled The von Neumann Architecture of Computer Systems. We shall quote extensively from it:
To von Neumann, the key to building a general purpose device was in its ability to store not only its data and the intermediate results of computation, but also to store the instructions, or orders, that brought about the computation. In a special purpose machine the computational procedure could be part of the hardware. In a general purpose one the instructions must be as changeable as the numbers they acted upon. Therefore, why not encode the instructions into numeric form and store instructions and data in the same memory? This frequently is viewed as the principal contribution provided by von Neumann's insight into the nature of what a computer should be.

The genius of the von Neumann architecture back then is the weakness of today's computers. There are four basic weaknesses in the von Neumann architecture that inhibit modern computers from achieving greater speed and power. The first one is:
...the fact that instructions and data are distinguished only implicitly through usage. As he points out, the higher level languages currently used for programming make a clear distinction between the instructions and the data and have no provision for executing data or using instructions as data.

The second is, "...that the memory is a single memory, sequentially addressed." The third weakness "...is that the memory is one-dimensional."
...these are in conflict with our programming languages. Most of the resulting program, therefore, is generated to provide for the mapping of multidimensional data onto the one dimensioned memory and to contend with the placement of all of the data into the same memory.

As for the fourth weakness, the problem,
...is that the meaning of the data is not stored with it. In other words, it is not possible to tell by looking at a set of bits whether that set of bits represents an integer, a floating point number or a character string. In a higher level language, we associate such a meaning with the data, and expect a generic operation to take on a meaning determined by the meaning of its operands.

The implications of these weaknesses are significant.
One facet of this is the fundamental view of memory as a "word at a time" kind of device. A word is transferred from memory to the CPU or from the CPU to memory. All of the data, the names (locations) of the data, the operations to be performed on the data, must travel between memory and CPU a word at a time. Backus calls this the "von Neumann bottleneck." As he points out, this bottleneck is not only a physical limitation, but has served also as an "intellectual bottleneck" in limiting the way we think about computation and how to program it.
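Riley's fourth weakness, that the meaning of the data is not stored with it, is easy to demonstrate. In the short sketch below (using Python's standard struct module, with byte values we chose arbitrarily for illustration), the same four bytes are read as an integer, as a floating point number, and as a character string; nothing in the bits themselves says which reading is the "right" one.

```python
import struct

# Four raw bytes. In a von Neumann memory this is all that is stored:
# the bits carry no tag saying what kind of value they represent.
raw = b"\x42\x48\x65\x79"

as_int = struct.unpack("<i", raw)[0]    # read as a little-endian 32-bit integer
as_float = struct.unpack("<f", raw)[0]  # read as an IEEE-754 single-precision float
as_text = raw.decode("ascii")           # read as four ASCII characters ("BHey")

print(as_int)
print(as_float)
print(as_text)
```

The "meaning" lives entirely in the program that interprets the bytes, which is exactly why a higher level language must track types on the programmer's behalf.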
In our next installment of this series we shall explain how the new cognitive computers surpass von Neumann machines.