Cognitive computing is the buzzword of the moment, but it has a problem lurking in the near future, and that problem is Moore's law.
Moore's law describes how many transistors fit onto a microchip and the rate at which that number doubles over time, roughly every two years. Today's processors are already approaching the physical limits behind that trend. What that means is that current manufacturing and design techniques will soon cap the complexity of microprocessors relative to the energy they consume. As devices cram more transistors into a package, heat and power become a problem. Modern laptops are a good example of how heat build-up begins to cause trouble even under moderate use.
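To make the arithmetic concrete, here is a rough Python sketch of the doubling curve Moore's law describes. The 1971 Intel 4004 baseline of roughly 2,300 transistors and the two-year doubling period are standard reference figures, not numbers taken from this article; the point is simply that the curve is exponential, which is exactly why heat and power eventually bite.

```python
# Illustrative only: Moore's law as a doubling curve.
# Assumes a 2-year doubling period and a 2,300-transistor
# baseline (the Intel 4004, 1971) -- commonly cited figures,
# not values from this article.

BASELINE_YEAR = 1971
BASELINE_TRANSISTORS = 2_300
DOUBLING_PERIOD_YEARS = 2

def moores_law_estimate(year: int) -> float:
    """Projected transistor count if doubling continued unchecked."""
    elapsed = year - BASELINE_YEAR
    return BASELINE_TRANSISTORS * 2 ** (elapsed / DOUBLING_PERIOD_YEARS)

for year in (1971, 1991, 2011, 2031):
    print(f"{year}: ~{moores_law_estimate(year):,.0f} transistors")
```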
Hardly a day goes by without one of the big players claiming that its "Artificial Intelligence" is getting better. In reality, this means it is also getting bigger: bigger to accommodate the data storage required to process more computations, and bigger to accommodate the physical hardware required to scale up the processors that handle the increased number of computations per second.
Cognitive computing is the artificial intelligence equivalent of using more hammers to crack more nuts as fast as possible, at the expense of either accuracy or energy. It relies on massive data sets, which it generally has to learn in advance, to solve problems in real time by performing millions of separate analytical abstractions. It is like having to watch every window in your house for an intruder, running from room to room as fast as you can. No matter how fast you run, you can never be in two rooms at once, so you hope the time between rooms is shorter than the time an intruder needs to climb in a window. Not that smart, huh? What we have learnt here is that cognitive computing, as useful as it obviously is, has some serious drawbacks.
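To make the window analogy concrete, here is a hypothetical toy in Python contrasting the two strategies: a polling sweep that visits every window on every cycle, and an event-driven queue where windows report themselves. The Window class, on_breach handler, and event queue are all invented for illustration, not drawn from any real system.

```python
# A toy contrast between the "run from room to room" strategy
# the analogy describes and an event-driven alternative.
# Everything here is hypothetical, invented for illustration.

import queue
import random

class Window:
    def __init__(self, room: str):
        self.room = room
        self.breached = False

# Polling: one watcher visits every window on every cycle,
# paying the full cost whether or not anything has happened.
def poll_all(windows: list[Window]) -> list[str]:
    return [w.room for w in windows if w.breached]

# Event-driven: each window reports itself, so the watcher only
# does work proportional to the number of actual intrusions.
events: "queue.Queue[str]" = queue.Queue()

def on_breach(window: Window) -> None:
    window.breached = True
    events.put(window.room)

windows = [Window(f"room-{i}") for i in range(8)]
on_breach(random.choice(windows))

print("polling sweep found:", poll_all(windows))
print("event queue delivered:", events.get())
```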
When a human processes memory, apart from the obvious generation of heat, what we notice is that the brain does not keep growing to store more information, and that is a big clue to alternative approaches to computing. We also know that, by and large, we are pretty damn good at remembering and recalling most of the significant things we have seen. If anything, the brain does the opposite of growing: it often reconfigures or "prunes" itself, a sort of defragmentation if you will.
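For a machine analogue of that pruning, here is a minimal sketch of magnitude-based weight pruning, a common technique in neural networks (offered as an illustration, not something this article prescribes): the smallest weights are zeroed so capacity is reclaimed rather than the network simply growing. The keep fraction and matrix size are arbitrary.

```python
# A minimal sketch of magnitude-based pruning: weights below a
# quantile cutoff are zeroed rather than stored, loosely echoing
# the brain's "pruning". Threshold and sizes are arbitrary.

import numpy as np

rng = np.random.default_rng(0)
weights = rng.normal(size=(4, 4))

def prune(w: np.ndarray, keep_fraction: float = 0.5) -> np.ndarray:
    """Zero out the smallest-magnitude weights, keeping keep_fraction."""
    cutoff = np.quantile(np.abs(w), 1.0 - keep_fraction)
    return np.where(np.abs(w) >= cutoff, w, 0.0)

pruned = prune(weights)
print("nonzero before:", np.count_nonzero(weights))
print("nonzero after: ", np.count_nonzero(pruned))
```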
One thing that sets human life apart from many other constructions is our seemingly good ability to make choices by free will. In my publication "The Third State" I put forward the idea that memory is demonstrably not an accurate process; in fact, to satisfy the observed nature of the brain, it cannot be "photographic" or anything approaching it. This ability to choose and to make mistakes is, I believe, the very essence of life, and it must be replicated and integrated with current cognitive computing if machines are to develop instincts independently of their program. The quest to build "Artificial Life" that can be integrated into existing or future computing systems is now my primary focus.
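As a purely illustrative toy, and emphatically not the model from "The Third State", the following sketch shows what a deliberately inaccurate memory might look like in code: recall returns a trace in which each bit can flip, so two recalls of the same memory need not agree, and neither need match the original.

```python
# A toy probabilistic memory: recall is deliberately lossy,
# returning a corrupted trace with some probability instead of a
# perfect copy. An illustration of the premise that memory is not
# "photographic"; it is NOT the model from "The Third State".

import random

class ProbabilisticMemory:
    def __init__(self, recall_fidelity: float = 0.9):
        self.recall_fidelity = recall_fidelity  # chance each bit survives
        self.traces: dict[str, list[int]] = {}

    def store(self, key: str, pattern: list[int]) -> None:
        self.traces[key] = list(pattern)

    def recall(self, key: str) -> list[int]:
        """Each bit independently flips with probability 1 - fidelity."""
        return [
            bit if random.random() < self.recall_fidelity else 1 - bit
            for bit in self.traces[key]
        ]

memory = ProbabilisticMemory(recall_fidelity=0.9)
memory.store("scene", [1, 0, 1, 1, 0, 0, 1, 0])
print(memory.recall("scene"))  # usually close to the original, rarely exact
```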
Scot has over 25 years' experience developing and deploying bespoke machine learning technology within commercial and industrial software and hardware systems. His research field is simulated quantum tunnelling and probabilistic memory models for the implementation of high-performance Quantum Neural Networks and extreme data compression techniques.