There is a process to all things: every action can be broken down into a series of tiny steps one must take to get from a state of incomplete to complete. Brushing your teeth involves hundreds of these – walking to the sink means repeatedly putting one foot in front of the other, then opening the cabinet, raising your arm, opening your fingers, moving your hand over the brush, closing your fingers around it, and so on until one's teeth are free of the irksome remnants of lunch everyone was too shy to tell you about that day.

Breaking down this process into emulatable pieces is essential to learning, as any computer (neural or mechanical) requires each step of the process to be codified in order to convey information and turn individual data units ("raise hand toward brush") into useful information that informs a process. But the two processing-and-thinking models most present in our lives – your brain and your standard computer – have fundamentally different designs. Computers dissect programs into binary, simple step-by-step instructions, and once processes are coded in, computers can perform them with enormous speed and efficiency. Computers also have the benefit of holding processing power, coded into algorithms, and data separately, but so far they are limited to logical and mathematical processing. This stems from the architecture of a computer's design, where binary code is used to process data that is more or less "pre-digested" and fed to the machine. Quantum computing – the capacity to process data in multiple states simultaneously – is emerging, but it is not yet commonplace enough to be found in your day-to-day life (unless, of course, you work in the field or build these systems for a living).
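That codification requirement can be made concrete with a small sketch. The step names below are illustrative, not taken from any real system; the point is only that a machine executes exactly what is spelled out, in order, and nothing more:

```python
def run_process(steps):
    """Execute each codified step in sequence and return a log of what ran."""
    log = []
    for step in steps:
        # A computer performs only the steps it has been given, one by one;
        # any step left uncodified simply never happens.
        log.append(f"executed: {step}")
    return log

# Toothbrushing, reduced to an explicit (and heavily abbreviated) instruction list.
brush_teeth = [
    "walk to sink",
    "open cabinet",
    "raise hand toward brush",
    "close fingers over brush",
    "apply toothpaste",
    "brush",
]

for line in run_process(brush_teeth):
    print(line)
```

The speed and efficiency the text mentions come precisely from this rigidity: each instruction is unambiguous, so the machine never has to interpret.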

Brains, on the other hand, have the integrated benefit of a natural neural network. If machines are limited to the knowledge level of Bloom's Taxonomy (a set of hierarchical models used to classify learning by level of complexity), with a simple "remember and recall" aim, then the human brain can reach the levels of knowledge, comprehension, application, synthesis and evaluation of new information, allowing the brain to codify new information, independently understand its value in context, and apply it to new models in creative and unique ways. Brains use content-addressable memory, meaning they form patterns and networks within themselves that connect data into information automatically, and they compute in parallel, whereas a modern machine is inherently modular. The architecture is also more layered and complex – short-term memory and RAM may appear similar at first, since both require power to process immediately, but RAM holds data separately from what is already codified, whereas short-term memory is interlayered with connections to long-term memory, allowing new neural pathways and networks to form as data is processed. Brains also hold no distinction between hardware and software – the mind emerges from the brain, and any change to one is accompanied by changes to the other.
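The contrast between content-addressable memory and RAM can be sketched in a few lines. This is a toy model under a simplifying assumption – each "memory" is just a set of features – but it shows the essential difference: retrieval by partial content rather than by exact address.

```python
# Hypothetical stored "memories", each represented as a set of features.
memories = {
    "brushing teeth": {"sink", "toothbrush", "mirror", "morning"},
    "making coffee": {"kettle", "mug", "morning", "kitchen"},
    "riding a bike": {"helmet", "pedals", "street"},
}

def recall(cue):
    """Content-addressable retrieval: a partial cue recalls the stored
    memory whose features overlap it most, no address needed."""
    return max(memories, key=lambda name: len(memories[name] & cue))

# An incomplete cue still retrieves the full associated pattern.
print(recall({"toothbrush", "morning"}))  # → brushing teeth
```

RAM, by contrast, would require the exact key ("brushing teeth") to retrieve anything at all; the associative lookup here is what lets a fragment of an experience pull up the whole.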

Modern AI encounters these differences regularly – systems and devices are being designed to act intelligently using a computer architecture designed to process data, not to comprehend or apply it in novel ways. One solution currently being developed is the neural network: a computer system designed to make connections based on probability. Neural networks have an embedded feedback loop to sense whether the decisions they make – predictions of connections with a high probability of correlation – are accurate or not; from this feedback, a network can modify its approach in the future by weighting certain characteristics more than others to find a "correct" answer. Neural networks have shown the capacity to model complex non-linear relationships, including language modelling (which includes recognition of human speech).
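The feedback loop described above can be sketched with the smallest possible neural network, a single artificial neuron (a perceptron). This is a minimal illustration, not any particular production system: the neuron predicts, compares its prediction against the correct answer, and uses the error to adjust how heavily it weights each input characteristic.

```python
def train(samples, epochs=20, lr=0.1):
    """Train a single neuron on (inputs, target) pairs via error feedback."""
    weights = [0.0, 0.0]
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            # Predict: weighted sum of inputs passed through a step function.
            total = sum(w * x for w, x in zip(weights, inputs)) + bias
            prediction = 1 if total > 0 else 0
            # Feedback: the error signal nudges each weight up or down,
            # so future predictions weight some characteristics more than others.
            error = target - prediction
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Learn the logical AND relationship purely from labeled examples.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
weights, bias = train(data)
```

Real networks stack many such units in layers and use gradient-based updates, which is what lets them capture the complex non-linear relationships the text mentions; the correct-by-feedback loop, however, is the same in spirit.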

Where the greatest limitation for computers lies may be in the parallel elements of Bloom's Taxonomy: computers may seek to replicate the cognitive domain, but the emotive and action-based learning opportunities open to humans pose other unique challenges. Artificial Intelligence (AI) and Machine Learning (ML) – AI being the concept of creating machines that can carry out tasks in ways we would consider "smart", and ML being a field of AI concerned with how machines can learn from data – may struggle to reach the levels of characterizing information within an independent schema and of adapting skills or processes to unfamiliar environments, two key elements of human learning by which intelligence is measured. Creativity is another key indicator of intelligence, and one machines may need to demonstrate if they are to be viewed as truly "intelligent" beyond the operation of processes designed specifically for them.

Learning is made even more complex by how much we simply do not know about the human brain. Scientists do not yet understand how information is encoded or transferred from cell to cell within the network of our neurons. So far, no common neural "code" – a language used to aggregate data to inform processes – has been found, and little real understanding exists of how the encoding of information differs between parts of the brain itself. Some of this difficulty lies with the language we use to describe computers: we view them as modular objects, so "processing" and "storage" are thought of separately. But much as we cannot imagine a new color or conceptualize what non-carbon lifeforms on other planets would look like, we may not comprehend a system if we break it down into components that are not actually individual within the system we examine, or are inherently not what we understand them to be. So despite the comforting notion that there is a process to everything, what our minds and our brains define as a process may differ – which makes slightly eerie the thought that the brain was responsible for designing the computer in the first place, and that we may not be wired at all like what we can imagine wiring to be. Best not to consider it while brushing one's teeth.

