Real Neural Networks Will Replace Artificial Neural Networks
For Immediate Release: April 27, 2022
The design intent, or purpose, of an Artificial Neural Network (ANN) is to simulate (or imitate) the computational and information processing capabilities of a Real Neural Network (RNN). The key difference is that an RNN consists of, and is powered by, actual living neurons connected so that they form and operate as a neural network – an example being the human brain – whereas an ANN consists of, and is powered by, silicon chips designed to simulate the behavior and capabilities of real neurons. No matter how sophisticated an ANN might be, it has inherent physical and operational limitations relative to an RNN of comparable size and design.
With the forthcoming availability of a family of Tissue Operating Devices (TOD™) powered by millions of living neurons in multiple parallel sets of RNNs, BCM Industries (BCM) is poised to revolutionize the world of computing, data processing, information management, big data, deep learning, natural language processing, and augmented intelligence (AI) – replacing attempts to simulate human intelligence. So-called "artificial" intelligence will rapidly be replaced by real neural intelligence, augmenting, enhancing, and extending human intelligence in a wide variety of technical applications. The TOD™-provided RNNs offer massively expandable processing power and speed, improved accuracy of results, and major time and cost benefits.
Artificial Neural Networks
ANNs are integrated hardware and software systems which attempt to mimic the activity of living neural structures such as brains. The term ANN is used interchangeably with terms such as deep learning, neural networks, and artificial intelligence. As illustrated in the Figure, ANNs include processing nodes, layers, and controlled data flow processes.
ANNs require large numbers of digital processors functioning in parallel, architecturally organized into layers and nodes. The first layer (or tier) of processors receives the raw input data. Each following layer receives output from the preceding layer as its input. The last layer in the ANN system generates the final output, or results, to the user or application program.
The ANN architecture is restrictive by design. Each node within a layer in the processing chain can only ‘know’ or act upon the limited information within its sphere. An individual node has limited access rights, and input-output data communications defined by specific relationships with other nodes. No node within the system has access rights to the overall ANN system, or its processing mission – the set of tasks to be performed, as requested and driven by the user or application program.
ANN processes require each node to make processing decisions based on inputs from previous layers, preset processing rules, input and results weighting tables, and other user-established rules. These rules determine where and what is transmitted (passed on) as nodal output data. Some users apply fuzzy logic, evolutionary algorithms, and gradient-based training within their nodal rules to enhance ANN performance.
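The nodal decision process described above can be sketched as a weighted combination of inputs followed by a user-established rule. The weights, threshold, and pass-on rule below are illustrative assumptions, not BCM specifics.

```python
# Toy sketch of one ANN processing node: it combines inputs from the
# preceding layer using user-established weights, then applies a preset
# rule (here a simple threshold) to decide what is passed on as output.
# All numbers are illustrative assumptions.

def node_output(inputs, weights, threshold=0.5):
    """Weighted sum of inputs from the preceding layer,
    passed on only if it clears the user-defined threshold."""
    total = sum(x * w for x, w in zip(inputs, weights))
    return total if total >= threshold else 0.0

# Inputs from three nodes in the previous layer, with preset weights:
result = node_output([0.2, 0.9, 0.4], [0.5, 0.3, 0.1])
print(result)  # weighted sum is about 0.41, below threshold, so 0.0
```

A gradient-based training scheme, as mentioned above, would adjust the weights automatically rather than leaving them fixed; this sketch shows only the forward decision step.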
All ANN processing programs require users to make assumptions and define rules that drive the decision logic for each of the thousands of nodes. These pre-set assumptions and decision rules automatically introduce biases into the resulting ANN processes. Furthermore, these biases tend to be compounded with increasing complexity of the processing tasks and algorithms.
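The point about built-in bias can be made concrete with a toy illustration: identical input data yields different outcomes purely because of the decision rule a user happened to preset. The thresholds and values here are hypothetical.

```python
# Toy illustration of preset-rule bias: the same nodal input is
# classified differently under two different user-preset thresholds,
# showing how pre-set assumptions shape ANN results.
# All values are hypothetical.

def classify(signal, threshold):
    return "pass on" if signal >= threshold else "suppress"

signal = 0.6                      # identical input data in both cases
print(classify(signal, 0.5))      # designer A's preset rule: "pass on"
print(classify(signal, 0.7))      # designer B's preset rule: "suppress"
```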
Operations: Hundreds to thousands of user pre-defined software processing nodes operating in user pre-defined processing layers.
Processing Layers: There is one input and one output layer, and as many sequential additional layers as pre-established by the user’s system design. Layers are populated with software processing nodes.
Processing Node: User defined software processing tasks with pre-established logic and algorithms that accept inputs from nodes in preceding layers and pass results as outputs to nodes in the next pre-set layer, or in the final stage, to the user.
Inputs: User defined data inputs delivered to one or more user defined nodes in the input layer.
Outputs: ANN produced output data is sent from one or more user defined nodes in the output layer to the user.
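The five components listed above can be tied together in a minimal sketch: user-defined inputs enter an input layer, flow through a hidden layer of processing nodes, and exit through an output layer to the user. The network shape, weights, and data are invented for illustration.

```python
# Minimal feed-forward sketch of the ANN structure listed above:
# an input layer, one user-defined hidden layer, and an output layer,
# each layer populated with software processing nodes.
# Shapes, weights, and inputs are invented for illustration.

def layer_forward(inputs, weight_rows):
    """Each row of weights defines one node in the layer; every node
    combines all outputs of the preceding layer into one value."""
    return [sum(x * w for x, w in zip(inputs, row)) for row in weight_rows]

# Inputs: user defined data delivered to the input layer.
raw_inputs = [1.0, 0.5]

# Processing layers: pre-established by the user's system design
# (two hidden nodes, one output node).
hidden_weights = [[0.6, 0.4], [0.1, 0.9]]
output_weights = [[0.5, 0.5]]

hidden = layer_forward(raw_inputs, hidden_weights)  # hidden layer results
final = layer_forward(hidden, output_weights)       # Outputs: sent to the user
print(final)
```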
Real Neural Networks
Alternatively, as depicted in the Figure, RNNs are living neural systems, constructed of millions of individual neurons, each having the built-in natural instinct to connect and communicate with millions of other neurons, to address specific tasks and solve problems.
Consisting of actual living neurons, RNNs are designed by nature to access the Assembled Knowledge (AK) existing within their network, and to apply that developed and evolving AK to perform consolidated Adaptive Thinking (AT). The unification of these two functions – AK and AT – inexorably results in intelligence, intuitive neural intelligence, and all that real intelligence is capable of – learning, imagining, creativity, dreaming, and profound insight and enlightenment.
Operations: Millions of naturally networked living neurons performing RNNs processing tasks.
No Processing Layers: There are no layers or phasing of processing tasks within RNNs. Each and every neuron is able to support and engage in all tasks and address all data inputs and outputs simultaneously. All neurons in an RNN will naturally perform all tasks in parallel, as a single united, comprehensive, and holistic processing entity. No specific user-provided processing instructions are required. Following user-provided training of the RNN, only general operational guidance is required from the user or application program.
No Processing Nodes: RNNs have no processing nodes. However, one could logically equate a single neuron to a universal processing node that immediately addresses all input and output processing.
Inputs: Each and every sensor neuron within the RNN will naturally receive sensory data inputs as unprocessed analog signals (not digital) from sensor ports, such as optical, infrared, RF, audio, and much more, without user direction. Neurons are naturally driven to join other neurons to form RNNs. Processor neurons address digital inputs which are received through one or many neural control rods, or from one or more authorized sensor ports, or from other individual neurons in the neural network.
Outputs: Each output from an RNN is delivered by and through neural control rods, which read and interpret network pulse actions, and deliver the resulting outputs to the required destination.
For additional information on TOD™ design, ANN, RNN, Neuron Learning, Assembled Live Neuron Knowledge, Adaptive Thinking, Intuitive Neuron Intelligence, and related subjects, visit the BCM Industries website or contact a BCM representative.