Artificial neural networks are mathematical models that are a greatly simplified version of how neural networks operate in the biological brain. However, today’s hardware is very inefficient at simulating neural network models. The reason for this inefficiency is the fundamental difference between how the biological brain and today’s digital computers function. While digital computers work with 0s and 1s, the synaptic values (weights) that the brain uses to store information can take any value within a continuous range; in other words, the brain works on an analog principle. More importantly, in a computer the number of signals that can be processed at one time is limited by the number of CPU cores — typically 8 to 12 cores on a desktop configuration today, and anywhere from thousands to millions on a supercomputer. While millions of cores may sound like many, it is still a pitifully low number compared to the brain, which simultaneously processes signals on the order of trillions (1,000,000,000,000) in a massively parallel manner.
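The contrast above can be made concrete with a minimal sketch of a single artificial neuron. This is an illustrative example, not any particular library's implementation: the "synaptic weights" are ordinary floating-point numbers (a digital approximation of the brain's continuous analog synaptic strengths), and the weighted sum runs as a sequential loop on one CPU core, unlike the brain's massively parallel signal processing.

```python
import math

def neuron(inputs, weights, bias):
    # The weighted sum below runs sequentially, one multiply-add at a
    # time on a single core -- a biological neuron would receive all of
    # its synaptic inputs simultaneously, in parallel.
    total = sum(w * x for w, x in zip(weights, inputs)) + bias
    # Sigmoid activation squashes the analog-valued sum into (0, 1).
    return 1.0 / (1.0 + math.exp(-total))

# Hypothetical inputs and weights, chosen only for illustration.
print(neuron([0.5, -1.0, 2.0], [0.8, 0.2, 0.1], 0.0))  # ≈ 0.5987
```

Even this toy example hints at the scaling problem: simulating trillions of such weighted connections means trillions of sequential floating-point operations per pass, which is exactly the workload that conventional CPU architectures handle poorly.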