Neuromorphic computers: spiking architecture
ML - The way the world works - analyzing how things work - A podcast by David Nishimoto
Like the brain, spiking networks operate at very low power density and very low spiking frequency. In neuromorphic hardware, computation happens in memory, at the synapse itself, rather than shuttling data to a separate processor the way conventional deep learning hardware does.

Learning takes place in the synapse through spike-timing-dependent plasticity (STDP): the timing between spikes is what drives learning. A synapse connects a pre-synaptic neuron to a post-synaptic neuron, and the input spikes from both neurons affect the synapse's weight. The delay between the spikes determines how the synapse learns: when the pre-synaptic spike arrives before the post-synaptic spike (positive pre-to-post timing), the synapse is excited and strengthened; when the order is reversed (negative timing), the synapse is depressed.

A feedforward network builds a perceptron: multiple synapses converge on a single post-synaptic neuron. The perceptron is self-learning; the artificial neurons learn much as biological neurons do, and a neuron can memorize objects.

Recurrent networks represent feedback systems. In a recurrent network all neurons talk with each other, through both inhibitory and excitatory synapses, and Hebbian learning occurs: neurons that fire together wire together.

Spatiotemporal networks allow for sequence learning and recognition; they learn sequences as they unfold over time. Potentiation is "the increase in strength of nerve impulses along pathways which have been used previously, either short-term or long-term". Simple code sketches of each of these mechanisms follow.
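To make the STDP rule concrete, here is a minimal Python sketch of pair-based STDP with an exponential timing window. The constants A_PLUS, A_MINUS, and TAU are illustrative choices, not values from the episode.

```python
# A minimal sketch of pair-based STDP with an exponential window.
# A_PLUS, A_MINUS, and TAU are illustrative constants.
import math

A_PLUS = 0.01    # maximum potentiation step
A_MINUS = 0.012  # maximum depression step
TAU = 20.0       # time constant of the STDP window, in ms

def stdp_delta_w(t_pre: float, t_post: float) -> float:
    """Weight change for one pre/post spike pair.

    Positive dt (pre fires before post) potentiates the synapse;
    negative dt (post fires before pre) depresses it.
    """
    dt = t_post - t_pre
    if dt > 0:
        return A_PLUS * math.exp(-dt / TAU)    # potentiation
    elif dt < 0:
        return -A_MINUS * math.exp(dt / TAU)   # depression
    return 0.0

# Pre spike at 10 ms, post spike at 15 ms -> potentiation
print(stdp_delta_w(10.0, 15.0))   # small positive weight change
# Post spike at 10 ms, pre spike at 15 ms -> depression
print(stdp_delta_w(15.0, 10.0))   # small negative weight change
```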
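Next, a sketch of the feedforward "perceptron" arrangement, assuming a simple leaky integrate-and-fire post neuron; the threshold and leak constants are made up for illustration.

```python
# A minimal sketch of a feedforward spiking "perceptron": several
# pre-synaptic spike trains converge through weighted synapses onto one
# leaky integrate-and-fire post neuron. All constants are illustrative.

THRESHOLD = 1.0   # membrane potential at which the post neuron spikes
LEAK = 0.9        # per-step decay of the membrane potential

def run_post_neuron(weights, spike_trains):
    """weights[i] scales synapse i; spike_trains[i][t] is 1 if pre neuron i
    spiked at step t. Returns the post neuron's output spike train."""
    steps = len(spike_trains[0])
    v = 0.0
    out = []
    for t in range(steps):
        v *= LEAK                                   # leak
        v += sum(w * train[t] for w, train in zip(weights, spike_trains))
        if v >= THRESHOLD:                          # fire and reset
            out.append(1)
            v = 0.0
        else:
            out.append(0)
    return out

# Three pre neurons; the strongly weighted ones drive the post neuron.
weights = [0.6, 0.5, 0.1]
trains = [[1, 0, 1, 1], [1, 1, 0, 1], [0, 1, 0, 0]]
print(run_post_neuron(weights, trains))  # [1, 0, 1, 1]
```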
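A sketch of Hebbian learning ("fire together, wire together") in a small fully connected recurrent network follows; positive weights stand in for excitatory synapses and negative weights for inhibitory ones, and the learning rate is illustrative.

```python
# A minimal sketch of Hebbian learning in a fully connected recurrent
# network: co-active neuron pairs have their connections strengthened.
# The learning rate and starting weights are illustrative.

LEARNING_RATE = 0.05

def hebbian_update(weights, activity):
    """weights[i][j] connects neuron j to neuron i; activity[i] is 1 if
    neuron i fired on this step. Co-active pairs are strengthened."""
    n = len(activity)
    for i in range(n):
        for j in range(n):
            if i != j and activity[i] and activity[j]:
                weights[i][j] += LEARNING_RATE
    return weights

# Three mutually connected neurons; neurons 0 and 1 fire together, so the
# synapses between them strengthen while the others are unchanged.
w = [[0.0, 0.1, -0.2],
     [0.1, 0.0, -0.2],   # negative entries act as inhibitory synapses
     [0.3, 0.3, 0.0]]
w = hebbian_update(w, [1, 1, 0])
print(w[0][1], w[1][0])  # 0.15 0.15
```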
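Finally, a sketch of one way a spatiotemporal network can recognize a sequence, assuming per-synapse conduction delays tuned so that spikes arriving in the right order coincide at the post neuron; the delays, window, and threshold are illustrative, not a specific published model.

```python
# A minimal sketch of spatiotemporal sequence recognition via synaptic
# delays: the post neuron responds only when the delayed input spikes
# coincide, which happens for one particular input order.

def detects_sequence(spike_times, delays, window=1.0, threshold=3):
    """spike_times[i] is when pre neuron i spiked (ms); delays[i] is the
    conduction delay of synapse i. The sequence is recognized when at
    least `threshold` delayed spikes land inside one coincidence window."""
    arrivals = sorted(t + d for t, d in zip(spike_times, delays))
    for i in range(len(arrivals) - threshold + 1):
        if arrivals[i + threshold - 1] - arrivals[i] <= window:
            return True
    return False

# Delays tuned so the sequence A(0 ms) -> B(2 ms) -> C(4 ms) converges:
delays = [4.0, 2.0, 0.0]
print(detects_sequence([0.0, 2.0, 4.0], delays))  # True: all coincide at 4 ms
print(detects_sequence([4.0, 2.0, 0.0], delays))  # False: reversed order spreads out
```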