About Deep learning

Deep learning? I’ve heard of it

Five most probable labels of each ImageNet classification result (Krizhevsky et al., 2012; figure from EE837A 2015 lecture 1, part 2)

Deep learning is definitely a hot topic these days. It appeared suddenly and began devouring areas of classical image processing. It is remarkably good at object recognition and face recognition; for those tasks, Deep learning outperforms classical image processing algorithms, and in some benchmarks even humans.

Images that combine the content of a photograph with the style of several well-known artworks. (A Neural Algorithm of Artistic Style, 2015)

Currently, Deep learning goes far beyond image classification. A recent work used Deep learning to extract the artistic style of a painting and apply it to a photograph (A Neural Algorithm of Artistic Style, 2015).


What is it?

The key idea of Deep learning is imitating the neural networks of animals. A human brain contains about eighty-six billion neurons, and each of them performs a simple operation. Each neuron has input fibers (dendrites) and an output fiber (an axon), which connect to other neurons' axons and dendrites, respectively. What a neuron does is activate its axon when the activation level at its dendrites is high enough; the axon's activation is then delivered to the connected dendrites through synapses.
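The firing rule above can be sketched as a tiny threshold unit. This is my own minimal illustration (the function name and numbers are made up, not from the post): sum the weighted inputs and "fire" only when the total crosses a threshold.

```python
# A minimal sketch of an artificial neuron: the "dendrites" carry weighted
# inputs, and the "axon" fires only when their total activation is high enough.
def neuron(inputs, weights, threshold):
    # Total activation arriving at the dendrites.
    activation = sum(x * w for x, w in zip(inputs, weights))
    # The axon fires (outputs 1) only if the activation crosses the threshold.
    return 1 if activation >= threshold else 0

print(neuron([1, 0, 1], [0.5, 0.9, 0.4], 0.8))  # 0.5 + 0.4 = 0.9 >= 0.8, so it fires: 1
```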

A perceptron

Multi-layer perceptrons

Multilayer perceptrons

Deep learning has been in the spotlight for less than ten years, but its history starts in 1958, when F. Rosenblatt introduced the perceptron, which calculates a weighted sum of its inputs and activates its output when the sum is high enough. In 1986, the concept of multilayer perceptrons was introduced. The big advance was that the perceptron network is trained by back-propagating an error signal: a supervisor compares the output with the expected output, and decreases the weights of edges that led to wrong answers while increasing the weights of edges that led to correct ones. (We are going to talk about the backpropagation details in later posts.)
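The weight-adjustment idea can be sketched with Rosenblatt's original perceptron learning rule, which predates backpropagation but captures the same spirit of nudging weights toward correct answers. This is a hedged sketch (function and variable names are mine), shown here learning logical AND, which a single perceptron can represent:

```python
# Sketch of Rosenblatt's perceptron learning rule: whenever the output
# disagrees with the target, nudge each weight in the direction that would
# have made the answer correct.
def train_perceptron(samples, n_inputs, lr=0.1, epochs=20):
    weights = [0.0] * n_inputs
    bias = 0.0
    for _ in range(epochs):
        for inputs, target in samples:
            # Weighted sum of inputs, then a hard threshold at zero.
            total = sum(x * w for x, w in zip(inputs, weights)) + bias
            output = 1 if total > 0 else 0
            error = target - output  # -1, 0, or +1
            # Strengthen edges that should have fired, weaken the others.
            weights = [w + lr * error * x for w, x in zip(weights, inputs)]
            bias += lr * error
    return weights, bias

# Training data for logical AND: output 1 only when both inputs are 1.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train_perceptron(data, n_inputs=2)
```

For linearly separable problems like AND, this rule is guaranteed to converge to a separating set of weights.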

The idea of the multilayer perceptron is cool; however, the computing power of that time was not sufficient for complicated networks. As a result, researchers focused on shallow networks, which have only one hidden layer or none (hidden layers are the layers between the input layer and the output layer). Exploration of deep networks mainly started from 2006, with the assistance of GPGPU acceleration.
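A hidden layer is not just extra capacity; it changes what the network can represent at all. As a hedged illustration (the weights below are hand-picked by me, not from the post): a single perceptron cannot compute XOR, but a two-layer network of threshold units can.

```python
# Why hidden layers matter: XOR is not linearly separable, so no single
# perceptron computes it, but one hidden layer of threshold units suffices.
def step(x):
    # Hard threshold activation, as in the original perceptron.
    return 1 if x > 0 else 0

def xor_net(a, b):
    h1 = step(a + b - 0.5)      # hidden unit 1 computes OR
    h2 = step(-a - b + 1.5)     # hidden unit 2 computes NAND
    return step(h1 + h2 - 1.5)  # output unit computes AND of the hidden units

print([xor_net(a, b) for a, b in [(0, 0), (0, 1), (1, 0), (1, 1)]])  # [0, 1, 1, 0]
```

XOR decomposes as (a OR b) AND (a NAND b), so each unit alone solves a linearly separable subproblem.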
