Network Types

Activation Functions

Logistic - squashes any real-valued input into the range (0, 1)

https://www.coursera.org/learn/neural-networks/lecture/wfTkN/learning-the-weights-of-a-logistic-output-neuron-4-min
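A minimal sketch of the logistic function in plain Python (the function name is my own):

```python
import math

def logistic(z):
    # sigma(z) = 1 / (1 + e^(-z)); output lies strictly between 0 and 1
    return 1.0 / (1.0 + math.exp(-z))
```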

Softmax - converts a vector of real numbers into a discrete categorical distribution (non-negative outputs that sum to 1)

https://www.coursera.org/learn/neural-networks/lecture/68Koq/another-diversion-the-softmax-output-function-7-min
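A sketch of softmax in plain Python; the max-subtraction step is a standard numerical-stability trick, not part of the lecture's definition (it leaves the output unchanged):

```python
import math

def softmax(z):
    # Subtract the max before exponentiating so large inputs don't overflow;
    # shifting all inputs by a constant does not change the result.
    m = max(z)
    exps = [math.exp(v - m) for v in z]
    total = sum(exps)
    return [e / total for e in exps]
```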

Avoiding Overfitting / Improving Generalisation

https://www.coursera.org/learn/neural-networks/home/week/9

Hopfield Nets

https://www.coursera.org/learn/neural-networks/home/week/11

Energy Function

`E = - sum_(i) s_i b_i - sum_(i<j) s_i s_j w_(ij)`

The bias term is like a unit that is always on
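The energy function can be sketched directly from the formula above (the list-of-lists weight matrix is an assumed representation):

```python
def energy(s, b, w):
    # E = - sum_i s_i b_i - sum_{i<j} s_i s_j w_ij
    # s: unit states, b: biases, w: symmetric weight matrix, zero diagonal
    n = len(s)
    e = -sum(s[i] * b[i] for i in range(n))
    for i in range(n):
        for j in range(i + 1, n):  # each pair counted once (i < j)
            e -= s[i] * s[j] * w[i][j]
    return e
```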

Energy Gap

The difference in the global energy E between unit i being off and being on (with all other units held fixed)

`Delta E_i = E(s_i = 0) - E(s_i = 1) = b_i + sum_j s_j w_(ij)`
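The energy gap can be sketched as (function name mine):

```python
def energy_gap(i, s, b, w):
    # Delta E_i = b_i + sum_j s_j w_ij: how much the global energy drops
    # when unit i turns on, with the other units' states held fixed.
    return b[i] + sum(s[j] * w[i][j] for j in range(len(s)) if j != i)
```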

Update Rule

When units take the binary values 0 and 1

`Delta w_(ij) = 4(s_i - 1/2) (s_j - 1/2)`

When units take the binary values 1 and -1

`Delta w_(ij) = s_i s_j`
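With 1/-1 units, storing a set of patterns with this Hebbian rule can be sketched as (function name mine):

```python
def store_patterns(patterns):
    # Hebbian storage: Delta w_ij = s_i s_j, accumulated over the patterns.
    # Weights come out symmetric; the diagonal stays zero (no self-connections).
    n = len(patterns[0])
    w = [[0.0] * n for _ in range(n)]
    for p in patterns:
        for i in range(n):
            for j in range(n):
                if i != j:
                    w[i][j] += p[i] * p[j]
    return w
```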

Binary Stochastic Units

`P(S_i = 1) = 1 / (1 + e^(-(Delta E_i) / T))`

Starting at a high temperature T and gradually lowering it to settle into a low-energy state is called simulated annealing
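A sketch of a binary stochastic unit (names mine); sweeping T from high to low in an outer loop would give the annealing schedule:

```python
import math
import random

def prob_on(delta_e, T):
    # P(s_i = 1) = 1 / (1 + e^(-Delta E_i / T))
    # High T -> probabilities near 1/2; low T -> nearly deterministic threshold.
    return 1.0 / (1.0 + math.exp(-delta_e / T))

def sample_unit(delta_e, T, rng=random):
    # Turn the unit on with the probability above.
    return 1 if rng.random() < prob_on(delta_e, T) else 0
```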

Boltzmann Machines

Boltzmann machines are stochastic Hopfield nets with hidden units

© Will Robertson