==install==
*pip install
*conda install


==basics==



==neural theory==
*dropout: https://www.cs.toronto.edu/~hinton/absps/JMLRdropout.pdf
*cross entropy: http://colah.github.io/posts/2015-09-Visual-Information/
*softmax: https://en.wikipedia.org/wiki/Softmax_function
*max pooling:
*convolution:
*conv stride:
*conv padding:
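The stride and padding bullets above determine a convolution's output spatial size. A minimal sketch of the standard output-size formula, (n + 2*pad - k) / stride + 1 (the function name is my own, for illustration):

```python
# Output spatial size of a convolution: input size n, kernel size k,
# stride s, padding p. Standard formula: (n + 2p - k) / s + 1.
def conv_out_size(n, k, stride=1, pad=0):
    return (n + 2 * pad - k) // stride + 1

# e.g. a 32x32 input with a 5x5 kernel, stride 1, pad 2 keeps size 32
print(conv_out_size(32, 5, stride=1, pad=2))
```

Padding of (k - 1) / 2 with stride 1 preserves the input size ("same" padding); larger strides shrink the output.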

==ML learning==
*perceptron learning (linearly separable): w' = w + yx
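A minimal numpy sketch of the perceptron rule above, applying w' = w + yx on each misclassified example (the toy data, appended bias feature, and labels in {-1, +1} are assumptions for illustration):

```python
import numpy as np

def perceptron_train(X, y, epochs=100):
    """Perceptron learning: on a mistake, update w' = w + y*x.

    Converges only if the data is linearly separable.
    """
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * np.dot(w, xi) <= 0:  # misclassified (or on the boundary)
                w = w + yi * xi          # the update rule from the note
                errors += 1
        if errors == 0:                  # no mistakes: done
            break
    return w

# Toy separable 2D data; last column is a constant bias feature
X = np.array([[1.0, 2.0, 1.0], [2.0, 3.0, 1.0],
              [-1.0, -2.0, 1.0], [-2.0, -1.0, 1.0]])
y = np.array([1, 1, -1, -1])
w = perceptron_train(X, y)
print(np.sign(X @ w))  # matches y once training converges
```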


*k-nearest neighbour (Euclidean Distance)
*linear classifier Wx+b
*Hinge Loss (SVM): Loss = sum over j != y of max(0, s_j - s_y + 1)
*Weight Regularization (L2): R(W) = sum over k,l of (W_k,l)^2
*Cross-Entropy Loss (Softmax classifier): L = -log(e^(s_y) / sum_j e^(s_j)), ranges from 0 to inf
*mini-batch gradient descent (optionally with momentum)
*learning rate decay
*computational graph
*back propagation (calculating gradients via the chain rule)  (add gate = gradient distributor, max gate = gradient router, mul gate = gradient "switcher"/scaler)
*Jacobian matrix (matrix of partial derivatives: n inputs, m outputs, m*n sized; in practice you never need to form it explicitly)
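The two classifier losses above can be sketched in a few lines of numpy; the class scores and correct-class index here are made-up values for illustration:

```python
import numpy as np

s = np.array([3.2, 5.1, -1.7])  # class scores, e.g. s = Wx + b
y = 0                            # index of the correct class

# Multiclass hinge (SVM) loss: sum over j != y of max(0, s_j - s_y + 1)
hinge = np.sum(np.maximum(0, np.delete(s, y) - s[y] + 1))

# Softmax cross-entropy loss: -log of the normalized exponentiated
# correct-class score; subtracting max(s) is for numerical stability
p = np.exp(s - s.max())
p /= p.sum()
xent = -np.log(p[y])

print(hinge, xent)  # hinge is ~2.9 for these scores
```

Note the different behavior: hinge loss is exactly 0 once every margin is satisfied, while cross-entropy keeps pushing the correct-class probability toward 1.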

===activation function===
*sigmoid activation function, 1/(1+e^-x)
*tanh activation function, tanh(x)
*ReLU activation function, max(0,x)
*Leaky ReLU activation function, max(0.1x, x)
*Maxout activation function
*ELU activation function
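A small numpy sketch of the elementwise activations listed above (Maxout is omitted since it takes the max over several learned linear functions rather than applying a fixed formula; the 0.1 leaky slope and ELU alpha=1 are common defaults, not the only choices):

```python
import numpy as np

def sigmoid(x):    return 1.0 / (1.0 + np.exp(-x))   # squashes to (0, 1)
def tanh(x):       return np.tanh(x)                  # squashes to (-1, 1)
def relu(x):       return np.maximum(0, x)            # max(0, x)
def leaky_relu(x): return np.maximum(0.1 * x, x)      # max(0.1x, x)
def elu(x, a=1.0): return np.where(x > 0, x, a * (np.exp(x) - 1))

x = np.array([-2.0, 0.0, 2.0])
print(relu(x))        # zeroes out negatives
print(leaky_relu(x))  # keeps a small slope for negatives
```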

==dataset==
http://www.iis.ee.ic.ac.uk/icvl/ges_db.htm

http://www.cs.nyu.edu/~roweis/data.html


USPS digits: http://cs.nyu.edu/~roweis/data/_old_list


KITTI CARS: http://www.cvlibs.net/datasets/kitti/eval_object.php