TensorFlow for Deep Learning
If a network has 3 or more hidden layers, it is considered a deep network.

Activation Function

Hyperbolic Tangent: $$\tanh(z)$$, where $$z = wx + b$$

$$\cosh(x) = \frac{e^x + e^{-x}}{2}$$

$$\sinh(x) = \frac{e^x - e^{-x}}{2}$$

$$\tanh(x) = \frac{\sinh(x)}{\cosh(x)}$$

The tanh graph is an S-shaped curve that saturates at -1 and 1 (figure omitted).

Rectified Linear Unit (ReLU): a relatively simple function, $$\max(0, z)$$.

Cost Function

Quadratic Cost: $$C = \frac{\sum (y - a)^2}{n}$$, where $$a$$ is the network's output (activation), $$y$$ is the true value, and $$n$$ is the number of training samples.
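A minimal sketch (not from the original notes) of these pieces using TensorFlow's built-in ops; the values of `z`, `y`, and `a` below are hypothetical toy tensors chosen only for illustration:

```python
import tensorflow as tf

z = tf.constant([-2.0, -0.5, 0.0, 0.5, 2.0])

# Hyperbolic tangent: tanh(z) = sinh(z) / cosh(z), squashes z into (-1, 1)
tanh_out = tf.math.tanh(z)

# ReLU: max(0, z), keeps positive values and zeroes out the rest
relu_out = tf.nn.relu(z)

# Quadratic cost C = sum((y - a)^2) / n for a toy prediction a and target y
y = tf.constant([1.0, 0.0, 1.0])   # true labels (hypothetical)
a = tf.constant([0.9, 0.2, 0.7])   # network outputs (hypothetical)
quadratic_cost = tf.reduce_mean(tf.square(y - a))

print(tanh_out.numpy(), relu_out.numpy(), quadratic_cost.numpy())
```

In practice these same ops are usually applied through layer arguments such as `activation='relu'` and a loss like mean squared error rather than called by hand.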