Alec Radford’s animations for optimization algorithms

Alec Radford has created some great animations comparing optimization algorithms (SGD, Momentum, NAG, Adagrad, Adadelta, and RMSprop; unfortunately no Adam) on low-dimensional problems. Also check out his presentation on RNNs.
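For reference, here is a minimal sketch of three of the update rules the animations compare, written in textbook form (this is not Radford's code, and the hyperparameters are illustrative):

```python
import numpy as np

def sgd_step(w, grad, lr=0.01):
    # Vanilla SGD: step straight down the minibatch gradient.
    return w - lr * grad

def momentum_step(w, v, grad, lr=0.01, mu=0.9):
    # Momentum: accumulate a velocity that smooths noisy gradients,
    # which is also what causes the overshoot-and-correct behavior.
    v = mu * v - lr * grad
    return w + v, v

def rmsprop_step(w, cache, grad, lr=0.01, decay=0.9, eps=1e-8):
    # RMSprop: divide each step by a running RMS of recent gradients,
    # so even very large gradients produce bounded moves.
    cache = decay * cache + (1 - decay) * grad**2
    return w - lr * grad / (np.sqrt(cache) + eps), cache
```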

Noisy moons: This is logistic regression on the noisy moons dataset from sklearn, which shows the smoothing effect of momentum-based techniques (which also results in overshooting and subsequent correction). The error surface is visualized as an empirical average over the whole dataset, but the trajectories show the dynamics of minibatches on noisy data. The bottom chart is an accuracy plot.
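A rough recreation of that setup (the noise level and learning rate here are guesses, not the values used in the animation):

```python
import numpy as np
from sklearn.datasets import make_moons
from sklearn.linear_model import SGDClassifier

# Two interleaving half-circles with label noise.
X, y = make_moons(n_samples=500, noise=0.3, random_state=0)

# Logistic regression trained by plain SGD, one pass at a time, so the
# decision boundary wobbles with the noise much like the trajectories do.
clf = SGDClassifier(loss="log_loss", learning_rate="constant", eta0=0.1,
                    random_state=0)
for epoch in range(20):
    clf.partial_fit(X, y, classes=np.array([0, 1]))
    print(f"epoch {epoch}: accuracy = {clf.score(X, y):.3f}")
```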

Beale’s function: Due to the large initial gradient, velocity-based techniques shoot off and bounce around; Adagrad almost goes unstable for the same reason. Algorithms that scale gradients/step sizes, like Adadelta and RMSprop, proceed more like accelerated SGD and handle large gradients with greater stability.
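Beale’s function is a standard test surface with a global minimum at (3, 0.5) and very steep walls, which is what produces those large initial gradients. A sketch of RMSprop on it (the starting point and hyperparameters are guesses, not the animation's settings):

```python
import numpy as np

def beale_grad(p):
    # Gradient of Beale's function
    # f(x, y) = (1.5 - x + xy)^2 + (2.25 - x + xy^2)^2 + (2.625 - x + xy^3)^2
    x, y = p
    a = 1.5 - x + x * y
    b = 2.25 - x + x * y**2
    c = 2.625 - x + x * y**3
    dx = 2 * a * (y - 1) + 2 * b * (y**2 - 1) + 2 * c * (y**3 - 1)
    dy = 2 * a * x + 4 * b * x * y + 6 * c * x * y**2
    return np.array([dx, dy])

p = np.array([1.0, 1.5])          # arbitrary start on a steep slope
cache = np.zeros(2)
lr, decay, eps = 0.01, 0.9, 1e-8
for _ in range(2000):
    g = beale_grad(p)
    # The running RMS of gradients divides the step, so even the huge
    # initial gradient yields a move of only about lr per coordinate.
    cache = decay * cache + (1 - decay) * g**2
    p -= lr * g / (np.sqrt(cache) + eps)
print(p)  # should end up near the minimum at (3, 0.5)
```

A plain SGD or momentum update at the same starting point takes a first step proportional to the raw gradient, which is what sends those trajectories flying in the animation.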


OSX app doesn't show in the Dock after launch

After switching back to a Mac, I've run into all kinds of odd problems. For example, after launching Cocos it can't be found in the Dock, and once I have many windows open it takes forever to find it each time (mainly because I don't know all the magic keyboard shortcuts).

Fixing this isn't hard: quit Cocos, find Cocos in Launchpad and drag it straight into the Dock, then relaunch Cocos and it should show up.
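If you'd rather script that than drag by hand, something like the following sketch should do the same thing via the Dock's preferences (the app path is a guess; adjust it to wherever Cocos is actually installed):

```python
import subprocess

# Hypothetical path; change to your actual Cocos install location.
APP_PATH = "/Applications/Cocos.app"

# Append a Dock entry for the app using the Dock's persistent-apps format.
entry = (
    "<dict><key>tile-data</key><dict><key>file-data</key><dict>"
    f"<key>_CFURLString</key><string>{APP_PATH}</string>"
    "<key>_CFURLStringType</key><integer>0</integer>"
    "</dict></dict></dict>"
)
subprocess.run(
    ["defaults", "write", "com.apple.dock", "persistent-apps",
     "-array-add", entry],
    check=True,
)

# The Dock only re-reads its preferences when restarted.
subprocess.run(["killall", "Dock"], check=True)
```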