Catastrophic forgetting is a key issue in the continual learning paradigm. Training algorithms such as FORCE appear to bypass it to some extent. Chen and Barak applied fixed-point analysis to show explicitly how the fixed-point structure of a network changes during training in a continual learning scenario. Their work provides intuition about how the learning algorithm and the order of the task sequence affect training in continual learning.
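
The fixed-point analysis referenced here usually amounts to numerically searching for points where the network dynamics are (nearly) stationary. Below is a minimal sketch of that idea, assuming a vanilla rate network with update x → tanh(W x) and using SciPy's general-purpose optimizer; the function name and dynamics are illustrative assumptions, not taken from the paper.

```python
import numpy as np
from scipy.optimize import minimize

def find_fixed_point(W, x0):
    """Search for a candidate fixed point x* = tanh(W @ x*) by minimizing
    the 'speed' q(x) = 0.5 * ||tanh(W x) - x||^2 from an initial guess x0."""
    q = lambda x: 0.5 * np.sum((np.tanh(W @ x) - x) ** 2)
    res = minimize(q, x0, method="L-BFGS-B")
    return res.x, res.fun  # candidate fixed point and its residual speed
```

Running this from many initial conditions (e.g. states visited during each task) gives a picture of how the fixed-point landscape rearranges as new tasks are learned.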

  • FORCE: slower convergence within a single trial, but faster convergence across multiple trials.
  • FORCE: less biologically plausible, but more powerful for sequential learning.
  • LMS (least mean squares): a smaller learning rate leads to less interference with previously learned tasks (compare the update rules sketched below).
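
To make the contrast concrete, here is a minimal sketch of the two readout-weight updates, assuming a rate vector r, a one-dimensional target, and a standard recursive-least-squares formulation of FORCE; the function names and the step size eta are illustrative assumptions.

```python
import numpy as np

def lms_step(w, r, target, eta=1e-3):
    """One LMS update of readout weights w for activity r and a scalar target.
    A smaller eta gives smaller corrections and hence less interference."""
    e = w @ r - target        # readout error
    return w - eta * e * r    # fixed-step gradient-style correction

def force_step(w, P, r, target):
    """One FORCE (recursive least squares) update of w and of P, a running
    estimate of the inverse correlation matrix of r."""
    Pr = P @ r
    k = Pr / (1.0 + r @ Pr)   # RLS gain: large early in training, small later
    e = w @ r - target        # error before the update
    w = w - e * k             # adaptive correction driven by the gain
    P = P - np.outer(k, Pr)   # update the inverse correlation estimate
    return w, P
```

The key design difference is that LMS uses a fixed learning rate, whereas FORCE's effective learning rate adapts through P, producing aggressive corrections early on and progressively smaller ones as the error shrinks.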