The article discusses catastrophic forgetting in artificial neural networks, where a network loses the ability to perform an earlier task after being trained on a new one. It introduces Elastic Weight Consolidation (EWC) as a method to address this issue: by penalizing changes to weights that were important for previous tasks, EWC lets a network retain prior knowledge while learning new tasks, improving performance and stability in sequential (continual) learning scenarios.
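As a rough illustration of the idea, the EWC loss adds a quadratic penalty that anchors each weight near its value after the previous task, scaled by an estimate of that weight's importance (commonly a diagonal Fisher information approximation). The sketch below is not the article's code; the function and parameter names (`ewc_penalty`, `lam`) are illustrative.

```python
import numpy as np

def ewc_penalty(params, old_params, fisher, lam=1.0):
    """Quadratic EWC penalty: (lam / 2) * sum_i F_i * (theta_i - theta*_i)^2.

    params:     current weights while training on the new task (theta)
    old_params: weights saved after the previous task (theta*)
    fisher:     per-weight importance, e.g. a diagonal Fisher estimate
    lam:        hyperparameter weighting old-task retention (illustrative name)
    """
    return 0.5 * lam * np.sum(fisher * (params - old_params) ** 2)

# Total training objective on the new task would be:
#   new_task_loss + ewc_penalty(...)
# so weights with large Fisher values are held near their old values,
# while unimportant weights remain free to move.
theta = np.array([1.0, 2.0, 3.0])       # current weights
theta_star = np.array([1.0, 1.0, 1.0])  # weights after the old task
fisher = np.array([10.0, 1.0, 0.0])     # third weight is unimportant
penalty = ewc_penalty(theta, theta_star, fisher, lam=1.0)
# Only the second term contributes here: 0.5 * 1.0 * (2.0 - 1.0)**2 = 0.5
```

In practice the Fisher values are estimated from gradients of the old task's loss, and the penalty is added to the new task's loss at every training step.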
Source link: https://towardsdatascience.com/continual-learning-a-deep-dive-into-elastic-weight-consolidation-loss-7cda4a2d058c