New 'Forgetting' AI Model Learns Languages Faster and More Flexibly
- Researchers created a more flexible AI language model that can "selectively forget" some of what it has learned, making it easier to adapt to new data or tasks.
- Forgetting helps the model learn new languages with less data and computing power than is usually required.
- Periodically resetting parts of the model during initial training makes it better at adapting later on.
- Forgetting may work because deeper layers of the network capture abstract concepts common across languages.
- More flexible "forgetting models" could bring the latest AI advances to more languages beyond English and Spanish.
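The periodic-reset idea in the highlights above can be illustrated with a minimal sketch. This is a hypothetical toy stand-in, not the researchers' actual training code: a "model" is split into an embedding part (reset every few steps) and deeper weights (never reset), showing how the deeper weights keep accumulating learning while the embedding layer is repeatedly forgotten.

```python
import random

def init_embedding(size, rng):
    """Fresh random embedding weights, as used at each reset."""
    return [rng.uniform(-0.1, 0.1) for _ in range(size)]

def train_with_periodic_forgetting(steps, reset_every, size=4, seed=0):
    """Toy training loop: every `reset_every` steps, the embedding
    layer is re-initialized ("forgotten") while deeper weights persist."""
    rng = random.Random(seed)
    embedding = init_embedding(size, rng)
    deep_weights = [0.0] * size  # stand-in for the network's deeper layers
    resets = 0
    for step in range(1, steps + 1):
        # Stand-in for a gradient step: nudge every parameter slightly.
        for i in range(size):
            embedding[i] += 0.01
            deep_weights[i] += 0.01
        if step % reset_every == 0:
            embedding = init_embedding(size, rng)  # forget the token layer
            resets += 1
    return resets, deep_weights

resets, deep = train_with_periodic_forgetting(steps=100, reset_every=25)
# The deeper layers keep all 100 accumulated updates, even though the
# embedding layer was wiped 4 times along the way.
```

The point of the sketch is the asymmetry: after training, the deeper weights reflect everything learned, while the embedding layer only holds what was learned since the last reset, which is the intuition behind swapping in a new language's embeddings cheaply.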