Emergent Abilities of Large Language Models
"Emergent Abilities of Large Language Models" refers to a capability present in an LLM that is not found in a similar but smaller model. This also means that such a capability cannot be predicted (extrapolated) solely from the capabilities of a smaller model.
[1] We consider an ability to be emergent if it is not present in smaller models but is present in larger models.
[1] We will consider the following general definition of emergence, adapted from Steinhardt (2022) and rooted in a 1972 essay called “More Is Different” by Nobel prize-winning physicist Philip Anderson
[1] Emergence is when quantitative changes in a system result in qualitative changes in behavior.
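To make the definition concrete, the sketch below uses purely hypothetical numbers (not taken from the cited papers) to show the typical shape of an emergent ability: task accuracy stays near random chance across several smaller model sizes, then jumps once a scale threshold is crossed, so a curve fitted to the small models alone would not predict the ability.

```python
# Illustrative sketch of an emergent ability (hypothetical numbers).
# Accuracy on a 4-way classification task stays at chance (0.25) for small
# models, then rises sharply past a scale threshold; extrapolating from the
# small models alone would suggest the ability never appears.
model_sizes = [1e8, 3e8, 1e9, 3e9, 1e10, 1e11]       # parameter counts (assumed)
accuracy    = [0.25, 0.25, 0.26, 0.27, 0.55, 0.80]   # chance level = 0.25

for n, acc in zip(model_sizes, accuracy):
    bar = "#" * int(acc * 40)
    print(f"{n:>14,.0f} params | {bar} {acc:.2f}")
```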
The axes along which these models are scaled are: the size of the training dataset, the number of model parameters, and the compute required to train the model (the latter two often being correlated).
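A common rule of thumb from the scaling-law literature (e.g. Kaplan et al., 2020, not one of the quoted sources) ties these axes together: training compute is roughly 6 FLOPs per parameter per training token, which is why model size and training compute usually grow together. A minimal sketch with illustrative numbers:

```python
def approx_training_flops(n_params: float, n_tokens: float) -> float:
    """Rule-of-thumb estimate: ~6 FLOPs per parameter per training token."""
    return 6.0 * n_params * n_tokens

# Illustrative values (not from the cited papers): a 1.5B-parameter model
# trained on 300B tokens.
print(f"~{approx_training_flops(1.5e9, 300e9):.1e} FLOPs")  # ~2.7e+21
```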
[2] Although scaling is mainly conducted in model size (with similar architectures and pre-training tasks), these large-sized PLMs display different behaviors from smaller PLMs (e.g., 330M-parameter BERT and 1.5B-parameter GPT-2) and show surprising abilities (called emergent abilities) in solving a series of complex tasks.
[2] For example, GPT-3 can solve few-shot tasks through in-context learning, whereas GPT-2 cannot do well.
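To make "in-context learning" concrete: a few-shot prompt simply places worked examples in the model's context, and the model is expected to continue the pattern without any gradient update. A minimal sketch of such a prompt (illustrative toy task and examples, not taken from the cited papers):

```python
# Few-shot prompt for a toy sentiment task: labelled examples appear in the
# context window, and the model must continue the pattern for the last input.
few_shot_prompt = """\
Review: The movie was a delight from start to finish.
Sentiment: positive

Review: I left the theatre halfway through, it was that dull.
Sentiment: negative

Review: A surprisingly moving story with great acting.
Sentiment:"""

# Sent as-is to a large language model, a model with in-context learning
# ability completes the prompt with " positive"; no weights are updated.
print(few_shot_prompt)
```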