https://venturebeat.com/2020/07/24/ai-weekly-the-promise-and-shortcomings-of-openais-gpt-3/

Emily Bender is a professor, a linguist, and a member of the University of Washington’s NLP group. Last month, a paper she coauthored about large language models like GPT-3 argued that the hype around such models shouldn’t mislead people into believing the models are capable of understanding or meaning. The paper won an award at the Association for Computational Linguistics conference.

“While large neural language models may well end up being important components of an eventual full-scale solution to human-analogous natural language understanding, they are not nearly-there solutions to this grand challenge,” the paper reads.