The most important research paper of the past 10 years is the Google transformer paper ("Attention Is All You Need"), and it was written by non-academics and published in an open-access journal. His point is that non-academics are doing better research than academics.
It is true that nothing stops non-academics from doing and distributing cutting-edge research, but this is not an example of it. The work was done by PhD computer scientists and published at an academic conference.
To an outsider, getting a PhD must seem like a big waste of time. But nearly all the good research is done by PhDs.
The paper was one of the more important papers leading to AI large language models, and Google later got a patent on it. Some claim that the T in GPT stands for the transformer of this paper, but OpenAI did not claim that until GPT-3. Curiously, Google's patent only claimed the transformer as an encoder-decoder model, not the decoder-only model that the LLMs use. Others figured out how to use it for products like ChatGPT.