The latest NLP revolution

Read an interview with Noam Shazeer, who helped spark the latest NLP revolution. “He developed the multi-headed self-attention mechanism described in ‘Attention Is All You Need,’ the 2017 paper that introduced the transformer network. That architecture became the foundation of a new generation of models that have a much firmer grip on the vagaries of human language. Shazeer’s grandparents fled the Nazi Holocaust to the former Soviet Union, and he was born in Philadelphia in 1976 to a multilingual math teacher turned engineer and a full-time mom.”
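For readers unfamiliar with the mechanism the quote refers to, here is a minimal sketch of multi-head self-attention in NumPy. It omits masking, biases, and batching, and all names, shapes, and dimensions are illustrative assumptions, not drawn from the paper's reference implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(X, Wq, Wk, Wv, Wo, num_heads):
    # X: (seq_len, d_model) input sequence.
    # Wq, Wk, Wv, Wo: (d_model, d_model) projection matrices
    # (hypothetical random weights here, not trained parameters).
    seq_len, d_model = X.shape
    d_head = d_model // num_heads

    # Project to queries/keys/values, then split into heads:
    # (seq_len, d_model) -> (num_heads, seq_len, d_head)
    def split_heads(M):
        return M.reshape(seq_len, num_heads, d_head).transpose(1, 0, 2)

    Q, K, V = split_heads(X @ Wq), split_heads(X @ Wk), split_heads(X @ Wv)

    # Scaled dot-product attention per head: softmax(Q K^T / sqrt(d_head)) V
    scores = Q @ K.transpose(0, 2, 1) / np.sqrt(d_head)
    heads = softmax(scores, axis=-1) @ V  # (num_heads, seq_len, d_head)

    # Concatenate the heads and apply the output projection.
    concat = heads.transpose(1, 0, 2).reshape(seq_len, d_model)
    return concat @ Wo

# Toy usage: shapes only, with random (untrained) weights.
rng = np.random.default_rng(0)
d_model, seq_len, num_heads = 8, 5, 2
X = rng.normal(size=(seq_len, d_model))
W = [rng.normal(size=(d_model, d_model)) for _ in range(4)]
out = multi_head_self_attention(X, *W, num_heads=num_heads)
print(out.shape)  # (5, 8)
```

The key idea is that each head attends over the sequence independently in a lower-dimensional subspace, letting the model capture several kinds of relationships between tokens at once.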
