Language Modelling at Scale

In this blog post, they discuss three new papers. They trained a range of language models from 44 million up to 280 billion parameters.

5 Likes

I’m not sure if they have a plan to release it as an API.

Also, they chose to call it Gopher :joy::joy::joy: are you kidding me?

2 Likes

Perhaps it is eco-friendly to recycle names that have already been used?

1 Like

So it is the reincarnation of the predecessor of the WWW. All right, then.

1 Like

It will be followed by future models like Sloth, Beaver, Hedgehog, and Lemur.

2 Likes

I eagerly await Sloth™. :sloth:

4 Likes

Brb, gonna bring back ‘Commodore’ from the dead right quick

3 Likes

Could be a call back to the early days? Gopher, Veronica, Archie, Lynx browser… those were the days! :wink:

2 Likes

This is the issue with these advertisements (I mean papers). It does look good on paper, but where is the code or an API, as OpenAI has provided? Until then, it is ether. I think some companies, like Google, will likely wrap the functionality into their existing product line (e.g. Google Assistant, Google Workspace).

2 Likes

This does bring back old memories: Gopher was my first interface to the web.

I’m thinking of Gopher as the mascot of the Go language lol

1 Like