Python Daily
Daily Python News
Questions, tips and tricks, and best practices on the Python programming language
Find more reddit channels over at @r_channels
[N] OpenAI releasing the 345M model of GPT-2 and sharing the 1.5B model "with partners working on countermeasures"

OpenAI has decided to adopt a staged-release approach for their GPT-2 language model.

Announcement on Twitter: https://twitter.com/OpenAI/status/1124440412679233536

The following quotes are from the update on their blog: https://openai.com/blog/better-language-models/#update

# Staged Release

>Staged release involves the gradual release of a family of models over time. The purpose of our staged release of GPT-2 is to give people time to assess the properties of these models, discuss their societal implications, and evaluate the impacts of release after each stage.

>As the next step in our staged release strategy, we are releasing the 345M parameter version of GPT-2. This model features improved performance relative to the 117M version, though falls short of the 1.5B version with respect to the ease of generating coherent text. We have been excited to see so many positive uses of GPT-2-117M, and hope that 345M will yield still more benefits.

>While the misuse risk of 345M is higher than that of 117M, we believe it is substantially lower than that of 1.5B, and we believe that training systems of similar capability to GPT-2-345M is well within the reach of many actors already; this evolving replication landscape has informed our decision-making about what is appropriate to release.
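The model names above refer to approximate parameter counts. As a rough sanity check, those counts can be recovered from the published GPT-2 hyperparameters (number of layers and hidden width) with the standard back-of-the-envelope estimate of 12·L·d² weights in the transformer blocks plus the token and position embeddings; this is a common approximation, not OpenAI's exact accounting:

```python
# Rough parameter-count estimate for the GPT-2 family.
# Layer/width configs are from the GPT-2 paper; the 12 * L * d^2 formula
# is a standard approximation for a decoder-only transformer.

VOCAB_SIZE = 50257  # GPT-2 BPE vocabulary size
CONTEXT = 1024      # maximum sequence length

def approx_params(n_layers: int, d_model: int) -> int:
    """Approximate parameter count of a GPT-2-style decoder."""
    blocks = 12 * n_layers * d_model ** 2  # attention + MLP weights per layer
    token_emb = VOCAB_SIZE * d_model       # input embedding (tied with output)
    pos_emb = CONTEXT * d_model            # learned position embeddings
    return blocks + token_emb + pos_emb

# (layers, hidden width) for each released/announced size
configs = {"117M": (12, 768), "345M": (24, 1024), "1.5B": (48, 1600)}
for name, (layers, width) in configs.items():
    print(f"{name}: ~{approx_params(layers, width) / 1e6:.0f}M parameters")
```

Running this reproduces the advertised sizes to within a few percent, which is why the 117M/345M/1.5B labels track the (layers, width) scaling so directly.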

>In making our 345M release decision, some of the […]

/r/MachineLearning
https://redd.it/bkejvb