GPT-2 1.5B or a 3rd-party model trained with at least 40GB of text will be released before 2020
Created by AlexLamson on 2019-05-29; known on 2020-01-01; judged right by AlexLamson on 2019-09-15.
- AlexLamson estimated 70% on 2019-05-29
- PseudonymousUser estimated 65% on 2019-05-29
- pranomostro estimated 60% on 2019-05-31
- PseudonymousUser estimated 15% on 2019-08-06
- AlexLamson said “‘To train our GPT-2 model we created a 37 GB WebText dataset’ (https://nv-adlr.github.io/MegatronLM). Dang, so close” on 2019-08-15
- AlexLamson said “https://github.com/salesforce/ctrl/blob/master/README.md” on 2019-09-15
- AlexLamson judged this prediction right on 2019-09-15.