[D] GPT-3, The $4,600,000 Language Model (self.MachineLearning)
submitted 5 years, 7 months ago* (edited 18 hours, 40 minutes after) by mippie_moe to /r/MachineLearning


Just to put your numbers in perspective: your "doesn't seem that outrageously high" is more than 120 fully funded three-year PhD positions in Denmark.
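The arithmetic implied by that comparison can be sketched as follows. The per-position cost below is a hypothetical figure back-derived from the comment's own claim, not an official Danish funding number:

```python
# Back-of-the-envelope check of the ">120 PhD positions" comparison.
# COST_PER_PHD_USD is an assumed illustrative figure implied by the
# comment, not a verified Danish funding amount.
TRAINING_COST_USD = 4_600_000   # headline GPT-3 training-cost estimate
COST_PER_PHD_USD = 38_000       # assumed fully funded 3-year position

positions = TRAINING_COST_USD / COST_PER_PHD_USD
print(f"{positions:.0f} three-year PhD positions")  # ~121
```

Whatever the exact per-position cost, the point of the comparison is the order of magnitude: one training run versus a department's worth of doctoral funding.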
Hi, you might have missed the relevant context of my reply:

> A budget that is larger than the yearly budget of whole CS departments is outrageously high.
Yes, a lot of what you mention is outrageous. But it is even more outrageous when it happens within the same field. E.g., as an experimental particle physicist, I can expect my research to be expensive, and thus I can also expect to be granted more money by funding agencies (or access to those facilities at reasonable prices).
This does not happen in ML. Most of this research will not be reproducible by independent parties, and given the extent of errors, under-reporting, and misreporting in this field, that is bad for science.
I gave examples from the same field; I am talking about the same fields where academic funding is much smaller than industry's (while including many more people, as a counterbalance to that).
It's a very commonly used standard accelerator for deep learning workloads. The top two supercomputers in the world on the Top500 list for the past two years were built with V100s. They are absurdly expensive, but they are (for now) a definitive standard in high-performance computing.
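For context on where the $4.6M headline figure comes from: it is commonly back-derived from cloud-priced V100 GPU-time. A rough sketch, with both inputs assumed round numbers (the widely cited ~355 V100-years for one GPT-3 training run, and an assumed discounted cloud rate of about $1.50 per GPU-hour):

```python
# Rough reconstruction of the $4.6M estimate from V100 cloud pricing.
# Both inputs are assumed round numbers, not measured figures.
GPU_YEARS = 355           # widely cited V100-years for one GPT-3 run
USD_PER_GPU_HOUR = 1.50   # assumed discounted cloud V100 hourly rate
HOURS_PER_YEAR = 24 * 365

cost = GPU_YEARS * HOURS_PER_YEAR * USD_PER_GPU_HOUR
print(f"${cost:,.0f}")  # $4,664,700 — roughly the $4.6M headline
```

Note this is the cost of renting the compute at list-ish cloud prices for a single run; it excludes failed runs, hyperparameter searches, and engineering salaries, so the true program cost is higher.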