haxor@derp.foo to Hacker News@derp.foo · 11 months ago
NY Times is asking that ALL LLMs trained on Times data be destroyed (twitter.com)
Lvxferre@lemmy.ml · 11 months ago

> 2 seconds later someone can train a new one

"Training" datasets:

- GPT-3: 300? 500? billion tokens
- Bard: 1.5 trillion words
- GPT-4: 13 trillion tokens

Does this look like the amount of content that you'd get in two seconds???

Maybe they should learn to code like those coal miners they pitied. And maybe you should go back to Reddit.
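For scale, a minimal back-of-envelope sketch of that point (both figures are assumptions for illustration, not measured numbers): even at a hypothetical aggregate throughput of a billion tokens per second, a single pass over GPT-4's cited 13 trillion tokens takes hours, not seconds.

```python
# Back-of-envelope check on the "two seconds" claim.
# Both figures below are assumptions for illustration, not measured numbers.
dataset_tokens = 13e12   # ~13 trillion tokens, the GPT-4 figure cited above
tokens_per_sec = 1e9     # hypothetical aggregate training throughput

seconds = dataset_tokens / tokens_per_sec
print(f"{seconds:,.0f} s ≈ {seconds / 3600:,.1f} h")
# -> 13,000 s ≈ 3.6 h for a single pass, even at a billion tokens/second
```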