Pro@programming.dev to Technology@lemmy.world · English · 2 days ago
The Collapse of GPT: Will future artificial intelligence systems perform increasingly poorly due to AI-generated material in their training data? (cacm.acm.org)
58 comments · cross-posted to: [email protected]
Grandwolf319@sh.itjust.works · English · 1 day ago
Maybe, but even if that's not an issue, there is a bigger one: the law of diminishing returns. To double performance, it takes much more than double the data. Right now LLMs aren't profitable even though they're more efficient than just throwing more data at the problem. All this AI craze has taught me is that the human brain is super advanced, given its performance while running on the energy of a light bulb.
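A rough sketch of the diminishing-returns point above, assuming the power-law data-scaling form reported in LLM scaling-law work (loss ≈ A / D^α, with α ≈ 0.095 in the ballpark Kaplan et al. (2020) report for data scaling); the constant A and the starting corpus size are made up for illustration:

```python
# Hypothetical illustration: under a power-law data-scaling assumption,
# loss(D) = A / D**alpha, halving the loss ("doubling performance")
# requires far more than doubling the data.
A = 1.0        # illustrative scale constant (assumption)
alpha = 0.095  # data-scaling exponent, roughly per Kaplan et al. 2020

def loss(d_tokens: float) -> float:
    return A / d_tokens ** alpha

d0 = 1e9                       # pretend starting corpus: 1B tokens
target = loss(d0) / 2          # "double performance" = halve the loss
d_needed = (A / target) ** (1 / alpha)

print(f"multiplier on data to halve loss: {d_needed / d0:,.0f}x")
# -> roughly 2**(1/alpha), about 1,500x more data, not 2x
```

With an exponent that small, halving the loss costs about three orders of magnitude more data, which is the commenter's point in concrete terms.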
AItoothbrush@lemmy.zip · English · 1 day ago
It's very efficient specifically at what it does. When you do math in your head it's very inefficient, the same way doing brain-style tasks on a math machine is.
rottingleaf@lemmy.world · English · 1 day ago
> All this AI craze has taught me is that the human brain is super advanced, given its performance while running on the energy of a light bulb.
That seemed superficially obvious. The human brain is a system whose optimization took the energy of evolution since the start of life on Earth, that is, a vastly larger amount of data. It's like comparing a barrel of oil to a barrel of soured milk.
RaptorBenn@lemmy.world · English · 1 day ago
If it weren't a fledgling technology with a lot more advancements still to be made, I'd worry about that.