voidxM to Futurology@futurology.today · English · 7 months ago
AI Companies Running Out of Training Data After Burning Through Entire Internet (futurism.com)
CanadaPlus · 7 months ago (edited): Well, it’s established wisdom that the dataset size needs to scale with the number of model parameters. Quadratically, IIRC. If you don’t have that much data the training basically won’t work; it will overfit or just not progress.
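As a rough sketch of the sizing rule the comment alludes to: the commenter recalls quadratic scaling, but the widely cited Chinchilla results (Hoffmann et al., 2022) suggest roughly *linear* scaling, on the order of 20 training tokens per parameter. The helper name and the 20-tokens-per-parameter constant below are illustrative, not from the thread.

```python
# Rule-of-thumb estimate of compute-optimal dataset size for a model.
# Assumption: roughly linear scaling at ~20 tokens per parameter,
# per the Chinchilla results; the exact ratio varies in practice.

def optimal_training_tokens(n_params: int, tokens_per_param: int = 20) -> int:
    """Estimate compute-optimal training tokens for a model of n_params."""
    return n_params * tokens_per_param

# A 70B-parameter model would want on the order of 1.4 trillion tokens,
# which makes it easy to see how web-scale text gets exhausted quickly.
print(optimal_training_tokens(70_000_000_000))  # 1_400_000_000_000
```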