• General_Effort · 6 months ago

    In theory there’s an inflection point at which models become sophisticated enough that they can sustain themselves by generating their own training data and recursively improve.

    That sounds surprising. Do you have a source?