I’m rather curious to see how the EU’s privacy laws are going to handle this.

(Original article is from Fortune, but Yahoo Finance doesn’t have a paywall)

  • Strawberry@lemmy.blahaj.zone · 10 months ago

    You could certainly break up the training data, but splitting a model into mini-models based on which training data was used wouldn’t work for neural networks trained with gradient descent. The state of the model depends on the totality of the training data it has seen (and the order it was seen in), so it isn’t possible to remove the effect of a specific training data point without retraining on all of the data that followed it. Even that assumes you stored a snapshot of the model before every single training example, which I doubt anyone does.
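    The dependence on every earlier update can be shown with a toy example (numbers and model are hypothetical, not from the post): each SGD step starts from the parameters produced by all previous steps, so naively "subtracting" one point's gradient from the final weights does not recover the model you would have gotten by training without that point.

    ```python
    # Toy 1-parameter linear model trained with SGD on squared error.
    # Illustrates why removing one data point's effect requires replaying
    # training, not just reversing that point's gradient step.

    def sgd(points, w=0.0, lr=0.1):
        for x, y in points:
            grad = 2 * (w * x - y) * x   # d/dw of (w*x - y)**2
            w -= lr * grad               # each step depends on current w
        return w

    data = [(1.0, 2.0), (2.0, 3.0), (3.0, 7.0)]

    w_full = sgd(data)                   # trained on all three points
    w_clean = sgd(data[:1] + data[2:])   # retrained from scratch without point 2

    # Naive "unlearning": undo point 2's update using the FINAL weights,
    # i.e. the wrong parameter state -- not the state at training time.
    x, y = data[1]
    w_naive = w_full + 0.1 * 2 * (w_full * x - y) * x

    print(w_full, w_clean, w_naive)      # w_naive differs from w_clean
    ```

    Because the gradient at each step is evaluated at whatever parameters the earlier data produced, the only exact fix is what the comment says: retrain from a snapshot taken before the offending point, or from scratch.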

    However, that’s no excuse: it is of course possible to retrain a network from scratch on a clean dataset, and that is what these companies should do.