
Yes, lots of activity in the space. I thought you were saying it was a dumb problem, but I was wrong.

I think this is a great paper.



Yup, if you look at dropout (what it does and why), you can see additional interesting results along these lines.

(Dropout was found to increase resilience in models because they had to encode information in the weights differently, i.e. they could not rely on any single neuron, in the limit.)
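
A minimal sketch of that mechanism, assuming plain NumPy and an illustrative 50% rate (the function name and shapes are made up for the example):

    import numpy as np

    def dropout_forward(x, p=0.5, training=True, rng=None):
        # Inverted dropout: zero a random fraction p of activations during
        # training and rescale the survivors, so inference needs no change.
        if not training or p == 0.0:
            return x
        rng = rng or np.random.default_rng()
        mask = (rng.random(x.shape) >= p).astype(x.dtype)
        return x * mask / (1.0 - p)

    # Because any unit can be zeroed on a given pass, the network cannot
    # stash a feature in a single neuron; the information has to be spread
    # redundantly across the weights.
    h = np.random.default_rng(0).standard_normal((4, 8))
    print(dropout_forward(h, p=0.5))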


I suppose, except that for a model with 7B parameters, the number of possible dropout masks you'd be analyzing is 2^7B. More importantly, dropout has loss minimization to guide it during training, whereas understanding how a model changes when you edit a few weights is a very broad question.


The analysis is more akin to analyzing with and without dropout, where a common setting is to drop a random 50% of connections during a training pass, thus forcing the model not to rely on specific nodes or connections.
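
A hedged sketch of that with/without comparison (hypothetical two-layer PyTorch model; only the 50% rate comes from the comment above):

    import torch
    import torch.nn as nn

    # Hypothetical toy model, just to contrast passes with and without dropout.
    model = nn.Sequential(
        nn.Linear(16, 32),
        nn.ReLU(),
        nn.Dropout(p=0.5),   # drop a random 50% of activations each pass
        nn.Linear(32, 4),
    )
    x = torch.randn(1, 16)

    model.train()            # dropout active: every pass uses a fresh mask
    with torch.no_grad():
        noisy = [model(x) for _ in range(5)]

    model.eval()             # dropout disabled: deterministic pass
    with torch.no_grad():
        clean = model(x)

    # The spread across the noisy passes hints at how much the output leans
    # on the specific connections that dropout keeps knocking out.
    print(torch.stack(noisy).std(dim=0))
    print(clean)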

When you look at a specific input, you can see what gets activated and what doesn't. That's an orthogonal but related idea: inspecting the activations to see the effects.
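
For instance, a self-contained sketch of recording what fires for one input, assuming PyTorch forward hooks and a made-up toy model:

    import torch
    import torch.nn as nn

    # Hypothetical toy model; the point is the hook, not the architecture.
    model = nn.Sequential(nn.Linear(16, 32), nn.ReLU(), nn.Linear(32, 4))

    activations = {}

    def save_activation(name):
        def hook(module, inputs, output):
            activations[name] = output.detach()
        return hook

    # A forward hook on each layer records what actually fires for this
    # specific input in a single pass.
    for name, module in model.named_modules():
        if name:  # skip the top-level Sequential container itself
            module.register_forward_hook(save_activation(name))

    x = torch.randn(1, 16)
    with torch.no_grad():
        model(x)

    for name, act in activations.items():
        # Fraction of units that stay (near) zero for this input.
        print(name, float((act.abs() < 1e-6).float().mean()))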



