
Most commenters here don't know that Boltzmann machines and associative memories existed in condensed matter physics long before they were used in cognitive science or AI.

The Sherrington–Kirkpatrick model of a spin glass is a Hopfield network with random couplings.

The Boltzmann machine is the Sherrington–Kirkpatrick model with an external field.
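The correspondence is easy to see in code: the SK Hamiltonian and the Hopfield energy are the same quadratic form, E(s) = -1/2 Σ_ij J_ij s_i s_j, and only the couplings differ (random Gaussian vs. Hebbian). A minimal sketch, with an arbitrary 16-spin pattern chosen purely for illustration:

```python
import numpy as np

N = 16
rng = np.random.default_rng(0)

def energy(J, s):
    # Shared quadratic form: E(s) = -1/2 * s^T J s
    return -0.5 * s @ J @ s

# Sherrington-Kirkpatrick: symmetric random Gaussian couplings, zero diagonal
J_sk = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
J_sk = (J_sk + J_sk.T) / 2
np.fill_diagonal(J_sk, 0.0)

# Hopfield: Hebbian couplings built from a stored +/-1 pattern xi
xi = np.where(np.arange(N) % 2 == 0, 1.0, -1.0)
J_hop = np.outer(xi, xi) / N
np.fill_diagonal(J_hop, 0.0)

# Same energy function for both; the stored pattern minimizes the Hopfield one
corrupted = xi.copy()
corrupted[:3] *= -1  # flip three spins
assert energy(J_hop, xi) < energy(J_hop, corrupted)
```

Adding a field term -Σ_i h_i s_i to the same energy gives the Boltzmann-machine form the comment refers to.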

This is a physics prize given for a novel use of stochastic spin-glass modelling. Unexpected, but saying it is not physics is not correct.



I'm a condensed matter/statistical physicist, and while I'm very aware of the connections to statistical physics, I still think that the committee has completely lost it with this choice. There is a sharp line for me between things that are inspired by physics and things that are physics (and I really don't buy that physics is whatever physicists do) -- and this clearly falls on the "inspired by" side.


I know plenty of physicists that would be very pissed off by all this drama.

There is so much more than fundamental physics, and there is much more to physics than breakthrough discoveries. Medical physics, just to name a fun field everybody always forgets about, has been studying and using neural networks for about forty years. Applied physics, biophysics, atmospheric physics. Even particle physics is mostly data science these days.

This idea that physics should only be about fundamental theories and discoveries is really detrimental to the field and leads to the false idea of stagnation that permeates this whole thread.


So if they used a genetic algorithm, they could have got the prize for biology?


There is no Nobel Prize for biology.


I see lots of potential here.


Agree completely, being in this field.

However, it is weird for the committee to give a prize for theoretical physics without an experiment. It is doubly weird when they already made this "mistake" in 2021 with Parisi, who was the odd one out among the geophysicists, and are giving another prize in spin glass/stat phys... why?


In summary, it's definitely related to physics, but a kind of weird choice.

Why didn't David Sherrington and Scott Kirkpatrick share the prize for the Sherrington–Kirkpatrick model? Hopfield references their work, doesn't he?

Multiple theoretical physicists working on black holes (Hawking and others) didn't get a Nobel, because black holes were not confirmed or the theory could not be tested.


The methods may be inspired by physics, but they have made no contribution to understanding physical laws or phenomena.

It's mathematical/CS work. The connection to actual physical laws or phenomena is even more tenuous than the prize for exoplanets a few years ago.

The Nobel prize physics committee has made itself a joke, and probably destroyed the credibility of the prize.


> but they have made no contribution to understanding physical laws or phenomena.

Neural networks are used in tons of data pipelines for physics experiments, most notably with particle accelerators.

The Nobel Prize is also occasionally awarded to engineers who develop tools that are important parts of experiments. 2018 for example was awarded for chirped pulse amplification, which is probably best known for being used in LASIK eye surgery, but it is also used in experimental pipelines.


> Neural networks are used in tons of data pipelines for physics experiments

With this argument you could even say Bill Gates should get an award for inventing Windows and popularizing the desktop computer... Or at least Linus Torvalds, since those pipelines are probably running Linux...


No you couldn't. Windows doesn't have any bearing on outcomes, whereas machine learning methods directly impact the data and probability inference.


Yeah, well, those pipelines are running on HPCs that use Linux. Particle physicists kind of hate Windows.


The techniques highlighted in this prize are not really that useful for deep learning.


You mean besides bringing it into existence at all?


Please explain how the Hopfield network influenced modern deep learning models based on supervised differentiable training. All the "impactful" architectures, such as the MLP, CNN, and attention, come from a completely different paradigm, one more straightforwardly connected to optimization theory.


They did not bring it into existence. The MLP is older than the Hopfield network. The invention that made it practical was backpropagation, which wasn't used here at all.
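For context on how far the Hopfield network is from the backprop paradigm: the weights are set in a single Hebbian pass (no loss function, no gradients), and retrieval is just iterated sign updates. A minimal single-pattern sketch, with the pattern and corruption chosen arbitrarily for illustration:

```python
import numpy as np

N = 16
xi = np.where(np.arange(N) % 2 == 0, 1, -1)  # stored +/-1 pattern (arbitrary)

# "Training" is one Hebbian outer product -- no loss, no backpropagation
W = np.outer(xi, xi) / N
np.fill_diagonal(W, 0)

# Corrupt the pattern, then retrieve it by iterating sign updates
s = xi.copy()
s[:3] *= -1  # flip three spins
for _ in range(5):
    s = np.sign(W @ s).astype(int)

assert np.array_equal(s, xi)  # the stored pattern is recovered
```

Compare this with an MLP, where the weights only become useful after many gradient steps computed via backpropagation.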


> and probably destroyed the credibility of the prize.

From now on I'll always see it as just another Nobel Peace Prize.

This is beyond ridiculous.


Also, double descent was already discovered by physicists in the '80s and '90s.


I'm curious, what's the context for this?



