Hacker News

It's okay, humans make these kinds of mistakes too.

A friend of mine has a young son (~2-3 years old), and a cat named Mono.

Her son knows the cat is named Mono - he plays with her every day.

But when they go out for a walk, any 4 legged animal he sees is also a "Mono".

Fortunately for her son, his developing, extremely plastic brain will soon know how to differentiate Mono the cat from a random dog on the street (unlike neural networks which we will need to entirely redesign, rebuild, and retrain to get similar progress).



It could be that your friend's son has an overly generous sense of what Mono is. It could be that he knows precisely which individual entity is Mono but misunderstands what "Mono" denotes/refers to. It could be somewhere in between.

If a neural network has a representation of entities in the world apart from language referring to those entities, that would be awesome. I'm guessing we're not there yet though.


It could also just be limited vocabulary and that his outside use of "Mono" means something like "a creature which is somewhat similar to Mono".


It could be he knows who Mono is and he knows "Mono" just refers to Mono, but there are a lot of other things in the world that he wants to talk about and he knows the grownups will be able to piece things together. I do that often enough. "Fred or whoever, that guy, did that thing ..." If neural networks were doing that, that would be more awesome still.


Your friend's son is over-extending "Mono" to other similar animals. This is actually a normal part of a child's linguistic development.

My son also does the same thing with "dodo" for dinosaur. At first, he applied it only to big, scary dinosaurs that made loud sounds; later on, he applied it to any animal with a large body and jaws, e.g. a shark. Finally, he learned to differentiate between the majority of animals (save for a few big, scary-looking animals that are still "dodo") and can name sharks, dinosaurs, and bears separately!


So in other words, "dodo" is (slowly) going extinct from his vocabulary?


Yes. I don't know what the exact term for the process is, though, since I'm not knowledgeable about linguistics and childhood development beyond what I've learned from raising my son.

It's very exciting to watch and hear the progress!


I don't actually know the terminology, either; I was just making a really bad pun. ;p


>"unlike neural networks which we will need to entirely redesign, rebuild, and retrain to get similar progress"

I don't think you would need to redesign or rebuild anything for that. You would need to train the network on additional examples, which I suppose you could call retraining (although it need not be from scratch), but that is the same in the case of the child.
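A minimal sketch of that point, using a made-up toy model (plain Python, no ML framework; the data, learning rate, and epoch count are illustrative assumptions): "retraining" on new examples can just mean continuing gradient descent from the weights you already have, rather than reinitializing.

```python
# Toy illustration: training on additional examples need not start from scratch.
# A one-parameter linear model y = w * x is fit by gradient descent on squared
# error, then updated on new examples starting from the already-learned weight.

def train(examples, w=0.0, lr=0.01, epochs=200):
    """Run gradient descent on squared error, starting from the given weight."""
    for _ in range(epochs):
        for x, y in examples:
            grad = 2 * (w * x - y) * x  # d/dw of (w*x - y)^2
            w -= lr * grad
    return w

# Initial training data, generated from y = 3x.
initial = [(1.0, 3.0), (2.0, 6.0)]
w = train(initial)                  # train from scratch (w = 0)

# New examples arrive; continue from the existing weight instead of w = 0.
additional = [(3.0, 9.0), (4.0, 12.0)]
w = train(additional, w=w)          # "retrain" without redesign or rebuild

print(round(w, 2))                  # prints 3.0
```

The second call is the whole point: it reuses the learned weight as its starting point, which is roughly what fine-tuning means for real networks.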


Yeah but it's good to keep in mind what kinds of errors a neural network can make. It will help in training and debugging them. Also it helps to keep a little skepticism now that "deep learning" gets waved at everything like a magic wand.


> unlike neural networks which we will need to entirely redesign, rebuild, and retrain to get similar progress

You could argue that the child's neural network will be redesigned and rebuilt as his brain matures.


They're not the same type of mistake. I don't remember the details, but when I was reading papers on caption generation, the part of producing coherent sentences seemed more like a hack that happens to kinda work (usually) rather than a robust solution.



