I find it frightening that in the U.S. at least, basic research seems to have seriously declined at the government and large company level. Bell Labs is gone, and most companies now are focusing on applied rather than fundamental advances.
And before people start piling on large companies and the government with the usual rant, try to figure out how you could invent the laser, fiber optics, TCP/IP, the lunar landing, orthogonal frequency-division multiplexing, etc., with 2 college dropouts well-versed in Rails and JavaScript.
I think there are cycles where technology gets to a state where basic research has huge value, until you have enough basic research to build a wave of new technologies. Once you have fully exploited the last wave of technology, you are ready to build the next level of new technology. But you rarely change technologies while there is still room to grow the old tech. Example: modems from 300 bps up to 56k used the same basic technology, then DSL, then fiber.
IMO, there is value in understanding the limitations of existing technology before you start doing basic research.
Yes. New technology is invented in response to problems with the older technology, and if you haven't capitalized on the old technology's potential, you'll likely just be reinventing the old solutions instead of coming up with new, better ones.
Interesting. A counter-anecdote would be the advances in physics in the early 20th century, when we were still far from finished innovating with Newtonian mechanics. I've always doubted the whole "necessity is the mother of invention" thing; I don't think relativity was discovered because it became necessary to discover it. On the other hand, when it comes to technological innovations (semiconductors, transistors, lasers, etc.), I just don't know enough scientific history to assess your hypothesis.
I think the limitations of Newtonian physics were only discovered as people built devices that ran up against them. There is a great timeline for that period, but when you consider the instruments required to make the discoveries, you see they are based on specific assumptions about how the world operates. It's only when their design reaches its limits that meaningful discoveries can be made.
"The Dutch physicist Heike Kamerlingh Onnes finds that mercury loses its electrical resistance at temperatures near absolute zero. This low temperature effect is observed in other materials as well."
That takes advanced equipment to reduce things to that temperature, an expectation that resistance is affected by temperature, and a formula that works at higher temperatures, so that you notice the discontinuity.
PS: Consider all the benefits high-speed computing has provided when designing aircraft. You need a lot of wind tunnel / real-world tests to build a model, but with that model and lots of computing power you can design aircraft out to the limits of your simulation. At which point you need to collect more data.