If all one has been exposed to is a development world of web-based consumer-facing front-ends (CFUI), it is hard to imagine that the majority of software lies elsewhere. Hard to imagine that anything else is important. It leads to a viewpoint that modern programming is mainly about a UI interacting with a database.
I'd say you have to group business-facing front-ends with consumer-facing front-ends, as well as a big chunk of mobile development, because all of these are done by the same people from the same programming culture, with the same technologies, patterns, and ways of thinking.
And with that, I'd argue this is the majority of software.
By what measure? Majority of individual 'products'? Majority of programmers? Majority of investment? Majority of revenue?
I think that at best what you're describing is a thin simple majority by some of those metrics. But there are so many developers working on factory automation, networking, gaming, and small embedded systems (anything with a chip in it, from washing machines to cars) - not to mention the huge frameworks that power all of the simple apps you are talking about: Linux, Windows, Chrome, Firefox, Android, iOS, the major SQL DBs, the major NoSQL DBs, the language tools for all of these, the web servers, the distribution frameworks at Google/Facebook/Microsoft/Amazon, and so on.
While the number of core technologies is obviously dwarfed by the number of applications, the number of people working on each of these core technologies is so large that I find your assumption hard to believe.
Also, some of these industries - particularly gaming, and maybe networking as well - are so huge in terms of revenue that I don't think CRUD apps could win on that metric either.
I'd say majority of programmer-hours, because anything else is a metric that's too unreliable / easy to fudge. "Individual products" is an arbitrary marketing distinction (e.g. a company I worked for could create and sell new products by tweaking a few lines in a config file; every customer would get their own mix and match of software pieces). And neither investment nor revenue is directly correlated with the amount of software you have.
Note also that out of the things you mentioned - "factory automation, networking, gaming, small embedded systems" - perhaps only small embedded systems projects aren't at least partly done in web technologies. Gaming definitely is (in AAA games, all the UI that's not part of the core gameplay - e.g. launchers, lobbies, marketplaces - is likely to be done in Electron or some embeddable webview library). Factory automation, if the two companies I worked with over the last decade are any indication, is slowly moving toward webapps for everything. And I imagine the same goes for networking on the admin side.
It's basically become too cheap to make a shitty, bloated, but somewhat pretty UI with JavaScript. Everyone's doing it, and everyone keeps hiring for it.
Using web technologies is not the same as doing CRUD web apps. Games, for example, are a very different kind of programming. And even if network admin and factory automation apps have web front-ends, those are still a minuscule part of the code.
Games are a very different kind of programming, yes. But the code surrounding games - lobbies, game server lists, marketplaces, IAP - is pure CRUD, and it's entirely unrelated to the game code.
I think there's a strong overtone of that in many places, including in business and popular culture. Most people's interactions with software are phone apps and web fluff.