Mickey had been selling foods under that trademark in Paraguay since 1835. When did Disney start competing in that market segment with their Mickey Mouse brand?
Ack, yes, I meant 1935. I'm on mobile and typoed that in a way the spell checker couldn't catch...
Even now in the US I have yet to see Disney selling rice under the Mickey Mouse brand in a way that could be at all confused with the Mickey rice in Paraguay that I grew up with.
> the foundation uses 5-digit dates to address the Year 10,000 problem
That just sets you up for the year 100,000 problem. The objectively better idea is to just parse years as regular integers, something that I'm confident people will figure out in the remaining 8 millennia until the problem actually hits.
But the truth is more subtle: love it or hate it, the five digits are part of a campaign to encourage people to think about humanity on timescales a tad longer than a few decades.
It just sounds like a way for people to try to show how smart they are, while in reality it just adds confusion to completely irrelevant discussions, such as this one.
I think the campaign does the opposite of what it says. Using 5 digits limits your dates to a maximum of year 99999. Not caring, and just using however many digits a year needs, works for all years. No one is clamping their years to 4 digits when they write '2024'; they are just using the digits that year requires.
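To make the point concrete, here's a minimal sketch (my own illustration, not anyone's production code): parsing a year as a plain integer imposes no digit limit at all, whereas a fixed 5-digit format caps out at 99999.

```python
# A plain integer parse handles any number of digits; no fixed-width
# clamp is ever needed. int() accepts leading zeros in strings.
def parse_year(s: str) -> int:
    return int(s)

for s in ["2024", "02024", "99999", "100000"]:
    print(parse_year(s))
```

Note that `int("02024")` parses fine in Python even though a `02024` literal would not.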
> a way for people to try showing how smart they are
Given the choice between trying to demonstrate intelligence and trying to demonstrate viciousness, I prefer the former. I guess your preference is different.
Why start counting 02024 years ago, then? That's a pretty arbitrary, small-scale starting point. Humans have existed far longer than that; they could just as well say we are in the year 300000.
How does promotion by a single non-profit talking shop, however celebrated or intellectual its founders, make it a "valid notation"?
Five digits don't even make sense in practical anthropic terms, since we're obviously headed for a shitload of trouble long before the hundreds digit next wraps.
To add some context as to why businesses wouldn't survive it: that war lasted from 00001864 to 00001870 and wiped out more than 00000050% of the male population (estimates vary).
> Step one, double the limit to alleviate immediate customer pain.
I've been oncall for systems where that would not work.
Doubling the memory means you need twice as many machines. Depending on the service, that could require significantly increased network bandwidth. Now the network is saturated and every node needs to queue more data. Now latency and throughput are even worse, and even more requests are being dropped, so you automatically double the limit again...
While all of that may be true (though those are signs of a poorly architected system), my code would still work. It would double the limit and then page someone. If they logged in and saw all those failures, they could address the underlying issues.
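A minimal sketch of what I mean (the function and names here are my own illustration, not the actual code): double the limit once to relieve immediate pain, then page a human rather than looping on automatic doubling.

```python
# Hypothetical sketch: on hitting a limit, double it once and
# page the on-call instead of silently re-doubling forever.
def handle_limit_exceeded(limit: int, page) -> int:
    new_limit = limit * 2                       # step one: relieve immediate pain
    page("limit doubled; please investigate")   # step two: a human looks at why
    return new_limit

pages = []
print(handle_limit_exceeded(100, pages.append))  # 200, and one page sent
```

The point is that the automation buys time; it doesn't replace the person who diagnoses the saturation.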
The whole point is that having an around-the-world, follow-the-sun team would not alleviate those issues or make anything better.
> I've seen plenty of really scalable systems being built by a small core team of people who know what they are doing.
There is a huge difference between building a system that could theoretically be scaled up and actually scaling it up efficiently.
At small scales, it's really easy to build on the work of others and take things for granted without even knowing where the scaling limits are. For example, if I suddenly find I need to double my data storage capacity, I can drive to a store and come back with a trunk full of hard drives the same day. I can only do that because someone already built the hard drives, and someone stocked the nearby stores with them. If a hyperscaler needs to double their capacity, they need to plan it well in advance, allocating a substantial fraction of global hard drive manufacturing capacity. They can't just assume someone will have already built the hardware, much less have it in stock near where it's needed.
It's kind of like how an implicit side effect in "normal" FP languages is that memory is being allocated.
The nix garbage collector will even free up disk space by deleting unreferenced packages just like a garbage collector freeing memory by deleting unreferenced values.
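The analogy can be sketched as a toy mark-and-sweep: anything unreachable from the roots gets deleted, just as unreferenced store paths do. This is purely illustrative (the names and structure here are mine, not Nix's actual implementation).

```python
# Toy mark-and-sweep over "packages": walk the dependency graph from
# the roots, then anything not reached is garbage, analogous to Nix
# deleting store paths no GC root references.
def collect(roots, deps):
    live = set()
    stack = list(roots)
    while stack:
        pkg = stack.pop()
        if pkg not in live:
            live.add(pkg)
            stack.extend(deps.get(pkg, []))
    return {p for p in deps if p not in live}  # the garbage to delete

deps = {"app": ["libA"], "libA": ["libB"], "libB": [], "old-lib": []}
print(collect({"app"}, deps))  # only old-lib is unreferenced
```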
I never thought of it as backwards. Defining functions before calling them makes as much sense as defining terms before using them or assigning variables before reading them.
Is that having GCed all but the current generation of packages? Or filtering for a single nixpkgs version some other way?
Even then, I would expect NixOS to do some unsharing of shared libraries, so it would probably be best to use the full path to determine how much each shared object is actually shared.
For example, I have 87 different copies of `libzstd.so.1` in my `/nix/store` right now.
Static linking does not preclude separate compilation though. Go and Rust both use separate compilation with static linking by default (at least last I used them... it's been a while). I know Rust at least used to support dynamic linking too -- it just wasn't the default. And C also supports static linking of course.
> In our analysis, "middle-income" Americans are adults whose annual household income is two-thirds to double the national median, after incomes have been adjusted for household size.
And they seem to use "middle-income" and "middle-class" interchangeably.
Note that this definition is quite different from what may have been used in your history/economics classes in school/college.