For a long time, from the late 90's until roughly 2012, IE was the most popular browser. You had no choice but to work with it. If it didn't "work on IE", it didn't work.
Yea, and people exaggerated about it back then just like people exaggerate today about how hard CSS is. The technology progresses, but people's refusal to learn and desire to whine on the internet have stayed the same.
This is true, but the dominance of IE and its quirks, especially during the early 2000's, should not be underestimated. The browser situation, especially on Linux, was absolutely abysmal then.
I don't think flexbox really started being used until 2013 at the earliest, the comment I replied to was complaining about 2022 and a flexbox bug in IE. This 2012 thing doesn't seem to relate at all to the subject.
on edit: I know it was in a W3C Working Draft in 2009, but I'm pretty sure it was around 2013 that people started playing with it. I think it became popular in 2014-2015.
It would probably just switch perspective with each chapter. If the AI has any real grasp of literary devices something metafictional like City of Glass could be interesting but an earlier/less complex example of metafiction like Marshlands might work better? If my memory serves I think the metafictional aspects of Marshlands could translate into this even if the AI completely misses the metafiction. Might play with this when I have more time.
LLMs have the same problem a la "ignore previous requests".
The fundamental problem is that you always either need two signalling paths or you have to specially encode all user content so that it can never conflict with the signalling.
Those are both a pain in the ass, so people always try to figure out how to make in band signalling work.
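To make the two options concrete, here's a minimal sketch (names and framing scheme are my own, just for illustration) contrasting an out-of-band approach, where a length prefix means payload bytes are never interpreted as control data, with an in-band approach, where a delimiter must be escaped everywhere and every consumer must unescape it correctly:

```python
def frame_length_prefixed(payload: bytes) -> bytes:
    """Out-of-band signalling: a 4-byte length prefix carries the
    control information, so the payload can contain anything."""
    return len(payload).to_bytes(4, "big") + payload

DELIM = b"\n"

def frame_escaped(payload: bytes) -> bytes:
    """In-band signalling: the delimiter must be escaped inside the
    payload. Miss one escape anywhere in the pipeline and you have
    an injection bug."""
    return payload.replace(b"\\", b"\\\\").replace(DELIM, b"\\n") + DELIM

def unescape(framed: bytes) -> bytes:
    """Reverse frame_escaped: strip the trailing delimiter and
    undo the escape sequences."""
    body = framed[:-1]
    out = bytearray()
    i = 0
    while i < len(body):
        if body[i:i+1] == b"\\":
            nxt = body[i+1:i+2]
            out += DELIM if nxt == b"n" else nxt
            i += 2
        else:
            out += body[i:i+1]
            i += 1
    return bytes(out)
```

The length-prefixed version is trivially correct but forces every reader to understand the framing; the escaped version keeps the stream human-readable but is exactly the kind of in-band scheme that keeps producing injection bugs.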
> Allowing ANY headers from the user except a whitelisted subset also seems like an accident waiting to happen.
I'm going to disagree on this. Browsers and ISPs have a long history of adding random headers, a website can't possibly function while throwing an error for any unknown header. That's just the way HTTP works.
This is clearly a case of the Next devs being silly. At a minimum they should have gone with something like `-vercel-` as the prefix instead of the standard `x-` so that firewalls could easily filter out the requests with a wildcard.
Even if they had to make things go through headers (a bad idea in and of itself, in-band signalling always causes issues), the smart move would have been to make it a non-string, such that clients would not be able to pass in a valid value.
1) Plain HTTP, go wild with headers. No system should have any authenticated services on this.
2) HTTP with integrity provided by a transport layer (so HTTPS, but also HTTP over Wireguard etc for example). All headers are untrusted input, accept only a whitelisted subset.
With this framing, I don't think it's unreasonable for a given service to make the determination of which behaviour to allow.
I guess browser headers are still a problem. But you can get most of the way by dropping them at the request boundary before forwarding the request.
My favourite example of this is that for Babylon 5 the lawyers from Warner Bros insisted that the showrunners not retain any copies of the digital models because they were WB property and WB would archive them.
When they started working on B5 The Lost Tales, WB had lost all of the digital files.
This is also a common side effect of corporate policies that favour short term profits rather than retaining staff for the long term.
I see this scenario play out constantly:
1. A team or staff member has a project's data located on their user account or hardware.
2. The team or staff member is made redundant when there is a dip in earnings.
3. IT makes a backup of the user account(s) and wipes the hardware, and because the hardware is older, usually moves it on.
4. Months or years later that data is needed and at this point no one at the company actually knows where the data is - and even if they find something, they don't know if that's the latest or the full version.
Now this wouldn't be a problem if there was a decent overlap in staff retention, but that simply isn't the case these days.
Writing better storage policies doesn't help - it's the understaffed nature of their businesses which means that there is no time for staff to keep up with basic data housekeeping.
The types of data that clients should have on hand, but nevertheless have asked me to supply, are frankly embarrassing.
When I get frustrated enough to consider putting a kill switch in my work, I cool off by reminding myself that would prompt them to make a huge effort recovering from backups and have someone go through my code to get it working again. If I just do what they ask me instead, it will slowly decay without anyone having any idea what it even does, until it's too late to fix or recover anything.
What company is allowing employees to have so much data locally? Almost all work is stored in a cloud now. Documents, spreadsheets, design docs, code… If you really are constantly seeing this then that says a lot about the corporation using severely outdated practices.
Exactly. In fact, I still regularly get sharepoint "request for access" notifications in my email for some presentation I did a year ago. Even though I swear I've opened it up to the entire org.
Who knows what happens when I've shuffled away from my current company.
Dead links are also incredibly common, particularly because we are on our nth port from sharepoint to confluence to whatever back to sharepoint. Generally, because C levels don't want to pay for this year's price hike.
> If you really are constantly seeing this then that says a lot about the corporation using severely outdated practices.
They probably just used now-outdated practices before those practices were outdated. This happened in the past, remember. Sure, the cloud is a thing today, but was the cloud such a thing 5, 10, 20 years ago? Do you really think it's their fault for not knowing in advance how much of a thing the cloud would one day become? Oh, how outdated. Sheesh.
I would think policies should also be updated every X years in light of new regulations, new possibilities, new limitations... but who enjoys writing policies, let alone updating them? So here we are, everything done "by the book" and losing data because of that.
Old animation, though, I wouldn't necessarily expect to have been made only 10 years ago, and 10 years ago I wouldn't expect to be scolded for not using the cloud.
You know tech savvy was not really a thing back then for the everyday uneducated person, right? You kind of had to have been a geek to have known this stuff. There are a number of dead-simple cloud solutions today, but you cannot just scold, say, WB for not using a company cloud back in *checks Wikipedia* 1993!
Ok but preserving media seems like a thing Warner Brothers should be really good at. Why did Warner Brothers have an everyday uneducated person in charge of archival?
Why would an archivist back then happen to know computers that intimately? I'd be surprised if the average archivist knew much more than how to do data entry... and I wouldn't even hold it against them if they happened not to know how to do data entry.
You don't need to be an expert in the technology to ask a few relevant questions like "where is the information stored?" and "what temperature and humidity does it prefer?" Of course WB is famously bad at storing film too so maybe I shouldn't be surprised.
Well, the typical setup with OneDrive in MS365 is that the overworked manager gets an email when an employee account is deactivated. The manager has 90 days to search through their OneDrive and copy anything out that they think is needed, possibly to the central SharePoint or to their own OneDrive. I'm sure there are similar policies and procedures in place for enterprise Dropbox, box.net, and Google Drive. So typically the employee leaves and the manager never gets around to copying data out; 6-9 months later they need something that employee had, and yell at IT to recover it. IT laughs and laughs and then cries.
Expanse battles were far more realistic, of course. The B5 Star Fury combat scenes were pretty groundbreaking for the time, though.
Expanse ships (at least where I'm at in the book series and show - kind of early) are a lot closer to current human tech of course. In B5 humanity had tech that was mostly believable (still used rotating mass for gravity), but you got to see some very unique looking ships from the other species like the Minbari and Shadows that truly looked alien and unsettling.