Analysis of Chromium issue 1196683, 1195777 (2021) (iamelli0t.github.io)
74 points by luu on Sept 28, 2022 | hide | past | favorite | 33 comments


As a casual observer of Firefox development (well, maybe not casual, I skim through the pushlogs occasionally), I find it interesting to compare this to some Firefox security bug guidelines.[0][1]

Although it looks like in these cases the issues were reverse-engineered in <1 day (making release timing less of a solution), two items in those guidelines stand out to me as potentially helpful: "land tests in-tree later" and "bundle with other changes in the same area."

[0] https://firefox-source-docs.mozilla.org/bug-mgmt/processes/s...

[1] https://firefox-source-docs.mozilla.org/bug-mgmt/processes/f...


These kinds of security-by-obscurity guidelines are well known and oft touted, but only debatably useful when dealing with a codebase as closely monitored and security-sensitive as V8. Many professionals instead recommend full disclosure as early as possible, including publishing CVEs and vulnerability warnings, so that affected users can take appropriate steps as soon as a patch is available, or deploy mitigations (for example, in very severe cases, disabling the V8 JIT, which is a well-known RCE mitigation).

Also, in this case, both bugs would need to be combined with a zero-day Chrome sandbox escape vulnerability to achieve exploitability.

If anything, I'm more worried that the fixes weren't backported more urgently and that CVEs do not seem to have been assigned. But this seems to have been a success for defense in depth, at least.


Former Firefox developer here.

The reason for omitting tests is that they are essentially the exploits themselves; omitting them raises the barrier to entry, since somebody has to be skilled enough to craft an exploit from the source change alone.

While security fixes are typically pushed out on all supported branches within a very short window of time, anything that buys time is useful imho.


Sibling comments address this pretty well, but the ideas I mentioned still seem like good ideas for a few reasons:

- The article mentions the sample exploit is based on the test case

- The point of obfuscation isn't to prevent exploitation, it's to delay it

- A sibling mentions the delay between patch and release as an issue; given the only ~1 day delay here, this seems to have been intentionally merged right before release, so buying even a day (or two or three, to account for time to update) would be enough to mitigate most of the impact

There's a big difference between reading a snippet of JS and understanding the inner workings of a JIT, especially under time pressure.


Generally, security fixes like this are back-merged to the stable branch almost immediately, i.e. within hours. The issue is that the stable branch isn't integrated into Chromium, built, and released into stable until the next spin, which could be a couple of days or even weeks.


the amount of effort put into optimizing javascript engines blows my friggin mind every time i see something like this.


The principal sin, IMHO, is that JavaScript technically has only one number type: double. But doubles are pretty slow, particularly when used as array indexes, so JS engines do a ton of numerical analysis (range analysis) to determine when it is safe to do integer math instead. That reasoning is extremely tricky, and I've made gazongas amounts of bugs myself. (I'm the original author of the SimplifiedLoweringPass, which has evolved significantly since I last worked on that part of the code, circa 2015.) Unfortunately, static analysis of any kind doesn't really help, and the implementation language (C++) isn't really at fault, because the bugs manifest at the compiler IR level, i.e. in how the compiler reasons about JS values in its representation of JS code.
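To make the "one number type" point concrete, here's a small sketch (plain JS, nothing V8-specific):

```javascript
// JavaScript exposes a single Number type: IEEE-754 double.
// There is no observable integer type at the language level.
console.assert(1 === 1.0);                 // same value
console.assert(typeof 1 === typeof 1.5);   // both "number"

// Integers are only exact up to 2^53 - 1; beyond that, doubles lose precision.
console.assert(Number.MAX_SAFE_INTEGER === 2 ** 53 - 1);
console.assert(2 ** 53 === 2 ** 53 + 1);   // 2^53 + 1 rounds back to 2^53

// Engines like V8 internally switch to small-integer representations when
// range analysis proves it's safe -- that proof is where the bugs live.
```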


JavaScript has two number types: `number` and `bigint`.

https://tc39.es/ecma262/multipage/numbers-and-dates.html#sec...


BigInt isn't super relevant here, though; it's new and a distinct type, so it doesn't come into play in typical computations. You can't use it as the size of an array, etc.
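For what it's worth, the separation between the two types is strict (a quick sketch):

```javascript
console.assert(typeof 1 === "number");
console.assert(typeof 1n === "bigint");

// The two types deliberately don't mix in arithmetic:
let threw = false;
try { 1n + 1; } catch (e) { threw = e instanceof TypeError; }
console.assert(threw);

// And a BigInt can't be used where a Number is required, e.g. array length:
threw = false;
try { const a = []; a.length = 1n; } catch (e) { threw = e instanceof TypeError; }
console.assert(threw);
```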


Technically it’s number. But bigint can be much larger than a 64-bit integer, so from the CPU’s perspective it’s an array rather than a single integer.


This is also why wasm in the browser is not especially "safe": it goes through the same JIT optimization and execution pipeline as JavaScript, so vulns there are sometimes usable from wasm. Speed is a tradeoff. It's also why any system that embeds V8's (or another engine's) wasm runtime must be able to update.


Looks well written/researched, but this page has broken scrolling that makes it very frustrating to read.


Please don't complain about tangential annoyances—things like article or website formats, name collisions, or back-button breakage. They're too common to be interesting.

https://news.ycombinator.com/newsguidelines.html


It frustrates me too! No scrollbar either. I noticed once I clicked on the main content area it scrolled normally (by which I mean with the keyboard), but still... unnecessary frustration.

Is there some reason this sort of user-hostile design is becoming more and more common?

Is there something that I can do, client side perhaps, to fix it? I don't know anything about how modern web works, but perhaps there's an element I can remove with ublock origin to restore sane behavior?

I'm aware of the HN guidelines re: complaining about tangential annoyances, so apologies in advance if this (my) comment is "part of the problem", but I suspect the readership here would be the sort of people who might have an answer.

Edit: Making matters even worse, I can't even save the page as a PDF for reading later! It only captures what's visible at the time. :( Somebody oughta do something about this...


I noticed the wonky scroll behavior too, but calling it broken is debatable.

On most web pages, you can scroll the entire document by putting the cursor anywhere on the page.

On this web site, you can only scroll if your cursor is within the <main> element, which has a rather narrow width. So most of the page area is not scrollable. Also, the CSS hides the scrollbar so that no affordances are given.


The author was probably trying to make the header and sidebar sticky by doing that; there are better ways, though (such as, literally, the position: sticky CSS property).
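For reference, the usual way to pin a header without hijacking scrolling is roughly (a minimal sketch; the actual selectors on that site will differ):

```css
/* The document scrolls normally; only the header stays pinned. */
header {
  position: sticky;
  top: 0;
}
```

This leaves the scrollbar, keyboard scrolling, and print-to-PDF behavior intact, since nothing intercepts the scroll events.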


I found a fix!

https://github.com/t-mart/kill-sticky

Just add that little bit of code as a bookmarklet; it works wonderfully.

Click the button and your scrollbar returns, the keyboard works again, _AND_ the page renders to a PDF beautifully.

You're welcome. :)


I noticed the same thing, though then I realized that I have to have my cursor over the main content area. Not a great thing, still.


yet another JIT related vuln?

c'mon, will we ever stop using JIT in browsers?

Just take a look at this article from Microsoft Browser Vulnerability Research, where they challenge this and run benchmarks with the JIT disabled:

https://microsoftedge.github.io/edgevr/posts/Super-Duper-Sec...

>Looking at CVE (Common Vulnerabilities and Exposures) data after 2019 shows that roughly 45% of CVEs issued for V8 were related to the JIT engine.


It's easy to turn off the JIT if you want - you can even do it on the iPhone now. It makes the web dogshit slow and is a fundamental nonstarter of an idea.
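If you want to try it yourself: V8 exposes a `--jitless` flag that runs everything through the interpreter (assuming a reasonably recent Chromium or Node build; flag names can change between versions):

```shell
# Run Chromium with the V8 JIT disabled (interpreter-only):
#   chromium --js-flags="--jitless"

# The same V8 flag works in Node, which embeds V8:
node --jitless -e 'console.log("still works, just slower")'
```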

A slightly more feasible way forward here is replacing JavaScript with dear god anything else but JavaScript but that's a long way off. WebAssembly seems to have stalled.


Are you on a very old iPhone model? I haven't found disabling the JIT to be noticeably slower on my iPhone 13 running iOS 16 in "Lockdown Mode". The Microsoft Edge team has benchmarks suggesting the impact is minimal on Chromium-based browsers too (https://microsoftedge.github.io/edgevr/posts/Super-Duper-Sec...).


It’s not really noticeable on an iPhone 13 Pro. Just an FYI if you’re on latest hardware.


> WebAssembly seems to have stalled.

Can you elaborate?


We’ll stop using JITs when it becomes acceptable to make computers 2x slower, or more.


Or when it becomes acceptable to make them 100x faster by not using JS, but that's even less likely to happen :)


No JIT also rules out most modern uses of Java and C#, along with things like LuaJIT.


The JavaScript JIT is much more complex because it has to guess whether a number is an int or floating point, whether a prototype behaves like a class, etc.


LuaJIT has a very good interpreter too. You could still use it in a lot of places.


> c'mon, will we ever stop using JIT in browsers?


Not using JS means you need to replace it with something, which means you gotta put it into the browser.


Who says we need JS?


Nobody? That's why the word replace is there?


For the stuff JavaScript code does directly? Especially now that we have WASM sitting around for heavy math? Hell yeah, make it 2x slower. We've advanced the speed so much that we have plenty to spare.

...but if that only solves half the vulnerabilities I don't think it's worth it. Still worth keeping in mind in case that ratio shifts significantly.



