"Google Chromium CSS contains a use-after-free vulnerability that could allow a remote attacker to potentially exploit heap corruption via a crafted HTML page. This vulnerability could affect multiple web browsers that utilize Chromium, including, but not limited to, Google Chrome, Microsoft Edge, and Opera."
That's pretty bad! I wonder what kind of bounty went to the researcher.
duozerk 5 hours ago [-]
> That's pretty bad! I wonder what kind of bounty went to the researcher.
I'd be surprised if it's above 20K$.
Bug bounty rewards are usually criminally low; doubly so when you consider the effort usually involved in not only finding serious vulns, but also demonstrating a reliable way to exploit them.
Maybe google is an exception (but then again, maybe that payout was part marketing to draw more researchers).
throwaway150 3 hours ago [-]
So is there anything that would actually satisfy the crowd here?
Offer $25K and it is "How dare a trillion dollar company pay so little?"
Offer $250K and it is "Hmm. Exception! Must be marketing!"
What precisely is an acceptable number?
cwillu 9 minutes ago [-]
One is a lament that the industry average is so low, and the other is… a lament that the industry average is so low. What's the problem?
idiotsecant 1 hours ago [-]
A number better than what the exploit could be sold for on the black market
i_am_jl 56 minutes ago [-]
I don't believe those numbers will ever come close to converging, let alone bounty prices surpassing black market prices.
It seems like these vulnerabilities will always be more valuable to people who can guarantee that their use will generate a return than to people who will use them to prevent a theoretical loss.
Beyond that, selling zero-days is a seller's market where sellers can set prices and court many buyers, but bug bounties are a buyer's market where there is only one buyer and pricing is opaque and dictated by the buyer.
So why would anyone ever take a bounty instead of selling on the black market? Risk! You might get arrested or scammed selling an exploit on the black market; black-market buyers know that, so they price it into their offers.
gbalduzzi 8 minutes ago [-]
> So why would anyone ever take a bounty instead of selling on the black market? Risk!
I like to believe there are also ethics involved in most cases
seanw444 34 minutes ago [-]
Not sure why you're getting downvoted. It's the unfortunate reality.
hsbauauvhabzb 2 hours ago [-]
An increase in the average bug payout. Bounty programs pay low on average.
salviati 5 hours ago [-]
I think a big part of "criminally low" is that you'll make much more money selling it on the black market than getting the bounty.
duozerk 5 hours ago [-]
I read this often, and I guess it could be true, but those kinds of transactions would presumably go through DNM / forums like BF and the like. Which means crypto, and full anonymity. So either the buyer trusts the seller to deliver, or the seller trusts the buyer to pay. And once you reveal the particulars of a flaw, nothing prevents the buyer from running away (this also occurs regularly on legal, genuine bug bounty programs - they'll patch the problem discreetly after reading the report but never follow up, never mind pay; with little recourse for the researcher).
Even revealing enough details, but not everything, about the flaw to convince a potential buyer would be detrimental to the seller, as the level of details required to convince would likely massively simplify the work of the buyer should they decide to try and find the flaw themselves instead of buying. And I imagine much of those potential buyers would be state actors or organized criminal groups, both of which do have researchers in house.
The way this trust issue is (mostly) solved on drug DNMs is through the platform itself acting as an escrow agent; but I suspect such a thing would not work as well for selling vulnerabilities, because the volume is much lower, for one thing (too low for reputation building), and the financial amounts are generally higher, for another.
The real money to be made as a criminal alternative, I think, would be to exploit the flaw yourself on real-life targets, for example to drop ransomware payloads. These days ransomware groups even offer franchises - they'll take, say, a 15% cut of the ransom, provide assistance with laundering/exploiting the target/etc, and claim your infection in the name of their group.
chc4 3 hours ago [-]
I don't think you know anything about how these industries work and should probably read some of the published books about them, like "This Is How They Tell Me The World Ends", instead of speculating in a way that will mislead people. Most purchasers of browser exploits are nation-state groups ("gray market") who are heavily incentivized not to screw the seller and would just wire some money directly, not black market sales.
moring 4 hours ago [-]
> Even revealing enough details, but not everything, about the flaw to convince a potential buyer would be detrimental to the seller, as the level of details required to convince would likely massively simplify the work of the buyer should they decide to try and find the flaw themselves instead of buying.
Is conning a seller really worth it for a potential buyer? Details will help an expert find the flaw, but it still takes lots of work, and there is the risk of not finding it (and the seller will be careful next time).
> And I imagine much of those potential buyers would be state actors or organized criminal groups, both of which do have researchers in house.
They also have the money to just buy an exploit.
> The real money to be made as a criminal alternative, I think, would be to exploit the flaw yourself on real life targets. For example to drop ransomware payloads; these days ransomware groups even offer franchises - they'll take, say, 15% of the ransom cut and provide assistance with laundering/exploiting the target/etc; and claim your infection in the name of their group.
I'd imagine the skills needed to get paid from ransomware victims without getting caught to be very different from the skills needed to find a vulnerability.
consumer451 5 hours ago [-]
I am far from the halls of corporate decision making, but I really don't understand why bug bounties at trillion dollar companies are so low.
arcfour 5 hours ago [-]
Because it's nicer to get $10k legally + public credit than it is to get $100k while risking arrest + prison time, getting scammed, or selling your exploit to someone who uses it to ransom a children's hospital?
consumer451 3 hours ago [-]
Thanks, great answer. I was just thinking from a simple market value POV.
kspacewalk2 4 hours ago [-]
Is it in fact illegal to sell a zero day exploit of an open source application or library to whoever I want?
IggleSniggle 4 hours ago [-]
Depends. Within the US, there are data export laws that could make the "whoever" part illegal. There are also conspiracy to commit a crime laws that could imply liability. There are also laws that could make performing/demonstrating certain exploits illegal, even if divulging it isn't. That could result in some legal gray area. IANAL but have worked in this domain. Obviously different jurisdictions may handle such issues differently from one another.
sailfast 3 hours ago [-]
What about $500K selling it to governments?
bell-cot 1 hours ago [-]
Issue 1: Governments which your own gov't likes, or ones which it doesn't? The latter has downsides similar to a black market sale.
Issue 2: Selling to governments generally means selling to a Creepy-Spooky Agency. Sadly, creeps & spooks can "get ideas" about their $500k also buying them rights to your future work.
wepple 4 hours ago [-]
> but demonstrating a reliable way to exploit them
Is this a requirement for most bug bounty programs? Particularly the “reliable” bit?
bicepjai 5 hours ago [-]
So basically Firefox is not affected?
hdgvhicv 5 hours ago [-]
The listed browsers are basically skins on top of the same chromium base.
It's why Firefox and Safari are so important, despite HN's wish that they'd go away.
autoexec 4 hours ago [-]
HN doesn't want firefox to go away. HN wants firefox to be better, more privacy/security focused, and to stop trying to copy chrome out of the misguided hope that being a poor imitation will somehow make it more popular.
Sadly, Mozilla is now an adtech company (https://www.adexchanger.com/privacy/mozilla-acquires-anonym-...) and by default Firefox now collects your data to sell to advertisers. We can expect less and less privacy for Firefox users, as Mozilla is now fully committed to trying to profit from the sale of Firefox users' personal data to advertisers.
ddtaylor 3 hours ago [-]
As a 25-year Firefox user, I think this is spot on. I held out for 5 years hoping they would figure something out, but all they did was release weird stuff like VPNs and half-baked services with a layer of "privacy" nail polish.
Brave is an example of a company doing some of the same things, but actually succeeding it appears. They have some kind of VPN thing, but also have Tor tabs for some other use cases.
They have some kind of integration with crypto wallets I have used a few times, but I'm sure Firefox has a reason they can't do that or would mess it up.
You can only watch Mozilla make so many mistakes while you suffer a worse Internet experience. The sad part is that we are paying the price now. All of the companies that can benefit from the Chrome lock in are doing so. The web extensions are neutered - and more is coming - and the reasons are exactly what you would expect: more ads and weird user hostile features like "you must keep this window in the foreground" that attempt to extract a "premium" experience from basic usage.
Mozilla failed and now the best we have is Brave. Soon the fingerprinting will be good enough that using Firefox will be akin to running the Tor Browser, with a CAPTCHA verification for every page load.
linkregister 29 minutes ago [-]
What would be an acceptable revenue model? Google Chrome has the same privacy profile with the exception that Google retains the data for their own ad platforms.
Selling preferential search access is legally precarious due to FTC's lawsuit against Mozilla.
jacquesm 2 hours ago [-]
HN wants Firefox but with better stewardship and fewer misdirected funds.
Mozilla - wrongly - believes that the majority of FF users care more about Mozilla's hobby projects than about their browser.
That's why - as far as I know - to this day it is impossible to directly fund Firefox. They'd rather take money from Google than focus on the one thing that matters.
LunaSea 54 minutes ago [-]
I don't think that Mozilla believes that their pet projects are what the user community wants. I think they just don't care. Google's check will clear next year anyways.
wvbdmp 4 hours ago [-]
Particularly weird impulse for technically inclined people…
Although I must admit to the guilty pleasure of gleefully using Chromium-only features in internal apps where users are guaranteed to run Edge.
zozbot234 5 hours ago [-]
Firefox is safe from this because their CSS handling was the first thing they rewrote in Rust.
bawolff 4 hours ago [-]
I mean, even if it was written in C or C++, it's unlikely two separate code bases would have the exact same use-after-free vuln.
jacquesm 2 hours ago [-]
It's unlikely, but it does actually happen. I've seen more than one complete rewrite of something important that had exactly the same bug. And I'm very sure that those sources were not related somehow.
jsheard 5 hours ago [-]
Firefox and Safari are fine in this case, yeah.
DetroitThrow 5 hours ago [-]
It's pretty hard to have an accidental use-after-free in the Firefox CSS engine because it is mostly safe Rust. It's possible, but very unlikely.
topspin 5 hours ago [-]
That came to my mind as well. CSS was one of the earliest major applications of Rust in Firefox. I believe that work was when the "Fearless Concurrency" slogan was popularized.
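For anyone who hasn't written Rust, here is a minimal, contrived sketch of what "mostly safe Rust" buys you (this is nothing like Stylo's actual code, just the borrow rules at work). The pattern that turns into a use-after-free in C++ - keeping a pointer into a container and then freeing or reallocating the container - simply doesn't compile:

    // Contrived example, not Stylo internals: a "rule store" plus a
    // borrowed reference into it.
    struct Rule {
        selector: String,
    }

    fn main() {
        let mut rules = vec![Rule { selector: ".foo".to_owned() }];

        let first = &rules[0]; // shared borrow into the vec

        // In C++, clearing/reallocating the container here while `first`
        // is still live is exactly how a use-after-free starts. In safe
        // Rust the equivalent line refuses to compile:
        //
        //     rules.clear();
        //     // error[E0502]: cannot borrow `rules` as mutable because it
        //     // is also borrowed as immutable
        //
        println!("still valid: {}", first.selector);

        rules.clear(); // fine once the borrow is no longer used
    }

The commented-out `rules.clear()` is the line that would compile happily in C++ and only blow up at runtime, where a sanitizer or fuzzer has to happen to catch it.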
moritzwarhier 4 hours ago [-]
Firefox and Safari developers dared the Chromium team to implement :has() and Houdini and this is the result!
/s
deanc 4 hours ago [-]
Presumably this affects all Electron apps, which embed Chromium too? Don't they pin the Chromium version?
comex 4 hours ago [-]
Yes, but it's only a vulnerability if the app allows rendering untrusted HTML or visiting untrusted websites, which most Electron apps don't.
waynesonfire 5 hours ago [-]
"Actually, you forgot Brave."
mpeg 5 hours ago [-]
I quoted directly from NIST; there are many other browsers and non-browsers that use Chromium.
sumtechguy 4 hours ago [-]
Steam and VSCode pop into my mind.
waynesonfire 5 hours ago [-]
It was intended as a joke reference to the 2004 Kerry / Bush debate. It's not a coincidence that Google would leave off an ad-blocking variant of Chrome.
order-matters 5 hours ago [-]
they listed the top 3 most popular chromium browsers, covering 90%+ of chromium users
ipaddr 4 hours ago [-]
But not 90% of users here.
pear01 4 hours ago [-]
Did you also take Poland being omitted to be some sort of conspiracy? It seems you missed the point of why that "Actually, you forgot..." moment became such a punchline. Like it or not, Brave is a very niche browser with rather insignificant market share; why you would expect it to be mentioned in the first place is entirely lost on me. There are dozens of other Chromium forks with under 1% market share - should we be forced to mention them all?
pjmlp 5 hours ago [-]
Yeah, but let's keep downplaying use-after-free as something not worth eliminating in 21st-century systems languages.
pheggs 5 hours ago [-]
I love Rust, but honestly I am more scared of supply chain attacks through cargo than of memory corruption bugs, the reason being that supply chain attacks are probably way cheaper to pull off than finding these bugs.
That being said, as many above have pointed out, you can choose not to bring in dependencies. The Chrome team already does this with the font parser library: they limit dependencies to 1 or 2 trusted ones with little to no transitive dependencies. Let's not pretend C/C++ is immune to this; we had the xz vuln not too long ago. C/C++ has the benefit of a culture of not using as many dependencies, but the problem still exists. With the increase of code in the world due to AI, this is a problem we're going to need to fix sooner rather than later.
I don't think the supply chain should be a blocker for using Rust, especially when one of the best C++ teams in the world, with good funding, still struggles to always write perfect code. The Chrome team has set a precedent for moving to Rust safely while avoiding dependency hell; they'll just need to do it again.
They have hundreds of engineers, many of whom are very gifted; hell, they can write their own dependencies!
cogman10 5 hours ago [-]
If you can bring in 3rd party libraries, you can be hit with a supply chain attack. C and C++ aren't immune, it's just harder to pull off due to dependency management being more complex (meaning you'll naturally work with fewer dependencies).
jacquesm 2 hours ago [-]
It's not more complex in C or C++, you just have less of a culture of buying into a whole ecosystem. C and C++ play nice with the build system that you bring, rather than forcing you into a particular way of working.
It's 'just a compiler' (ok, a bit more than that). I don't need to use a particular IDE, a particular build system, a particular package manager or even a particular repository.
That is not to throw shade on those other languages, each to their own, but I just like my tools to stay in their lane.
Just like I have a drawer full of different hammers rather than one hammer with 12 different heads, a screwdriver, a hardware store and a drill attachment. I wouldn't know what to do with it.
skydhash 4 hours ago [-]
You'll find more quality libraries in C because people don't care about splitting them down into microscopic parcels. Even something like `just` has tens of deps, including one to check that something is executable.
https://github.com/casey/just/blob/master/Cargo.toml
That's just asking for trouble down the line.
You also won't typically find C/C++ developers blindly YOLO-ing the latest version of a dependency from the Internet into their CI/CD pipeline.
They’ll stick with a stable version that has the features they need until they have a good reason to move. That version will be one they’ve decided to ship themselves, or it’ll be provided by someone like Debian or Red Hat.
pjmlp 2 hours ago [-]
Unless of course they are using vcpkg, conan or FetchContent.
Most corporations are already using the likes of Nexus or JFrog Artifactory, regardless of the programming language.
pheggs 3 hours ago [-]
Yes, the average number of dependencies pulled in per dependency appears to be much larger in Rust, and that's what I meant and what worries me. In theory C can be written in a memory-safe manner, and in theory Rust can be used without pulling in large chunks of supply-chain attack surface. Neither is the case in practice, though.
kibwen 2 hours ago [-]
> both of these are not the case in practice though
No, people routinely write Rust with no third-party dependencies, and yet people do not routinely write C code that is memory-safe. Your threat model needs re-evaluating. Also keep in mind that the most common dependencies (rand, serde, regex, etc) are literally provided by the Rust project itself, and are no more susceptible to supply chain attacks than the compiler.
pheggs 1 hours ago [-]
I know it's a sensitive topic for a lot of people, but as I said, I love Rust. I don't know a lot of Rust projects, though, that don't use any dependencies. In my humble opinion, disregarding the risks of such supply chain attacks is at least as bad as disregarding the risk of memory-unsafe code. But keep in mind, I'm not saying don't use Rust.
mamma_mia 57 minutes ago [-]
mamma mia! One day anyhow and anyerror will be backdoored; it's inevitable.
dbdr 2 hours ago [-]
One difference is that it's an incredibly hard problem to check whether your C code is memory safe since every single line of your code is a risk. On the other hand, it's easy to at least assess where your supply vulnerabilities lie (read Cargo.toml), and you can enforce your policy of choice (e.g. whitelist a few specific dependencies only, vendor them, etc).
pheggs 1 hours ago [-]
I would argue that almost all major Rust projects use dependencies. Checking the dependencies for vulnerabilities might be just as difficult as checking C code for memory safety, maybe even worse, because dependencies have dependencies and the amount of code to be checked can easily skyrocket. The problem gets even worse when you consider that not all Rust code is safe, and that C libraries can be included, and so on.
kibwen 5 hours ago [-]
But this is irrelevant. If you're afraid of third-party code, you can just... choose not to use third-party code? Meanwhile, if I'm afraid of memory corruption in C, I cannot just choose not to have memory corruption; I must instead simply choose not to use C. Meanwhile, Chromium uses tons of third-party Rust code, and has thereby judged the risk differently.
JoeAltmaier 5 hours ago [-]
Maybe it's more complicated than that? With allocate/delete discipline, C can be fairly safe memory-wise (I've written a million lines of code in C). But automated package managers etc. can bring in code under the covers, and you end up with something you didn't ask for. From that point of view, we reverse the conclusion.
nemothekid 1 hours ago [-]
>can be fairly safe memory-wise (written a million lines of code in C)
We are currently in a thread where a major application has a heap corruption error in its CSS parser, and it's not even rare for such errors to occur. This doesn't seem true.
>But automated package managers etc can bring in code under the covers, and you end up with something you didn't ask for.
Last year there was a backdoor inserted into xz that was only caught because someone thought their CPU usage was a little too high. I don't think the whole "C is safer because people don't use dependencies" argument is actually sound.
nagaiaida 3 hours ago [-]
Yes, people often invoke "simply write safer C", but that doesn't make it any more realistic a proposition in aggregate, as we keep seeing.
stackghost 3 hours ago [-]
>With allocate/delete discipline, C can be fairly safe memory-wise (written a million lines of code in C)
The last 40-50 years have conclusively shown us that relying on the programmer to be disciplined, yourself included, does not work.
staticassertion 5 hours ago [-]
Google already uses `cargo-vet` for rust dependencies.
pheggs 5 hours ago [-]
That's good, but it won't eliminate the risk.
staticassertion 5 hours ago [-]
Nothing eliminates the risk but it is basically a best-in-class solution. If your primary concern is supply chain risk, there you go, best in class defense against it.
If anything, what are you doing about supply chain for the existing code base? How is cargo worse here when cargo-vet exists and is actively maintained by Google, Mozilla, and others?
pheggs 3 hours ago [-]
True, but Rust's success in creating an easy-to-use dependency manager is also its curse. In general, Rust software seems to use a larger number of dependencies than C/C++ because of that, and each one is at risk of becoming an attack vector. My prediction is that we will see some abuse of this in the future, similar to what npm experienced.
staticassertion 53 minutes ago [-]
All mainstream package managers are built with zero forethought about security, as far as I can tell. I don't think any of them are any good at it at all, otherwise they wouldn't give arbitrary code execution with literally zero restrictions, no ability to audit, etc.
That said, `cargo-vet` is easily the best tool for mitigating this that I am aware of and it exists for Rust and is actively maintained by Google, Mozilla, and many others. I think it's fine to say "Rust encourages using more dependencies" but it has to be acknowledged that Rust also brings with it the best in class tool for supply chain security.
Could it be better? Absolutely. God yes. Why is cargo giving access to `~/.ssh/` to every `build.rs`? Why do package managers not make any effort to sandbox? But that's life today.
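To make that concrete, here is a deliberately tame, hypothetical `build.rs` sketch (not taken from any real crate) of why build scripts are the scary part: they're just arbitrary Rust that cargo runs as your user, with no special API needed.

    // build.rs - executed automatically by `cargo build`, with the full
    // privileges of the invoking user. Nothing below uses any special API;
    // that's exactly the problem.
    use std::{env, fs, path::PathBuf};

    fn main() {
        // A build script can read anything you can read...
        if let Some(home) = env::var_os("HOME") {
            let key = PathBuf::from(home).join(".ssh/id_ed25519");
            if let Ok(meta) = fs::metadata(&key) {
                // ...and a malicious one could just as easily send it somewhere.
                println!("cargo:warning=build.rs can see {} ({} bytes)",
                         key.display(), meta.len());
            }
        }
    }

Tools like cargo-vet help because a human has to vouch for code like this before it's allowed into the tree, but nothing in cargo itself stops it from running.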
StilesCrisis 24 minutes ago [-]
It would also require a sandbox escape to be a meaningful vulnerability.
Unfortunately, "seen in the wild" likely means that they _also_ had a sandbox escape, which likely isn't revealed publicly because it's not a vulnerability in properly running execution (i.e., if the heap were not already corrupted, no vulnerability exists).
kykat 3 hours ago [-]
I don't quite understand the vulnerability. When exploited, you can get information about the page from which the exploit code is running. Without a sandbox escape or XSS, that seems almost completely harmless?
This is the "impact" section on https://github.com/huseyinstif/CVE-2026-2441-PoC:
Arbitrary code execution within the renderer process sandbox
Information disclosure — leak V8 heap pointers (ASLR bypass), read renderer memory contents
Credential theft — read document.cookie, localStorage, sessionStorage, form input values
Session hijacking — steal session tokens, exfiltrate via fetch() / WebSocket / sendBeacon()
DOM manipulation — inject phishing forms, modify page content
Keylogging — capture all keystrokes via addEventListener('keydown')
chc4 3 hours ago [-]
Browser exploits are almost always two steps: you exploit a renderer bug in order to get arbitrary code execution inside a sandboxed process, and then you use a second sandbox escape exploit in order to gain arbitrary code execution in the non-sandboxed broker process. The first line of that (almost definitely AI generated) summary is the bad part, and means that this is one half of a full browser compromise chain. The fact that you still need a sandbox escape doesn't mean that it is harmless, especially since if it's being exploited in the wild that means whoever is using it probably does also have a sandbox escape they are pairing with it.
kykat 2 hours ago [-]
Thanks for the explanation. So much for AI making it easier to learn things!
tripplyons 6 hours ago [-]
"Use after free in CSS" is a funny description to see.
maxloh 5 hours ago [-]
I think they meant something like the CSS parser, or the CSS Object Model (CSSOM).
bawolff 4 hours ago [-]
One of the other commenters wrote a post that said it was related to @font-feature-values
w4yai 5 hours ago [-]
Why ?
8-prime 5 hours ago [-]
To me at least it reads funny because when I think of CSS I think of the language itself and not the accompanying tools that are then running the CSS.
Saying "Markdown has a CVE" would sound equally off.
I'm aware that it's not actually CSS itself that has the vulnerability, but when simplified that's what it sounds like.
Tyr42 3 hours ago [-]
Funny you'd mention that, when Notepad had a CVE in its Markdown parsing recently.
cosmic_cheese 42 minutes ago [-]
I wonder how many bugs like this are lurking in the various dark corners of the Chromium/Blink codebase that nobody has taken a good, hard look at in a long time.
Given the staggering importance of the projects they should really have a full-time, well-staffed, well-funded, dedicated team combing through every line, hunting these things down, and fixing them before they have a chance to be used. It'd be a better use of resources than smart fridge integration or whatever other bells and whistles Google has most recently decided to tack onto Chrome.
StilesCrisis 19 minutes ago [-]
Chromium is pretty aggressively fuzzed. There aren't a lot of dark corners that can't be reached via a sufficiently aggressive fuzzer.
sproketboy 37 minutes ago [-]
[dead]
bitbasher 5 hours ago [-]
Maybe Chromium should also rewrite their rendering engine in Rust ;p
himata4113 5 hours ago [-]
The fact that these still show up is pretty wild to me. Don't we have a bunch of tools that should create memory-safish binaries by applying the same validation checks that memory-safe languages get for free purely from their design?
I get that CSS has changed a lot over the years, with variables, scopes and things adopted from less/sass/coffee, but people use NoScript precisely because JavaScript is risky. What if CSS can be just as risky... time to also have NoStyle?
Honestly, pretty excited for the full report since it's either stupid as hell or a multi-step attack chain.
staticassertion 5 hours ago [-]
> Don't we have a bunch of tools that should create memory-safish binaries by applying the same validation checks that memory-safe languages get for free purely from their design?
No, we don't. All of the ones we have are heavily leveraged in Chromium or were outright developed at Google for similar projects. 10s of billions are spent to try to get Chromium to not have these vulnerabilities, using those tools. And here we are.
I'll elaborate a bit. Things like sanitizers largely rely on test coverage. Google spends a lot of money on things like fuzzing, but coverage is still a critical requirement. For a massive codebase, getting proper coverage is obviously really tricky. We'll have to learn more about this vulnerability, but you can see how even just that limitation alone is sufficient to explain gaps.
masklinn 4 hours ago [-]
> Things like sanitizers largely rely on test coverage.
And not in a trivial "this line is traversed" way; you need to actually trigger the error condition at runtime for a sanitizer to see anything. Which is why I always shake my head at claims that Go has "amazing thread safety" because it has the race detector (aka TSan). That's the opposite of thread safety. It is, if anything, an admission of a lack of it.
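A toy illustration of the coverage point, in the cargo-fuzz / libfuzzer-sys style (nothing like Chromium's real harnesses, and the `@font-feature-values` prefix is only borrowed from the comment upthread for flavour): the sanitizer only reports anything if some generated input actually reaches the guarded branch; every input that doesn't looks perfectly clean.

    #![no_main]
    // Toy fuzz target. The panic stands in for a memory error a sanitizer
    // would flag at runtime - it is only ever observed if the fuzzer happens
    // to generate an input that reaches this one narrow branch.
    use libfuzzer_sys::fuzz_target;

    fn parse_rule(input: &[u8]) {
        // Only one narrow path is "buggy"; everything else is fine.
        if input.starts_with(b"@font-feature-values") && input.ends_with(b"}") {
            panic!("buggy branch reached"); // sanitizers only see what runs
        }
    }

    fuzz_target!(|data: &[u8]| {
        parse_rule(data);
    });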
josefx 3 hours ago [-]
I heard they once created an entire language that would replace C++ in all their projects. Obviously they never rewrote Chrome in Go.
> 10s of billions are spent to try to get Chromium to not have these vulnerabilities, using those tools. And here we are.
Shouldn't pages run in isolated and sandboxed processes anyway? If that exploit gets you anywhere it would be a failure of multiple layers.
StilesCrisis 8 minutes ago [-]
I don't think Go was ever planned to completely overtake C++. It is still a garbage collected language at the end of the day.
stackghost 3 hours ago [-]
They do run in a sandbox, and this exploit gives the attacker RCE inside the sandbox. It is not in and of itself a sandbox escape.
However, if you have arbitrary code execution, you can groom the heap with malloc/new to create the layout for a heap overflow -> ret2libc or something similar.
ripbozo 5 hours ago [-]
I'd love to see what the PoC code looks like, of course after the patch has been rolled out for a few weeks.
This is insane! What other zero-days are out there and being used?
Also, this seems Chromium-only, so it doesn't impact Firefox?
astrobe_ 5 hours ago [-]
This doesn't affect the many browsers based on Chromium?
gruez 4 hours ago [-]
It does; it's just that the blog is for Chrome, so it doesn't mention other browsers.
thinkingemote 4 hours ago [-]
"This vulnerability could affect multiple web browsers that utilize Chromium, including, but not limited to, Google Chrome, Microsoft Edge, and Opera"
iririririr 2 hours ago [-]
Why on earth would you even assume something like this?
Honestly curious: do you think "based on chrome" means they forked the engine and not just "applied some UI skin"?
MallocVoidstar 5 hours ago [-]
DevTools is seemingly partially broken in this version; if I have DevTools open on a reasonably dynamic web app, Chrome will crash within a minute or two.
aapoalas 4 hours ago [-]
It's also been ridiculously slow for a month or two now :/ not a good time to be working on some relatively intricate performance optimisation with DevTools taking 1-4 seconds to even start the performance recording.
jijji 3 hours ago [-]
use after free.... ahh the irony
kittbuilds 4 hours ago [-]
[dead]
idoxer 6 hours ago [-]
[dead]
fulafel 5 hours ago [-]
Isn't this a wrongly editorialized title? "Reported by Shaheen Fazim on 2026-02-11", so more like a 7-day.
Aachen 5 hours ago [-]
It refers to how many days the software has been available for, with zero implying it is not yet out, so you couldn't have installed a new version; that's what makes it a risky bug.
The term has long been watered down to mean any vulnerability (since it was always a zero-day at some point before the patch release, I guess is those people's logic? idk). Fear inflation and shoehorning seem to happen to any type of scary/scarier/scariest attack term. It might be easiest not to put too much thought into media headlines containing 0day, hacker, crypto, AI, etc. Recently saw non-remote "RCEs" and "supply chain attacks" not being about anyone's supply chain copied happily onto HN.
Edit: fwiw, I'm not the downvoter
nickelpro 5 hours ago [-]
Its original meaning was days since software release, without any security connotation attached. It came from the warez scene, where groups competed to crack software and make it available to the scene earlier and earlier. A week after general release, three days, same day. The ultimate was 0-day software, software which was not yet available to the general public.
In a security context, it has come to mean days since a mitigation was released. Prior to disclosure or mitigation, all vulnerabilities are "0-day", which may be for weeks, months, or years.
It's not really an inflation of the term, just a shifting of context. "Days since software was released" -> "Days since a mitigation for a given vulnerability was released".
fulafel 2 hours ago [-]
Wikipedia: A zero-day (also known as a 0-day) is a vulnerability or security hole in a computer system unknown to its developers or anyone capable of mitigating it
This seems logical, since by the etymology of zero-day it should apply to the release (= disclosure) of a vuln.
bawolff 4 hours ago [-]
I think the implication in this specific context is that malicious people were exploiting the vuln in the wild prior to the fix being released
baq 5 hours ago [-]
I wonder if this was found with LLM assistance; if yes, with which one, and is it a one-off or does it mark the start of a new era? (I assume it does.)
paavohtl 5 hours ago [-]
Absolutely nothing in the announcement or other publicly available source implies that, to my knowledge. Might as well speculate if a random passer-by on the street is secretly a martian.