Intent to Deprecate and Remove XSLT (groups.google.com)
chrismorgan 98 days ago [-]
Presuming this goes ahead, I believe this is the first time a standard, baseline-available feature will be removed.

There have been other removals, but few of them were of even specified features, and I don’t think any of them have been universally available. One of the closest might be showModalDialog <https://web.archive.org/web/20140401014356/http://dev.opera....>, but I gather mobile browsers never supported it anyway, and it was a really problematic feature from an implementation perspective too. You could argue Mutation Events from ~2011 qualifies¹; it was supplanted by Mutation Observers within two years, yet hung around for over a decade before being removed. As for things like Flash or FTP, those were never part of the web platform. Nor were they ever anything like universal anyway.

And so here they are now planning to remove a well-entrenched (if not especially commonly used) feature against the clearly-expressed will of the actual developers, in a one year time frame.

—⁂—

¹ I choose to disqualify Mutation Events because no one ever finished their implementation: WebKit heritage never did DOMAttrModified, Gecko/Trident heritage never did DOMNodeInsertedIntoDocument or DOMNodeRemovedFromDocument. Flimsy excuse, probably. If you want to count it, perhaps you’ll agree to consider XSLT the first time a major, standard, baseline-available feature will be removed?

veeti 98 days ago [-]
Look, I wouldn't want to be responsible for maintaining anything to do with XML or XSLT either. All the technical arguments outlined for removing support make sense. But can users really call it an "update" if you could view an XML/XSLT document in Internet Explorer 6 or Chrome 1 but not the newest version?

I think this sets a concerning precedent for future deprecations, where parts of the web platform are rugpulled from developers because it's convenient for the browser vendors.

troupo 98 days ago [-]
> I think this sets a concerning precedent for future deprecations, where parts of the web platform are rugpulled from developers because it's convenient for the browser vendors.

The precedent was already set when they tried to remove alert/prompt. See https://dev.to/richharris/stay-alert-d and https://css-tricks.com/choice-words-about-the-upcoming-depre...

Only a large public outcry stopped them, barely.

To quote from the first link:

--- start quote ---

Meanwhile, we don't seem to be learning from the past. If alert is fair game for removal, then so is every API we add to the platform if the web's future stewards deem it harmful.

Given Chrome's near-monopoly control of the browser market, I'm genuinely concerned about what this all means for the future of the web. An ad company shouldn't have this much influence over something that belongs to all of us. I don't know how to fix the standards process so that it's more representative of the diversity of the web's stakeholders, but I'm increasingly convinced that we need to figure it out.

--- end quote ---

echelon 98 days ago [-]
> maintaining anything to do with XML or XSLT either.

These aren't horrible formats or standards. XSLT is actually somewhat elegant.

hannob 98 days ago [-]
Counterpoint: XML is a horrible format.

Why? Answer this question: how can you use XML in a way that does not create horrible security vulnerabilities?

I know the answer, but it is extremely nontrivial, and highly dependent on which programming language, library, and sometimes even which library function you use. The fact that there's no easy way to use XML without creating a security footgun is reason enough to avoid it.

Mikhail_Edoshin 98 days ago [-]
I myself know only two "security vulnerabilities":

1. The entity bomb. An entity that expands to another, which expands to another, and so on, so that the final result is enormous. This is an issue of the implementation: if it expands the entities eagerly then the bomb will work. But if it first examines them and checks how much space they require, it can safely reject the document once it exceeds some configurable limit. As far as I know this has been fixed in all XML processors. (Both this and the next issue are sketched below.)

2. An entity can resolve to a local or remote file. First, this is a feature. Imagine a large collection of bibliographic records, each in a separate file. A publication can provide its list of references as a list of entities that refer to these files. (There is an RFC that uses this as an example.) And, of course, we need both local and remote entities.

But, of course, if your XML comes from an untrusted source and you read it with this feature enabled this can lead to obvious disasters. Yet it is not a vulnerability of XML. Again, as far as I know all XML processors can disable access to local or remote entities.
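
For illustration, minimal sketches of both (the classic "billion laughs" bomb and an external-entity payload; the entity and element names here are arbitrary):

```
<?xml version="1.0"?>
<!-- 1. Entity bomb: each level expands to ten copies of the previous one -->
<!DOCTYPE bomb [
  <!ENTITY a "aaaaaaaaaa">
  <!ENTITY b "&a;&a;&a;&a;&a;&a;&a;&a;&a;&a;">
  <!ENTITY c "&b;&b;&b;&b;&b;&b;&b;&b;&b;&b;">
]>
<bomb>&c;</bomb>
```

And the external-entity case, which only causes trouble if the processor is allowed to resolve it:

```
<?xml version="1.0"?>
<!-- 2. External entity: expands to the contents of a local file if resolution is enabled -->
<!DOCTYPE doc [
  <!ENTITY xxe SYSTEM "file:///etc/passwd">
]>
<doc>&xxe;</doc>
```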

rhdunn 98 days ago [-]
You can say the same thing about HTML forms (see CORS et. al.), innerHTML, rendering user-submitted data, SQL, JSON, etc. That does not mean that you remove HTML forms or SQL databases.

If you removed support for anything that has/could have security vulnerabilities you would remove everything.

da_chicken 98 days ago [-]
That's not any different than JSON, though. Injection, insecure deserialization, etc. can all exist in that format as well.

There's plenty of reasons to criticize XML, and plenty more to criticize XSLT. But security being the one you call out feels at least moderately disingenuous. It's a criticism of the library, not the standard or the format.

dtech 98 days ago [-]
There's an extremely large difference in that a JSON deserialization vulnerability is almost always a bug in the library. JSON is not an inherently insecure format.

XML is so complex that even a 100% bug-free, compliant library is inherently insecure, and the vulnerability becomes a "user is holding it wrong" situation: they should have disabled specific XML features, etc. That means XML is an inherently much more insecure format.

There's a reason there's a name for vulnerabilities like XML External Entity (XXE) injection [1], and that they're named after XML and not "bug in lib/software X". JSON and most other data formats don't have that.

[1] https://portswigger.net/web-security/xxe

Mikhail_Edoshin 98 days ago [-]
XML has a relatively small specification. For some time I used to "print" web pages into PDF (or XPS) and I remember that XML 1.0 specification was three times shorter than that of YAML (it was YAML 2, I think, I don't quite remember). And XML included a) serialization itself, b) simple grammar specification in the form of DTD, c) things like internal references from one element to another, d) basic support for other notations, so that you could add, say, LaTeX math notation and formally define that this element's content is in this notation. I do not think (b), (c) or (d) were part of YAML or any other similar format.
nothrabannosir 98 days ago [-]
Do those points not apply verbatim to HTML?

Let alone JavaScript…

masfuerte 98 days ago [-]
The security argument isn't that great. Google has been grumbling about xslt for more than a decade. If security was really their concern they could have replaced the compiled C library with an asm.js version ten years ago, much as they did for pdf rendering. They could use wasm now. They don't need to deprecate it.
ExoticPearTree 98 days ago [-]
> But can users really call it an "update" if you could view an XML/XSLT document in Internet Explorer 6 or Chrome 1 but not the newest version?

Yes. Just like we don't have Flash everywhere or ActiveX. Good riddance to them and to XSLT and, fingers crossed, XML in the future.

mx7zysuj4xew 98 days ago [-]
I'm going to say this calmly and politely to you. You think this is some kind of "fun" spicy take, but considering that some rely on these technologies I find your remark to be incredibly offensive and insulting. If we were discussing this face to face you'd have a major problem right now
ExoticPearTree 98 days ago [-]
It would help if you would give some details about how you rely on Flash/ActiveX/XML. Is it a matter of life and death? If so, how?
bawolff 98 days ago [-]
> As for things like Flash or FTP, those were never part of the web platform. Nor were they ever anything like universal anyway.

I feel like there is a bit of a no true scotsman to this.

XSLT was always kind of on the side. If FTP or Flash weren't part of the web platform, then I don't know that XSLT is either. Flash might not be "standard" but it certainly had more users in its heyday than XSLT ever did.

Does removal of TLS 1.1 count here? It's all kind of a matter of definitions.

Personally i always thought the <keygen> tag was really cool.

chrismorgan 98 days ago [-]
XSLT is an integrated part of the web platform: browsers can load XML documents that use an XSLT stylesheet, and even inside HTML documents XSLTProcessor is available.

FTP was never integrated: it just so happened that some platforms shipped a protocol handler for it, and some browsers included an FTP protocol handler themselves. But I don’t believe you could ever, say, fetch("ftp://…").

Flash, like applets, was even more clearly not part of the web platform. It was a popular third-party extension that you had to go out of your way to install… or wait for it to be installed by some shady installer Adobe paid off. Though I have a vague feeling Chrome shipped with Flash at some point? I don’t remember all the history any more, this is a long time ago.

Older versions of TLS are definitely a more interesting case. It’s a different kind of feature, but… yeah, I might consider it.

<keygen> was an interesting concept that in practice went nowhere.

bawolff 98 days ago [-]
> FTP was never integrated: it just so happened that some platforms shipped a protocol handler for it, and some browsers included an FTP protocol handler themselves. But I don’t believe you could ever, say, fetch("ftp://…").

I never tried, but I believe the relevant spec said it should work, until it was deprecated and removed from the standard https://github.com/whatwg/fetch/pull/1166

With flash - that might all be true, but there was a time when many websites required it. It might not have been a de jure standard but it was a de facto standard. To the point where a browser not supporting it was considered broken. Apple refusing to support it was incredibly controversial at the time.

om2 97 days ago [-]
Fetch API is a pretty recent addition to the web platform. Back in the day, you could absolutely embed images or stylesheets from ftp: URLs. You could even use it with XMLHttpRequest (the predecessor of Fetch). Even further back, gopher: was integrated with the web. URL schemes were invented for the web with the idea that http: is not the only one. These other protocols were really part of the web until they weren’t.
bartread 98 days ago [-]
Yeah… on the one hand I don’t care about XSLT, haven’t used it in more than 20 years, and never intend to use it again.

On the other… I’m still a bit uncomfortable with the proposed change because it reads as another example of Google unilaterally dictating the future of the web, which I’ve never liked or supported.

Feeling quite conflicted.

0x000xca0xfe 98 days ago [-]
XSLT is not trendy technology but I doubt it's worse than WebBluetooth, WebUSB or WebGL from a complexity/maintenance/security perspective.

This change definitely feels like moving a (tiny) step into the direction of turning the Web platform into something akin to the Android dev experience.

righthand 98 days ago [-]
"It didn't affect me so I didn't care." Is usually how control is amassed by Google (or authoritarians).
bartread 98 days ago [-]
Very fair point, and that cuts to the root of why I’m uncomfortable with it.

I mean, presumably they have the usage stats… except that plenty of enterprises deployed XSLT apps back in the day - it was on a massive portion of the job ads I was looking at in 2000 to 2002 - and I’d bet a chunk of those legacy systems are still running. I’d also bet a good chunk of those systems are running in the sort of orgs that won’t allow submission of telemetry to Google, so Google’s usage stats underreport real world usage.

To me it looks like zero effort has been made to engage with Mozilla, Apple, etc., on the right way forward here - just Google high-handedly making moves and abusing their position as per usual.

jsnell 98 days ago [-]
> To me it looks like zero effort has been made to engage with Mozilla, Apple, etc., on the right way forward here - just Google high-handedly making moves and abusing their position as per usual.

What would make you think that? The submission links prominently to the whatwg proposal github issue, which is the forum where that engagement would happen. It explicitly deep-links to Mozilla's and Apple's posts in that thread. It has the usage stats that you just presume exist.

It's like you just made up a scenario and posted it as facts with zero effort to verify any of it.

spiffytech 98 days ago [-]
The post indicates WHATWG has "broad agreement" about removing XSLT. I don't know how many seats Google has there, but on the surface it doesn't sound like a unilateral decision.
indolering 97 days ago [-]
Mozilla and others fell out of love with XML a long time ago. Deprecation of these technologies was probably inevitable after the WHATWG pivot and when they stopped adopting new XML tech. XML and related technologies got frozen in time and JavaScript took over.

The XML proponents lost this fight a long time ago. Without continued development, the user base shriveled up. Now that no one uses it, the runtimes are looking to cut dead weight.

I disagree with the pivot (RIP noscript) but it's not Google making this move unilaterally. It's been in the works for a long time.

om2 98 days ago [-]
XSLT is also a really problematic feature from an implementation perspective (albeit in a different way than showModalDialog or MutationObservers).

I’m not a Chrome dev but I think they have decent reasons for going this way.

sam_lowry_ 97 days ago [-]
1.1 is not that complex to implement
om2 96 days ago [-]
Implementing it without tons of security bugs is apparently pretty hard.
CamJN 98 days ago [-]
Maybe the blink or marquee tags? I’m pretty sure those don’t work anymore...
chrismorgan 98 days ago [-]
<marquee> still works fine. Better than it used to, honestly, as at least Firefox and Chromium removed the deliberate low frame rate at some point in the last decade.

<blink> was never universal, contrary to popular impression: <https://en.wikipedia.org/wiki/Blink_element#:~:text=The%20bl...>, it was only ever supported by Netscape/Gecko/Presto, never Trident/WebKit. Part of the joke of Blink is that it never supported <blink>.

> Netscape only agreed to remove the blink tag from their browser if Microsoft agreed to get rid of the marquee tag in theirs during an HTML ERB meeting in February 1996.

Fun times. Both essentially accusing the other of having a dumb tag.

bojle 98 days ago [-]
marquee is used religiously by some official Indian websites [1]. It's the primary mechanism they use to deliver news or updates on the websites.

[1] For example: https://www.nagpuruniversity.ac.in/

chrismorgan 98 days ago [-]
Extremely popular in Indian government websites, often implemented with <marquee>, but also often implemented by a different mechanism so that it can stop scrolling on mouseover.

Indian Rail <https://www.indianrail.gov.in/> has one containing the chart from a mid-2024 train accident, an invitation to contribute a recording of the national anthem from 2021, and a link to parcel booking. Oh, and “NEW!” animated GIFs between the three items.

bojle 98 days ago [-]
>Oh, and “NEW!” animated GIFs between the three items.

That's gotta be the second most popular web design quirk. Haha

anal_reactor 98 days ago [-]
> As for things like Flash or FTP, those were never part of the web platform. Nor were they ever anything like universal anyway.

Flash was the web technology.

chrismorgan 98 days ago [-]
The web technology… that didn’t come out of the box, wasn’t supported on all platforms, and didn’t integrate?
Mikhail_Edoshin 98 days ago [-]
One might think that as technology progresses more and more pieces of older technologies get revived and incorporated into the available tooling. Yet the very opposite thing happens: good and working parts are removed because the richest companies on Earth "cannot afford" to keep them.

In 19th century Russia there was a thinker, N. F. Fedorov, who wanted to revive all dead people. He saw it as the ultimate goal of humanity. (He worked in a library, a very telling occupation. He spent most of what he earned to support others.) We do not know how to revive dead people or if we can do that at all; but we certainly can revive old tech or just not let it die.

Of course, this job is not for everyone. We cannot count on the richest, apparently, they're too busy getting richer. This is a job for monks.

dtech 98 days ago [-]
> good and working parts are removed

The browser vendors are arguing XSLT is neither good - its adoption has always been lacking because of complexity, and it has now become a niche technology because better alternatives exist - nor working, see the mentioned security and maintenance issues. I think they have a good point there.

Mikhail_Edoshin 98 days ago [-]
Well, one can argue that Metafont fonts are "niche". No font library supports them. But from what I know they could easily be technically superior to, say, Type 1 fonts. Of course a technology will be niche if it is treated as a poor relative. XSLT could be an alternative to CSS, for example. It is definitely more powerful than CSS, because it actually transforms the document, not just alters the appearance and sprinkles some automatic content here and there. And is actually used this way in XSL-FO, which, I think, powers a substantial share of technical publishing.
jasomill 98 days ago [-]
XSLT as an alternative to CSS sounds like a nightmare, though to be fair, I felt the same way about XSLT as an alternative to DSSSL, which to me felt more like a satirical response to the "XML everywhere" zeitgeist in the spirit of INTERCAL and eating babies than a serious design proposal.
ExoticPearTree 98 days ago [-]
> One might think that as technology progresses more and more pieces of older technologies get revived and incorporated into the available tooling. Yet the very opposite thing happens: good and working parts are removed because the richest companies on Earth "cannot afford" to keep them.

I think it is because nobody, except a handful of people around the world, feels the need to use XSLT in lieu of CSS. Hence, CSS has evolved over time while XSLT has not.

This is how the world works: technology advances and old things become obsolete over time.

mx7zysuj4xew 98 days ago [-]
This proves that you do not understand the technology at hand.

XSLT isn't about styling documents, but is more like ETL (Extract, Transform, and Load)

Mikhail_Edoshin 97 days ago [-]
Yes, XSLT is a transform. But XSL-FO is a special XML notation for printed media (it also has aural components, but I don't know if they are implemented anywhere). It uses a model similar to CSS, but does not use CSS stylesheets. Instead all attributes are attached directly to XML elements. (Like Tailwind). There is inheritance, but that's all; there are no CSS selectors, no variables, no generated content, nothing, because why? All this can be done during transform. This is both a simpler and more powerful approach than CSS.
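
A rough fragment for illustration (not a complete fo:root document; the property values are arbitrary) showing how formatting properties sit directly on the element:

```
<fo:block xmlns:fo="http://www.w3.org/1999/XSL/Format"
          font-family="serif" font-size="11pt"
          space-before="6pt" text-align="justify">
  Properties ride on the element itself, with inheritance
  but no selectors, variables or generated content.
</fo:block>
```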
om2 97 days ago [-]
It doesn’t scale well to content that changes dynamically on the client side. Dynamic manipulation of the post-transform XSL-FO is confusing and difficult, and retransforming the whole document from source is too slow and loses state. This is a big part of why CSS won.
mx7zysuj4xew 97 days ago [-]
What the hell are you talking about

CSS and XSL-FO are entirely different concepts

om2 96 days ago [-]
Take it up with the parent of my comment, who compared them directly.
ExoticPearTree 97 days ago [-]
> This is both a simpler and more powerful approach than CSS.

If it were true, everyone would have used this instead of CSS.

ExoticPearTree 98 days ago [-]
I had the displeasure of working with XML.

And I know here on HN there are people that for whatever reason like it. I don't.

heavyset_go 98 days ago [-]
Pertinent to your point, he wanted to resurrect ancestors so that they, too, could participate in the general resurrection. The analogy being old technology resurrected to work alongside contemporary technology towards a shared goal.
jraph 98 days ago [-]
XSLT is to my knowledge the only client side technology that lets you include chunks of HTML without using JavaScript and without server-side technology.

XSLT lets you build completely static websites without having to use copy paste or a static website generator to handle the common stuff like menus.
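
A minimal sketch of what I mean (the file and element names are made up). Each page is a tiny XML file that points at a shared stylesheet:

```
<?xml version="1.0"?>
<?xml-stylesheet type="text/xsl" href="site.xsl"?>
<page title="About">
  <body><p>Hello from a fully static page.</p></body>
</page>
```

and site.xsl holds the markup every page shares, like the menu:

```
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/page">
    <html>
      <head><title><xsl:value-of select="@title"/></title></head>
      <body>
        <!-- the menu lives once here, not copy-pasted into every page -->
        <ul class="menu">
          <li><a href="/index.xml">Home</a></li>
          <li><a href="/about.xml">About</a></li>
        </ul>
        <!-- pull the page's own content through -->
        <xsl:copy-of select="body/node()"/>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```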

ErroneousBosh 98 days ago [-]
> XSLT lets you build completely static websites without having to use copy paste or a static website generator to handle the common stuff like menus.

How many people ever do this?

gregabbott 98 days ago [-]
Plain text, markup and Markdown to HTML with XSLT:

REPO: https://github.com/gregabbott/skip

DEMO: https://gregabbott.pages.dev/skip

(^ View Source: 2 lines of XML around a .md file)

paularmstrong 97 days ago [-]
Parsing the XSLT file fails in Firefox :)
gregabbott 97 days ago [-]
Thanks! Reworked for Firefox.
Mikhail_Edoshin 98 days ago [-]
I did that. You can write .rst, then transform it into XML with 'rst2xml' and then generate both HTML and PDF (using XSL-FO). (I myself also did a little literate programming this way: I added a special reStructuredText directive to mark code snippets, then extracted and joined them together into files.)
spiffytech 98 days ago [-]
If this is "declarative XSL Processing Instructions", apparently 0.001% of global page loads.
duskwuff 97 days ago [-]
skechers.com (a shoe manufacturer) used to do this:

https://web.archive.org/web/20140101011304/http://www.skeche...

They don't anymore. It was a pretty strange design.

jasonkester 98 days ago [-]
Ah, shame. I always meant to expand on my little experiment here to ship 100% content pages to the client:

http://www.blogabond.com/xsl/vistacular.xml

The upside is that the entire html page is content. I defy google to not figure out what to index here:

view-source:http://www.blogabond.com/xsl/vistacular.xml

The downside is everything else about the experience. Hence my 15 years of not bothering to implement it in a usable way.

chrismorgan 98 days ago [-]
> I defy google to not figure out what to index here:

Easy: ignore due to no content-type header.

thro1 98 days ago [-]
cute :) (focused and instant)
cassonmars 98 days ago [-]
XSLT is great, but its core problem is that the tooling is awful. And a lot of this has to do with the primary author of the XSLT specification keeping a proprietary (and expensive) library as the main library that implements the ungodly terse spec. Simpler standards and open tooling won out, not just because they were simpler, but because there wasn't someone chiefly in charge of the spec essentially making the tooling an enterprise sales funnel. A shame.
MattPalmer1086 98 days ago [-]
Was there ever a good reason for XSLT to be an XML document itself? It was painful writing it.
righthand 98 days ago [-]
Is XSLT not an XML document itself?

I’m confused by your comment. My XSLT stylesheets are like this:

```
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
    xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
```

MattPalmer1086 98 days ago [-]
Yes it is, I was asking why it needed to be. Sorry if that wasn't clear.
cess11 98 days ago [-]
Once upon a time HTML was a kind of XML, which is why the current version is very similar to XML and hence painful to write. This in turn is why we tend to use programmatic tools to handle the HTML, and you should if you work with XML too.
samus 97 days ago [-]
HTML5 is unlike XML in very important regards, which makes it actually quite difficult for tools to handle. But easier than XML. XHTML was quite annoying to write as far as I remember.
MattPalmer1086 98 days ago [-]
In fact, HTML predates XML. Both can be seen as types of SGML.
Mikhail_Edoshin 98 days ago [-]
This way you can manipulate XSLT using XSLT. A common case is to generate XSLT.
MattPalmer1086 98 days ago [-]
Clearly that would be possible, but I never actually saw anyone doing that.

Any pointers to tech that did this, if it was a common case?

Mikhail_Edoshin 98 days ago [-]
I doubt it was common. But, for example, there is such a thing as Schematron: a special notation that checks that an XML document follows business rules, and the final tool it produces is a custom XSLT that transforms the document into the report.

(I'm also doing this currently; I need to prepare a sort of an annotated patch to an XML document, so I concocted a notation that describes edits and use it to generate both the documentation that highlights differences and also the patch itself; the patch comes out as XSLT.)
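
A tiny sketch of the mechanism this relies on (the element names and the "rules" vocabulary are invented): xsl:namespace-alias lets a stylesheet emit xsl:* elements into its output without them being interpreted as instructions.

```
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform"
                xmlns:out="urn:example:generated-xslt">
  <!-- elements written with the "out" prefix come out as xsl:* in the result -->
  <xsl:namespace-alias stylesheet-prefix="out" result-prefix="xsl"/>

  <!-- input: <rules><rule match="..." message="..."/>...</rules>
       output: a stylesheet with one template per rule -->
  <xsl:template match="/rules">
    <out:stylesheet version="1.0">
      <xsl:for-each select="rule">
        <out:template match="{@match}">
          <report><xsl:value-of select="@message"/></report>
        </out:template>
      </xsl:for-each>
    </out:stylesheet>
  </xsl:template>
</xsl:stylesheet>
```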

arwhatever 98 days ago [-]
Yo dawg, I heard you like xml … so I made you an xml-based language to turn xml into other xml!
solatic 98 days ago [-]
For what it's worth, this is the difference between private-sector and public-sector development. The public sector would have instead argued for some budget to hire developers to maintain libxslt and issue RFPs for grant money to rewrite it in Rust for memory safety guarantees. The private sector decides that it's just not a profitable use of resources and moves to cancel support.

The question isn't whether or not you use XSLT yourself, it's whether you use a different feature that could be deemed unprofitable and slammed on the chopping block. And therefore a question of whether it wouldn't be better for everyone for this work to be publicly funded instead.

jmspring 98 days ago [-]
I’m lost at “the public sector would have argued for some budget”. XSLT and libxslt are used across a non-trivial number of deployments.

Why would the public sector feel bound to support it as opposed to pivot in the same direction the winds are blowing?

Outside the idiocy of this particular administration in the US, government is pivoting toward more commercial norms (with compliance requirements for gov cloud, etc.).

solatic 98 days ago [-]
> Why would the public sector feel bound to support it

The underlying axiom is the Pareto principle - that you get 80% of the benefit from the first 20% of the work, and getting the last 20% of the benefit takes up 80% of the work. The private sector will stop funding after the first 80% of benefit (it's not profitable to chase the last 20%) but the public sector is usually mandated to support everybody so it is indeed required to put in that extra effort.

MrJohz 97 days ago [-]
I'm quite unconvinced by this - it seems very easy to come up with all sorts of counterexamples, particularly in terms of public infrastructure, but also all of public services are regularly cut if the organising body doesn't see that service as achieving its goals any more.

It is true that public bodies are less concerned with profitability, which changes how they make decisions around deprecations and removals, but being cost-effective is still important for them, especially when budgets are low and need is high. In situations like that, it's not uncommon for, say, a service to get cut so that funding can be reallocated elsewhere where it's more needed.

I don't think publicly funding this sort of work would necessarily significantly change the equation here. The costs of XSLT are relatively high because of its complexity and the natural security risks that arise from that complexity. Meanwhile, it is very rarely used, and where it is used, there are better alternatives (generally loading a sandboxed library rather than using the built-in tooling).

Fileformat 98 days ago [-]
One extremely important use-case is for RSS/Atom feeds. Right now, clicking on a link to a feed brings up a wall of XML (or worse, a download link). If the feed has an XSLT stylesheet, it can be presented in a way that a newcomer can understand and use.
mbo 98 days ago [-]
This is why I've needed to use XLST, to style my personal RSS feed. Great guide for this: https://andrewstiefel.com/style-atom-xsl/ Looks like it's been raised on the whatwg issue too: https://github.com/whatwg/html/issues/11523#issuecomment-315...
whimsicalism 98 days ago [-]
extremely important? i use rss a lot and i have never seen anyone do this
Fileformat 97 days ago [-]
This is exactly my point: everyone here is tech-savvy and knows what to do with an RSS/Atom link. So we don't see a need for XSLT.

But someone who hasn't seen/used an RSS reader will see a wall of plain-text gibberish (or a prompt to download the wall of gibberish).

XSLT is currently the only way to make feeds into something that can still be viewed.

I think RSS/Atom are key technologies for the open web, and discovery is extremely important. Cancelling XSLT is going in the wrong direction (IMHO).

I've done a bunch of things to try to get people to use XSLT in their feeds: https://www.rss.style/

You can see it in action on an RSS feed here (served as real XML, not HTML): https://www.fileformat.info/news/rss.xml
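
For anyone curious what this involves, a stripped-down sketch of the kind of stylesheet a feed can point at via <?xml-stylesheet type="text/xsl" href="feed.xsl"?> (assuming a plain RSS 2.0 feed without namespaced extensions; the wording and file name are made up):

```
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/rss/channel">
    <html>
      <head><title><xsl:value-of select="title"/></title></head>
      <body>
        <h1><xsl:value-of select="title"/></h1>
        <p>This is a web feed. Copy the URL into a feed reader to subscribe.</p>
        <ul>
          <!-- one link per item in the feed -->
          <xsl:for-each select="item">
            <li><a href="{link}"><xsl:value-of select="title"/></a></li>
          </xsl:for-each>
        </ul>
      </body>
    </html>
  </xsl:template>
</xsl:stylesheet>
```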

otterley 98 days ago [-]
This continues the saga discussed here: https://news.ycombinator.com/item?id=44952185
joeiq 98 days ago [-]
XSLT is wildly under-appreciated. You can take hierarchical data and bend it to your will, remix it, and turn it inside out if you wish. Those developers working with XML should consider XSLT before rolling their own manipulation script.

Now, do you need XSLT’s capabilities in the browser? Their stats say no one’s really using it.

imiric 98 days ago [-]
So, instead of a giant corporation with all the resources in the world stepping in and maintaining a core web library, they're deciding to remove a feature because the lone maintainer who has been doing a thankless job for years has decided to unsurprisingly step down from this role.

I suppose we can expect support for XML to be dropped soon as well, since libxml2 maintenance is ending this year.

I don't buy the excuse of low number of users. Google's AMP has abysmal usage numbers, yet they're still maintaining that garbage.

Google has been a net negative for the web, and is directly responsible for the shit show it is today. An entirely expected outcome considering it is steered by corporate interests.

its-summertime 98 days ago [-]
Probably more due to the fact that browsers only support the 1999 XSLT 1.0, and no one has shown any interest in implementing XSLT 2.0 from 2001, or XSLT 3.0 from 2017. So there has been a sign of lack of desire since 2001 at minimum; likewise there is seemingly no attempt by anyone to document the incompatibilities and push browsers to unify them, as happened for HTML, JS, and CSS.

The writing has been on the wall for a long while. Mozilla hasn't stepped up, Google hasn't stepped up, GNOME hasn't stepped up, Oracle hasn't stepped up, etc. Maybe its just a format that once anyone gets involved with, they no longer want to be involved with it any further.

ozim 98 days ago [-]
I would expect governments to finally take over.

I believe they haven't only because most politicians don't know anything about software.

Even being aware of the problems that "governmatization" of open source can bring, it is still something I expect to be picked up by countries.

bawolff 98 days ago [-]
People are free to make their own browser if they want.

Part of the reason google chrome won the browser wars is because they are willing to make decisions like this. Kitchen sink software is bad software.

imiric 98 days ago [-]
> People are free to make their own browser if they want.

Some people are doing that[1]. It's not a matter of desire, but of the amount of effort and resources required to build and maintain the insanity of the modern web stack.

> Part of the reason google chrome won the browser wars is because they are willing to make decisions like this.

Eh, no. Google Chrome won because it is backed by one of the largest adtech corporations with enough resources and influence to make it happen. They're better at this than Microsoft was with IE, but that's not saying much. When it launched it introduced some interesting and novel features, but it's now nothing but a marketing funnel for Google's services.

[1]: https://ladybird.org/

righthand 98 days ago [-]
Will Ladybird support web standards or just take lead from Google though. Will Ladybird support XSLT?
bawolff 98 days ago [-]
> Some peple are doing that[1]. It's not a matter of desire, but of the amount of effort and resources required to build and maintain the insanity of the modern web stack.

People say that, but I don't think that's true. The web stack was always insane; the only difference is it's documented now. I think now is a much easier time to build a web browser than the past was.

Not to mention the irony of complaining the web stack is insane while insisting a really difficult to support feature that never saw much use should be kept forever because reasons.

> Eh, no. Google Chrome won because it is backed by one of the largest adtech corporations with enough resources and influence to make it happen

Google won because nobody else really tried.

Firefox has been a dumpster fire of bad management decisions and has reduced itself to basically just copying Google's every decision, sacrificing any unique identity of its own.

Safari is never going to win when it is Mac-only and Apple doesn't seem to fund it very hard.

Most of the rest are just Chrome reskins that don't deserve to be called separate browsers.

Maybe something interesting might come out of ladybird. Its still quite early to tell.

troupo 98 days ago [-]
> Google won because nobody else really tried.

Google won because it:

- built on a very solid foundation from the start (it started out as a webkit fork), and was generally a good fast browser. This is the very minor part

- Sabotaged Firefox: https://archive.is/tgIH9

- Heavily promoted and advertised Chrome across all of its properties which included such insignificantly small sites like Google Search and Youtube.

bawolff 97 days ago [-]
> - Sabotaged Firefox: https://archive.is/tgIH9

Running an advertising campaign is hardly sabotage

troupo 97 days ago [-]
You didn't read the link and assumed that my last bullet point refers to sabotage.

Also, you somehow think that running an exclusive, targeted ad campaign for Chrome on the two most popular sites on the internet is nothing to worry about.

_heimdall 98 days ago [-]
The whole point of browser standards is to avoid every browser picking and choosing its own features; it's a terrible end-user experience.

No one should fork chrome and maintain it with XSLT still baked in. Not only would it go unused, it doesn't help anyone wanting to ship XSLT on a site because users would literally have to install a different browser just to see that page.

Mikhail_Edoshin 98 days ago [-]
"Kitchen sink software" implies an image of a kitchen sink filled with dirty dishes. But the solution is not to throw them all away and leave a single dish, still dirty, but at least looking manageable. The solution is to wash all the dishes and put them neatly on the rack.
eurleif 97 days ago [-]
The idiom "everything but the kitchen sink" (and variant "everything including the kitchen sink") doesn't refer to a sink filled with dirty dishes. Rather, the kitchen sink (originally the kitchen stove) is being used as an example of a particularly bulky item. "Everything but the kitchen sink" means, roughly, everything except for what would be too large and/or absurd to include.

https://english.stackexchange.com/questions/96582/what-is-th...

bawolff 98 days ago [-]
Really, because mozilla seamonkey tried that. How do you think that went for them?
troupo 98 days ago [-]
> Kitchen sink software is bad software.

Ah yes. That's why Chrome bravely refuses to be a kitchen sink. It only has a small set of available APIs like USB, MIDI, Serial, Sensors (Ambient Light, Gyroscopes etc.), HID, Bluetooth, Barcode detection, Battery Status, Device Memory, Credential Management, three different file APIs, Gamepads, three different background sync APIs, NFC...

1718627440 98 days ago [-]
And it still doesn't support alternate stylesheets, maybe due to NIH syndrome.
bawolff 97 days ago [-]
Firefox killed the alternate stylesheet UI in like version 1. Nobody really supports alternate stylesheets.
1718627440 97 days ago [-]
Can you explain what you mean? My Firefox version supports setting alternate stylesheets just fine. It doesn't persist across page reloads, which is annoying though.
codedokode 98 days ago [-]
I want browsers to be minimal and simple. For example, canvas should only provide a framebuffer to draw into, and all the rest can be done with WASM libraries. Web Audio should only provide an audio thread, and things like low-pass filters can be implemented in WASM. WebRTC should only provide UDP support, etc.

This would make creating competition easier and reduce attack surface. As a nice side effect, it would become impossible to use canvas or web audio for fingerprinting.

bartread 98 days ago [-]
Dude… no.

Firstly, it puts a huge burden of non-value-adding work onto developers and the organisations they work for.

Secondly it would lead to even higher frequency and prevalence of people inventing their own half-arsed ways of doing things that used to be in the box. Nobody would think about standard usability affordances, accessibility, etc.

Thirdly, it would simply move the attack surface into an emergent library ecosystem without really solving anything.

Fourthly, it would increase website payloads even further. Developers have historically been awful at using bandwidth efficiently (still a concern in many scenarios due to connectivity limitations and costs), and we don’t need to offer more opportunities for them to demonstrate how terrible and undisciplined they are at it.

Fifthly, not everyone wants or needs (or should!) to learn web assembly in the same way that not everyone wants or needs to learn x86/64 assembly, ARM assembly, C or Rust.

Sixthly, it would lead to a huge amount of retooling and rewriting which, yes, to some extent would happen anyway because, apparently, we all love endless churn masquerading as progress, but it would be considerably worse.

The web would become significantly buggier and more unusable as a result of all of the above.

dwb 98 days ago [-]
That sounds awful! For a start, accessibility would get even worse than it already is. The browser may become more “minimal” and “simple” from the point of view of the implementer, but certainly not the user.
chrismorgan 98 days ago [-]
On the off chance that by “canvas” you didn’t mean all rendering (which is a terrible idea for reasons addressed by others) but only what is currently covered by <canvas>—the frame buffer approach is incompatible with GPU acceleration, so 3D would basically not be possible. For better or for worse, WebGL is pretty close to the minimum acceptable for that kind of functionality.
sureglymop 98 days ago [-]
I think it sounds good to have that "on top" of what's already there. So that those who want to can use a lower level abstraction.
massifist 98 days ago [-]
I like the idea but you could take it a step further and have just a core virtual machine that you could attach virtual (input/output) devices to. So then the canvas and audio would just be virtual devices that met some specification. Or say for example, you just want to listen to an audio playlist, you could attach an audio device, a keyboard and a terminal device (for feedback). A canvas device wouldn't necessarily be required (if there was no use for one). And it would be up to the user to attach the devices required by an application, or at least the user would have direct control.

TLDR: QEMU but much simpler and only WASM need be supported.

codedokode 98 days ago [-]
Yes, but it also would be good to have some dumbed down version of HTML/DOM/CSS, so that the text can be copied and accessibility works.
thro1 98 days ago [-]
Gecko currently has much deeper integration of the XSLT engine with the browser internals: The XSLT engine operates on the browser DOM implementation. WebKit and Chromium integrate with libxslt in a way that's inherently bad for performance ( https://github.com/whatwg/html/issues/11578#issuecomment-321... )

Firefox's XSLT alone is faster, better, and cheaper than Google's (and than JS); in the same way, the old Firefox extensions were so powerful that Google couldn't compete with Firefox (or block adblockers).

JS is very much needed for ads, tracking and other strings attached - XSLT is not for that - but it would make JS mostly obsolete in many cases.. [..]

Google pays Mozilla to cripple Firefox. It's money from ads, meant to keep the web from being free. Right now, how much money and CPU power a JS engine costs is irrelevant - except to the end user [who pays for a big company's cost savings, or has to redo NOW, in a less efficient way, something that still works well regardless of the decades that have passed]!

https://news.ycombinator.com/item?id=44994459 - which answers the lame questions of a developer who has no clue what this is all about.

(Just.. live and let others live too ? Thx.)

Moreover: content first - the browser is a secondary thing relative to existing content (Chrome came after) - not the (double) opposite (primary, and there for ads and tracking instead).

Isn't Google, acting as a public servant here, supposed to fix its own bugs as part of the job, rather than being in a position to decide to kill someone else's existing content or solution for not displaying ads easily enough?

thro1 98 days ago [-]
Moreover: there is no JS solution that has been as stable for as long as that standard, the "25 year old version of XSLT".

"Can be made with JS" doesn't mean it would be one bit better than a long-proven and still-used solution, rather than one of many crippled, ever-changing, exclusionary imitations of it - for example this one: https://news.ycombinator.com/item?id=45183624 (no caching, not instant, not transparent or orthogonal, etc.).

With XSLT removed, Chrome cannot claim to be a standard internet browser either.

There is nothing wrong with XSLT - it's just Google not wanting to fix a few decades-old bugs - yet others have to follow, and nothing changes.

Actually, I couldn't care less about Chrome - as long as others neither follow nor allow Google to reach a position where it can claim to de-standardize working and still-used solutions.

harrisi 97 days ago [-]
For what it's worth, there does not exist a "standard internet browser," assuming that means an application that adheres to all relevant web standards. No piece of software exists (at least not publicly) that even adheres to the entirety of any single relevant web standard (e.g. HTML, CSS, ECMAScript, etc.), as far as I know.

Maybe for a few small things like JSON, I suppose, but not for any of the major standards. And not just as in they implement a superset of the standards - every browser implements a distinct set of each standard that is neither a subset nor a superset.

I'm still not a fan of Chrome nor the effect it has on the web.

thro1 97 days ago [-]
The standard so far is to respect existing standards still in use, and people's effort and work already done - not to outsource bug-fixing costs by forcing any of that to be redone or lost.
thro1 97 days ago [-]
Here in Europe, I don't see how taxpayers' money or users' time could be forcibly used to cover the savings (on bug fixing) and profit of some faraway corporation, downgrading a lot of things into more costly, less maintainable, non-standard solutions.

But I could see less of that money being much better used to support open, independent, not-for-profit, standards-conforming browsers instead - ones that don't follow whatever a big corporation says and wants.

29athrowaway 98 days ago [-]
XSLT is amazing.
Kimitri 98 days ago [-]
It really is. It's extremely handy albeit a bit niche these days.
29athrowaway 97 days ago [-]
XSLT is very useful to transform nested structures.

The input has to be XML, but you can get there via YAML, JSON, tree-sitter etc. And the output doesn't have to be XML.

xsltproc is usually easy to install.

icameron 98 days ago [-]
During my college undergrad CS series we had a practicum with a real engineer from HP or somewhere. Our project was to help the world find and download printer drivers over the web. The project was to make a Java web service send XML that conformed to a schema, which would be turned into a webpage by a transform aka XSLT. It seemed convoluted at the time. The teacher showed us “the how” but I guess “the why” was left as an exercise for the reader. I never understood the big picture- at the time it seemed rather complex. But now I realize this probably would have scaled quite well on turn of the century hardware.
swiftcoder 98 days ago [-]
Makes sense, but I'll have to update my rss feed (which currently uses XSLT to display in browsers that don't have native RSS capabilities - i.e. basically all modern browsers)
postepowanieadm 98 days ago [-]
In Europe some countries still use XML as the official data format and XSLT as the official code format.
rhdunn 98 days ago [-]
It's used a lot in the publishing industry, which stores the content in JATS and other similar XML markup. It's also used by the US government for bills, etc.

Typically, these use XSLT on the backend to transform the content to HTML to be sent to the web browser.

And there's RSS which was mentioned in the previous discussions. Podcasts will typically have HTML renderings of that data, but if you opened the RSS in a web browser you could use XSLT to provide a user-friendly view of the content.

XSLT can also be used to provide fallback rendering for unsupported content, such as converting MathML to HTML for browsers without support. -- Chrome as of 109 supports MathML Core, but doesn't support the content markup (used for more semantic markup of common constructs like N-ary sum, integrals, etc.), so would still need something like XSLT to convert that markup to the presentation markup supported by Chrome.

0x000xca0xfe 98 days ago [-]
Not just "still use", at least here in Germany our brand new e-invoicing system (mandatory since this year) is built on XML.

And XSL is used to validate invoice documents.

ExoticPearTree 98 days ago [-]
It's not only in Germany; it's an EU-mandated thing to have e-invoicing to prevent fraud and whatever other things the bureaucrats in Brussels felt would be solved with technology.

And yes, sadly the powers that be decided that this crap needs to be XML. Because why not, why use a modern standard...

Devasta 98 days ago [-]
What about XML isn't modern? It's a far more capable format than JSON or anything else you can devise.
ExoticPearTree 98 days ago [-]
How about we agree to disagree? XML has no place in the modern world. If it wouldn't be for the likes of Microsoft, IBM, Oracle and a few others that keep using it and pushing it, it would have sailed into the sunset a long time ago.
lolive 98 days ago [-]
JSON and XML are basically syntactic variants of the same data storage strategy: a tree of attributes and children. [1]

Where XSLT shines, and JavaScript currently has no equivalent afaik, is in transforming a tree into another one, rule-based.

The lack of support of XSLT 2.0 in browsers is a major issue, as it includes many solutions to problems absolutely not covered by XSLT 1.0.

[1]: xml2dict/dict2xml is an implementation of exactly this duality.
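
The rule-based part is easy to illustrate: the canonical identity transform copies the tree through unchanged, and individual templates override it for just the nodes you want to rewrite (the em-to-strong rule here is an arbitrary example, not anything special):

```
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <!-- default rule: copy every node and attribute through unchanged -->
  <xsl:template match="@*|node()">
    <xsl:copy>
      <xsl:apply-templates select="@*|node()"/>
    </xsl:copy>
  </xsl:template>

  <!-- override for one kind of node: rename <em> to <strong>, keep its content -->
  <xsl:template match="em">
    <strong><xsl:apply-templates select="@*|node()"/></strong>
  </xsl:template>
</xsl:stylesheet>
```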

sam_lowry_ 97 days ago [-]
XML allows multiple trees in one file via namespaces; this is a big difference.
ExoticPearTree 98 days ago [-]
> The lack of support of XSLT 2.0 in browsers is a major issue,

... for who?

lolive 98 days ago [-]
People stuck with XSLT 1.0 [or worse, JavaScript] as their tree transformation engine.
indolering 97 days ago [-]
Yeah, just because browser vendors put XML tech in maintenance mode doesn't mean that it wouldn't be nice to have.

We might see real world usage of these technologies had browser vendors not frozen them out.

sam_lowry_ 97 days ago [-]
XSLT 2.0 is a failed language: only Michael Kay could implement it, in the paid version of Saxon, and that's because he was the editor of the spec, so he did as he pleased.
elric 98 days ago [-]
I'm an XSLT fanboy. I've used it for all kinds of things, from generating docs to generating entire UIs from an XML declaration. But never in all my years have I used it in a browser. I didn't even know that was an option.
rhdunn 98 days ago [-]
Part of the issue is that XSLT in the browser is stuck at version 1.0 so lacks a lot of the improvements added in 2.0 and later that make working with it a lot nicer.
sam_lowry_ 97 days ago [-]
XSLT 2.0 is a failed spec, there is only one implementation by Michael Kay in the paid version of Saxon.

And he was also the spec editor, his incentive was to get lucrative contracts from BigTech, not make the world a better place.

rhdunn 97 days ago [-]
XSLT 2.0 is not a failed spec. From [1] RaptorXML (XSLT 3.0) and xjslt (XSLT 2.0) are listed as implementing that spec. MarkLogic also provides XSLT 2.0 support.

Saxon has a free HE version [2] that has the source code available and implements XSLT 2.0 REC, 3.0 REC, and 4.0 ED at the baseline conformance. The paid version implements optional features and vendor-specific extensions [3].

Even though Michael Kay is the editor of the spec, several others are involved in the standardization of XSLT, XPath, and XQuery, including members from BaseX and eXist-db which provide XQuery implementations. And as XPath is a subset of XSLT and XQuery there's a lot of overlap there, and features come from many people, not just Michael Kay.

[1] https://en.wikipedia.org/wiki/XSLT#Processor_implementations

[2] https://www.saxonica.com/download/java.xml

[3] https://www.saxonica.com/products/products.xml

thro1 98 days ago [-]
Wasn't the social contract, in exchange for getting that market share, that Chrome could be used to browse all of the web that already exists, just as the other browsers can - meaning not discriminating against non-profit, government, or older sites, or those that work well without JS for ad tracking - and not killing parts of that web when convenient?
righthand 98 days ago [-]
They're keeping XPath but removing XSLT. What a joke. All these brilliant Google engineers and all this brilliant AI tooling and no one can fix XSLT.

Google pays people to destroy the open web, not improve it. These Google engineers are pathetic and should be ashamed of their inept laziness.

Animats 98 days ago [-]
It would be kind of nice if HTML had something where you can make a remote fetch request for JSON or XML data and get it formatted in some CSS-defined way, without Javascript.
apimade 98 days ago [-]
Why not just expose an HTML representation of the data? Why must it remain JSON, XML, CSV, Parquet, fixed length or tab delimited files, ProtoBuf, etc?

API’s should provide content in the format asked of them. CSS should be used to style that content.

This is largely solved in RFC-6838 which is about “how media types, representation and the interoperability problem is solved”. https://datatracker.ietf.org/doc/rfc6838/

Already supported by .NET Web APIs, Django, Spring, Node, Laravel, RoR, etc.

Less mature ecosystems like Golang have solutions, they’re just very much patch-work/RYO.

Or even use OpenResty or njs in Nginx, which puts the transformation in the web service layer and not the web application layer. So your data might be a JSON blob, and it'll be converted to HTML in real time. Something similar can be achieved elsewhere, like Apache using mod_lua, etc.

I think bastardising one format (HTML), to support another format (JSON), is probably not the right move. We’ve already done that with stuff like media queries which have been abused for fingerprinting, or “has” CSS selectors for shitty layout hacks by devs who refuse to fix the underlying structure.

_heimdall 98 days ago [-]
Rather than adding an HTML endpoint in addition to the XML or JSON, expose the data and link it to stylesheets that dynamically render the HTML client side.

That's the whole point of XSLT, ship the data and tell the browser how to transform it to HTML.

1718627440 98 days ago [-]
Because that way you can separate the data from the layout. You can e.g. return a list of strings and then the strings become the summaries of a set of details elements.
apimade 98 days ago [-]
The description you give inherently changes the structure of the data, and JavaScript would be the best way to post-process it. CSS is about styling the structure of HTML, not structural changes to it.

Unless you have a good example, I think you’re coming at this from an “everything’s a nail if the only tool I have is a hammer”.

1718627440 97 days ago [-]
CSS is for styling the semantic structure of HTML; XSLT is a language for converting normalized data into semantic structure. That's what I gave an explanation about - I wasn't talking about CSS.
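
A sketch of the details/summary example from a couple of comments up (the element names are invented): given a plain list like <notes><note>…</note>…</notes>, the stylesheet decides that each string becomes the summary of a <details> element.

```
<?xml version="1.0"?>
<xsl:stylesheet version="1.0"
                xmlns:xsl="http://www.w3.org/1999/XSL/Transform">
  <xsl:template match="/notes">
    <section>
      <!-- each string in the source list becomes a collapsible summary -->
      <xsl:for-each select="note">
        <details>
          <summary><xsl:value-of select="."/></summary>
        </details>
      </xsl:for-each>
    </section>
  </xsl:template>
</xsl:stylesheet>
```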
bawolff 98 days ago [-]
They are only getting rid of xslt. You can still use <?xml-stylesheet with CSS
rhdunn 98 days ago [-]
CSS based views of XML are somewhat useful, but are limited in what they can do. Especially things like setting the HTML title, generating a table of contents, or transforming data like dates and times.
bawolff 98 days ago [-]
Sure, but the parent literally was asking for xml styled with css.
cess11 98 days ago [-]
In practice this means XSLT execution will be moved from clients to servers and I'm not so sure this is positive with regards to security.
thro1 98 days ago [-]
How about that:

Google unilaterally tries to kill the part of the web that doesn't let them track or profit from ads so easily?

.. and with all that money they get (and brains), they are still too lazy to fix a few old bugs (stuck at an old version).

thro1 96 days ago [-]
(?) - then more about the tactic: https://news.ycombinator.com/item?id=44994459 (web looks like nails for that tool we have)

now how about that:

Chrome voluntarily decides to disjoint itself from the parts of the web where it can't take profits - saying they are not in fashion..

- and then, actually, no one would want to have to follow anything like that ever again? (the ocean is _big_ and.. blue)

wosined 98 days ago [-]
Just leave it alone bro.
bugbuddy 98 days ago [-]
Good riddance. The web needs to shed all the old baggage like this to move forward. Looking forward to MCP becoming part of the browser.
imiric 98 days ago [-]
Wow, I couldn't disagree more.

XSLT is no more "baggage" than HTML itself. Removing it in no way "moves the web forward". And integrating technologies part of the current hype cycle, which very well may disappear in a year, is a terrible idea.
