> Most developers weren’t deploying simple stateless functions. They were building full-stack apps: apps that talk to a database
Honestly that seemed really obvious from the start - it's hard to think of many use cases where this isn't the case. Glad they realised anyway.
diggan 11 hours ago [-]
> There’s been some criticism lately about Deno - about Deploy, KV, Fresh, and our momentum in general.
It seems like they never replied to the criticism of their momentum (something I haven't seen myself; what would the argument even be?). Was that intentional or just missed?
> Some of that criticism is valid.
Would have been great to also outline what criticism is/was valid, and how they're aiming to solve those things. Sure, maybe a bit "shoot yourself in the foot", but personally I really prefer companies that are upfront about their drawbacks; it makes it more likely I'll choose them. Migadu is a great example of this: they have a pro/con page where they are upfront about the drawbacks of using Migadu (https://migadu.com/procon/). Just the existence of that page is probably ~20% of why I chose Migadu in the first place.
skybrian 11 hours ago [-]
Here’s how they addressed momentum:
> Since the release of Deno 2 last October - barely over six months ago! - Deno adoption has more than doubled according to our monthly active user metrics.
The obvious question is: doubled, but compared to what? And what are they measuring? They’re not disclosing any real metrics on adoption.
I think what happened is that people were giving them the benefit of the doubt because they were new and you could imagine huge growth. The disappointment is by comparison to vague hopes and dreams.
sholladay 10 hours ago [-]
I was excited about Deno precisely because it was a greenfield approach without backwards compatibility. Early on, they focused on reducing complexity and it worked. There were definitely some new pain points compared to Node, but I found them pretty manageable.
At some point, rather than coming up with native solutions to those pain points, they retreated and started leaning on backwards compatibility as a workaround.
Today, Deno feels more complex than Node does because it contains both approaches. And now there are lots of edge cases where a Node package ought to work, but doesn’t because of one unimplemented API or option or a bug that exists only in Deno. My favorite testing framework, AVA, still isn’t supported.
I used to just ignore the npm compatibility layer and target Deno itself, but that's become more cumbersome to do over time. For example, look at `deno run --help` and see how many command line options and env vars there are. It's exploded in the past few years. A lot of that is for npm interoperability. For me, it's just a lot of noise.
The one area of Node compatibility that I want the most is support for ESLint configs in the Deno linter. Yet they don’t seem to want to do that.
I really want Deno to succeed, if for no other reason than because it’s pushing Node to do things that they should’ve done years ago, such as adding a permission system. I just don’t think the current vision for Deno is very coherent or consistent with its original purpose.
mark_and_sweep 7 hours ago [-]
> My favorite testing framework, AVA, still isn’t supported.
Have you checked recently? The docs (https://docs.deno.com/runtime/fundamentals/testing/) specifically mention AVA as being supported. Then again, I'd assume that most devs using Deno just use the built-in `deno test` instead of a third-party testing framework.
> The one area of Node compatibility that I want the most is support for ESLint configs in the Deno linter.
Again, have you checked recently? According to the docs this is supported: "Deno's built-in linter, `deno lint`, supports recommended set of rules from ESLint to provide comprehensive feedback on your code. (...) You can specify custom rules, plugins, and settings to tailor the linting process to your needs." (https://docs.deno.com/runtime/fundamentals/linting_and_forma...)
I've been using Deno for 6 years now. And I'm actually quite happy that most Deno projects don't have a custom testing and linting setup.
FunnyLookinHat 3 minutes ago [-]
> And I'm actually quite happy that most Deno projects don't have a custom testing and linting setup.
I feel similarly. The standard configurations (e.g. tsconfig, linting, formatting) and batteries-included tooling (test, lint, fmt, etc.) are what make Deno so great for developers.
I've started using Deno in my spare time for various projects - and it just _feels_ more productive. I go from idea to testing TypeScript in minutes - which never happened in Node land.
yahoozoo 11 hours ago [-]
The problem with Deno is that it has lost the plot. When it was first announced years ago, it was simply a safer, faster JS/TS runtime “written in Rust” which I assume it still is but now you go to the website and you click a “Products” drop down containing a bunch of other shit.
It’s like they looked at what Vercel did with introducing a deployment platform after their initial NextJS work and wanted to follow suit.
bredren 10 hours ago [-]
Was it this or did the JavaScript / node communities get their acts together?
I had thought a lot of what Deno was setting out to do was cool beans for a time, but parity came faster from js/node than expected.
mattlondon 12 hours ago [-]
I was super-excited about Deno right up until they threw away their earlier commitments and added backwards compatibility for node and all the shite that comes with it.
The whole selling point for me was that deno was node without the bullshit and baggage, but they dropped that and basically just turned it into node with built in typescript support and a few other minor things like the permissions.
Similar story with bun.sh - node backwards compatibility (although not using V8).
Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?
skybrian 11 hours ago [-]
Why do you treat adding a feature (npm compatibility) like you're losing something? You don't have to use any Node APIs in your app - Deno's APIs are pretty comprehensive. You can also stick with the libraries available on jsr.io if you're satisfied with what you can find there.
If you want the developer experience of using something that’s not Node, you can still get it from Deno.
But it turns out that few people care that much about purity, so it’s fortunate that they’re not relying on that.
lolinder 10 hours ago [-]
An equivalent argument to yours could be made to defend the introduction of async/await to a language that has previously not had it (edit: like Rust): if you don't like async/await, just don't use it! What does it hurt you to have another feature added?
The answer is obvious in the programming language case: for those who do not want async, the addition of async/await begins to poison the ecosystem. Now they have a growing list of libraries that they cannot use if they want to avoid async, so the effort involved in picking a library goes up and the odds get increasingly high that they're locked out of some of the key tools in the ecosystem because new libraries without async become harder and harder to find.
For those who really hate colored functions, the addition of async is the removal of a feature: colorless functions are replaced with colored functions.
The same can be said of NPM compatibility. Sure, I can try to avoid it and stick to Deno imports and inspect each library that I use for NPM dependencies. But it gets harder and harder as time goes on, because a key feature of Deno has been removed: it's no longer an ecosystem reset.
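The async/await analogy above can be made concrete. A minimal TypeScript sketch of the coloring constraint (function names here are just illustrative):

```typescript
// An async function is "red": its result is only available as a Promise.
async function fetchCount(): Promise<number> {
  return 42; // stands in for real async work
}

// A sync ("blue") caller cannot unwrap the value directly...
function syncCaller(): number {
  const p = fetchCount();
  // return p;        // type error: Promise<number> is not assignable to number
  // return await p;  // syntax error: await is only allowed in async functions
  return -1; // no way to get the value synchronously
}

// ...so the caller must itself become async, and the color propagates upward.
async function asyncCaller(): Promise<number> {
  return await fetchCount();
}
```

This is why "just don't use it" breaks down: once a dependency deep in the call chain turns async, every caller above it has to change color too.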
skybrian 10 hours ago [-]
Node compatibility isn’t a language feature, though, and it doesn’t result in “colored” functions. If a Deno library uses a Node API or an npm library somewhere, it can be entirely encapsulated, so you might not even notice until you see it in a stack trace. That doesn’t seem very intrusive?
So it reminds me more of trying to avoid CGO in Go or avoid “unsafe” in Rust.
It would be worse if Node-specific types started appearing as function parameters or return types, but that seems fairly rare even in npm libraries, so it seems easy to avoid.
lolinder 10 hours ago [-]
> If a Deno library uses a Node API or an npm library somewhere, it can be entirely encapsulated, so you might not even notice until you see it in a stack trace. That doesn’t seem very intrusive?
Node API yes, NPM library no. If you add a dependency on a library that uses NPM you now depend on an entire web of transitive NPM dependencies, with all of the problems that entails. People don't dislike NPM because it's aesthetically displeasing—you can't just abstract away the problems with NPM. The ecosystem causes real problems with real software, and Deno initially recognized those real problems and set out to reset the ecosystem.
The only way in which NPM-compat is different than colored functions is that there's no static compiler feature telling you when you've added a dependency on a bunch of NPM libraries.
skybrian 10 hours ago [-]
I think that’s best addressed by avoiding dependencies and looking for libraries with few indirect dependencies. There are lots of npms that advertise few or no dependencies as a feature.
Though, it is nicer if it’s on jsr.io because you’ll see Typescript source code in the debugger.
There’s nothing about starting over that prevents ending up with a whole new rat’s nest of dependencies, if you’re not careful.
wink 10 hours ago [-]
I'm not saying it's a realistic view, but I had hoped that, without anything pulled in from NPM, there would exist a couple more clean-room (or at least decoupled) implementations of things in TS, leaving everything JS behind.
ogoffart 6 hours ago [-]
Off-topic, but the idea of "colored functions" from the "What Color is Your Function" article doesn't apply to Rust's async/await. That article is about JavaScript before it had async/await, when it used callbacks.
In Rust, you can call async functions from normal ones by spawning them on the executor. The .await syntax isn't as painful as dealing with callbacks and closures in JavaScript. Plus, if you call an async function incorrectly, Rust's compiler will catch it and give you a clear error message, unlike JavaScript, where bad function calls could lead to unpredictable behavior. The premises of the article don't apply, so Rust's async/await doesn't introduce the same "colored function" issues.
I read the article when it hit HN months ago but I don't agree that function coloring doesn't apply. What you're describing is that Rust makes coloring less painful, not that functions aren't colored.
JavaScript itself has come a long way towards making coloring less painful. TypeScript+ESLint solves the weird unpredictable behavior issues with JS and async/await solves the syntax issue. Promises in general give well-defined semantics to calling an async function from a sync function. But all that only undoes some of the arguments about function coloring, not all of them. Fundamentally the same question applies: do you make async-ness part of the type system or do you instead build a system like green threads that doesn't put it in the type system?
I happen to think that coloring functions according to their async-ness is actually the right move (with the right ergonomic improvements), but plenty of people don't agree with me there even with all the ergonomic improvements Rust and TypeScript have made to the model.
incrudible 10 hours ago [-]
There is no such thing as a "colorless" alternative to colored functions[1] in Javascript, at least as far as browser-compatibility is concerned. Promises are a convention for what used to be all the colors in the rainbow (and then some imaginary ones). Async/await is syntactic sugar on top that makes it more readable. The inherent pitfalls of asynchronous code don't disappear if you remove that sugar.
If you're gonna argue that fragmentation is a problem in the node ecosystem (which I agree with), you can't convince me that a plethora of approaches to asynchronous code is preferable to async/await and promises.
1) The original essay that coined this term was looking at it from a language design perspective. The argument is a fair one if that design question is up for debate, but that isn't the case for Javascript.
lolinder 10 hours ago [-]
To be clear, I like async. I just don't think "you don't have to use it if you don't like it" is a good argument in favor of it because it's obviously not true.
stevage 10 hours ago [-]
Not the person you're replying to, but I don't get how your argument applies here. JS functions could already return promises. Some of them being declared as async doesn't change anything for the consumer does it?
(In general, I do agree that "you don't have to use it" is not a strong argument.)
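The point that the `async` keyword changes nothing for the consumer can be sketched in TypeScript (the function names are illustrative):

```typescript
// Two implementations with the same observable contract:
function viaPromise(): Promise<string> {
  return Promise.resolve("hello");
}

async function viaAsync(): Promise<string> {
  return "hello";
}

// A consumer cannot tell them apart: both are () => Promise<string>,
// so the async keyword is purely an implementation detail of the callee.
async function consumer(f: () => Promise<string>): Promise<string> {
  return await f();
}
```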
jerf 9 hours ago [-]
Using the function color concept was an example of a place where this problem can occur, not the actual problem.
The problem is that if you think statically, you can say "oh, just use the 'clean' subset". But the world is not static. If you think dynamically, you can see the full Node ecosystem as a fairly powerful attractor; why would I write a deno library that only works on deno when I can write a node library that works on both? Well, if I'm writing in the Node ecosystem, why not use the whole thing?
This is a general effect; it is very hard for people to set up long-term ecosystems that are "too close" to existing ecosystems. Generally the new thing will either get pulled in (as in this case) or ignored (as in the many cases of 'hey Typescript seems to have worked pretty well, let me do a Typescript-like gloss of this other language', which generally just get ignored). There are successes, like Typescript (JS is in general a special case because, being the only language in the browser for so long, it was both a language and a compile target; most other attempts to "Typescriptify" a language flounder on the fact that few if any other languages have that situation), Elixir (managed to avoid just being absorbed by Erlang, IMHO kind of a similar situation where the 'base language' for the ecosystem was not really the best), and the occasional Lisp variant that bubbles up (though like Clojure, usually with a story about where it can be run), but in general this is very hard to pull off, harder in some ways than simply starting a brand new language ecosystem, which is itself no picnic.
lolinder 10 hours ago [-]
I'm not talking about JS. I had Rust in mind.
Also, promises already color functions just like callbacks do. Async/await just changes the syntax by which that coloring is expressed. The real problem people have with async is that they prefer green threads as a solution to concurrency, not that they don't like the syntax.
incrudible 10 hours ago [-]
...but you don't have to use it. You can keep using raw promises and you can trivially use any async/promise-based API with informal callbacks. I doubt many people want to do that, but they can.
Of course, in (browser-compatible) Javascript, some things can not be done synchronously, but that's not up for debate.
afavour 10 hours ago [-]
> Why do you treat adding a feature (npm compatibility) like you’re losing something?
Because you are losing something: a better ecosystem. Standardizing around… standards is a good thing. When I dive into the dependencies of any given Node app it’s a mess of transpiled code, sometimes minified even, there’s no guarantee what API it’ll be using and whether it’ll run outside Node (is it using the W3C streams API or the Node streams API?). But inertia is a powerful force. People will just use what’s there if they can. So the ecosystem never gets made.
> But it turns out that few people care that much about purity, so it’s fortunate that they’re not relying on that.
By that logic we never build anything newer or better. Python 3 is better than Python 2 and sets the language up for a better future. Transitioning between the two was absolutely torturous and if they just prioritised popularity it would never have happened.
pier25 5 hours ago [-]
> Why do you treat adding a feature (npm compatibility) like you’re losing something?
Because they are losing something.
All the time and money they are investing into node compat could have been used towards a Deno first-party ecosystem. It's not like they have hundreds of millions to spare. Deno is a small company with limited resources.
People kept complaining that they couldn't use Deno with NPM packages, so Deno ended up focusing on providing faster horses.
nicce 11 hours ago [-]
It is less about purity and more about this: why continue improving Deno APIs when the same things can now be handled with Node or a Node-powered library? Especially if you are driven by profits, that means all available hours will be pulled away from future Deno API development. It also removes the pressure on older libraries to adapt and ship versions that use the Deno API.
skybrian 10 hours ago [-]
What Deno APIs do you miss, compared to Node? It seems like they’re pretty built out?
I’m looking forward to whatever they’re going to do instead of KV, which I tried and is too limited, even for a KV store. (64k values are too small.) Something like Cloudflare’s Durable Objects might be nice.
You can’t “force” maintainers of old libraries to do anything. Deno never had that power. For people who are interested in supporting multiple alternate platforms, jsr.io seems pretty nice?
nicce 10 hours ago [-]
You should think of Deno as a standard library for JavaScript/TypeScript. Node was supposed to be that. How well does Node compare to Go or Python, for example? We would like to see the most-used small Node libraries merged at some level into Deno's standard library, so that the number of dependencies and deprecations goes down.
> You can’t “force” maintainers of old libraries to do anything. Deno never had that power. For people who are interested in supporting multiple alternate platforms, jsr.io seems pretty nice?
If enough people find Deno useful enough to skip some old libraries, maintainers are "forced", even though Deno is not forcing anyone. If they do not adapt, then someone will just create a new library with better practices. In either case there is pressure for better JS/TS evolution.
rtpg 11 hours ago [-]
by adding node compatibility it reduces pressure for libs to be written "the deno way". Libraries that could be cleaner!
At least that's the theory. To be honest I don't see Deno's value add. The runtime is like... I mean node works fine at this point? And the capabilities system is both too pedantic and too simplistic, so it's not actually useful.
I don't understand the value add of Bun much either. "Native" Typescript support but at the end of the day I need a bundler that does more than what these tools tend to offer.
Now if one of these basically had "esbuild but built in"....
int_19h 1 hours ago [-]
A bundler is for shipping code, but then you need hacks like tsx to use TS in tests, build scripts, etc., and configuring all that can be surprisingly gnarly and prone to breakage (e.g. tsx uses unstable Node APIs).
Although now that Node itself has basic TS support with type-stripping, this substantially improves matters. But that's a fairly recent thing; both Deno and Bun predate it by a long time.
Also Bun has a built-in bundler? I'm not sure how it compares with esbuild tho.
pier25 5 hours ago [-]
> Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?
Cloudflare Workers' workerd comes to mind, but it's fundamentally a different thing.
It's not meant to be a generalist backend runtime and it provides almost zero batteries.
user3939382 11 hours ago [-]
I like JSDoc. Doesn’t have anything to do with Node and you get many of the same benefits without all the compilation toolchain complexity.
hyperpape 11 hours ago [-]
> Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?
I'm sure you can find other projects that are going to fail, but why do you want to?
Node has lots of problems (I am basing this statement on the fact that it's a major tech project). None of them are sufficient to prevent it from being extremely widely used.
To fix those problems in a product that will be used, it is not sufficient to provide something sort of like Node but without those problems. You either have to:
1. Provide a tool that requires a major migration, but has some incredible upside. This can attract greenfield projects, and sometimes edge out the existing tool.
2. Provide a tool with minimal migration cost, and without the same problems. Maybe this tool can replace the existing one. Ideally there will be other carrots (performance, reliability, ease of use). Such a tool can get people to migrate if there are enough carrots, and the migration cost is low enough.
Deno was a classic example of messing this up. It's not #1 or #2, it has the worst of both worlds. The upside was that it did things "the right way", and the downside was that you couldn't run most code that worked on Node. This is the kind of approach that only attracts zealots and hobbyists.
diggan 11 hours ago [-]
> Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?
What's the point? If you're in love with static types, but have to do JavaScript because you're targeting the browser, I kind of understand why you'd go for TypeScript. But if you're on the backend, and don't need anything JS, why limit yourself to TypeScript, which is a "Compile-to-JS" language? You control the stack, make another choice.
lolinder 11 hours ago [-]
Because out of all the languages that stand a chance of being adopted at most workplaces, TypeScript is in my opinion the single most enjoyable to code in. It has an extremely expressive type system that gets out of your way and allows you to model almost anything you can come up with, it has great IDE integrations on both VS Code and JetBrains, and it has strong first-class support for functional programming patterns. On top of all the things the language itself has going for it, it allows me to write the same language on frontend and backend, which in my experience actually does make a huge difference in avoiding context switching and which also helps with avoiding code duplication.
People like to sneer at TypeScript, but let's be honest: people like to sneer at anything that's popular enough. The fact is that no language that I enjoy better than TypeScript (which is already not a very long list) stands any chance of adoption in an average workplace.
xnorswap 11 hours ago [-]
Well there's C# / .NET, which ticks off all of those boxes, even the functional syntax is well supported since it added pattern matching and people write a lot of fluent functional style anyway with LINQ.
It also interops nicely with F#, so you can write a pure functional library in F# and call it from a C# program in the "functional core, imperative shell" style.
It has an incredibly solid runtime, and a good type system that balances letting you do what you want without being overly fussy.
lolinder 11 hours ago [-]
It misses the frontend/backend symmetry and has too large a coupling to Microsoft and Windows in my head. I know that these days it's supposed to be cross platform, but every time I've tried to figure out how to install it I get lost in the morass of nearly-identical names for totally different platforms and forget which one I'm supposed to be installing on Linux these days.
That doesn't mean there's anything wrong with it and I've often thought to give it another shot, but it's not a viable option right now for me because it's been too hard to get started.
HideousKojima 8 hours ago [-]
>but every time I've tried to figure out how to install it I get lost in the morass of nearly-identical names for totally different platforms and forget which one I'm supposed to be installing on Linux these days.
I realize Microsoft is terrible at naming things, but for .NET/C# it's really not that hard these days. If you want to use the new, cross platform .NET on Linux then just install .NET 8 or 9.
New versions come out every year, with the version number increasing by one each year. Even numbered versions are LTS, odd numbered releases are only supported for about a year. This naming scheme for the cross-platform version of .NET has been used since .NET 5, almost 5 years ago, it's really not too complicated.
lolinder 3 hours ago [-]
Fair enough, I guess I haven't looked in the last few years. The last time that I did a search for .NET there were about five different names that were available and Mono still turned up as the runtime of choice for cross platform (even though I knew it wasn't any more).
int_19h 1 hours ago [-]
This is mostly the legacy stuff still ranking high up in search results.
These days you just add the Microsoft package repo for your distro and then do `apt install dotnet-sdk-9.0` or whatever.
It's also been spreading into the official distro repos. Nix, Arch, and Homebrew all have it.
HideousKojima 2 hours ago [-]
To clear things up for you a bit more (hopefully, or I'll just make it worse):
Any legacy .NET projects are made with .NET Framework 4.x (4.8.1 is the latest). So if it's 4.x, or called .NET Framework instead of just .NET, it's referring to the old one.
.NET Core is no longer used as a name, and hasn't been since .NET Core 3.1. They skipped version 4 completely (to avoid confusion with the old one, but I think they caused confusion with this decision instead) and dropped the "Core" for .NET 5. Some people will still call .NET 5+ ".NET Core" (including several of my coworkers), which I'm sure doesn't help matters.
Mono isn't 100% completely dead yet, but you'll have little if any reason to use it (directly). I think the Mono Common Language Runtime is still used behind the scenes by the newer .NET when publishing on platforms that don't allow JIT (like iOS). They've started adding AOT compilation options in the newest versions of .NET so I expect Mono will be dropped completely at some point. Unless you want to run C# on platforms like BSD or Solaris or other exotic options that Mono supports but the newer .NET doesn't.
coolcase 11 hours ago [-]
I'm torn between Go (nicer runtime) and Node (get to use TS for nicer types!)
lolinder 11 hours ago [-]
For me Go doesn't even come close. It's way too restrictive in what I can and can't do with it.
I might feel differently if I worked with a large number of people who I didn't trust, but on small to medium teams composed of very senior people using Go feels like coding with one hand tied behind my back.
coolcase 4 hours ago [-]
I'm talking about the runtime though, e.g. the concurrency story being the main thing. Speed too. Easier multi-platform use. Memory footprint.
mattlondon 10 hours ago [-]
Yep I control the stack, and I want it to be typescript.
At this stage, I don't think anyone needs to try and persuade anyone why JavaScript and typescript are the Lingua Franca of software engineering.
Performant, expressive, amazing tooling (not including node/npm), natively cross-platform.
An absolute joy to code with. Why would anyone want to use anything else for general purpose coding?
In my mind there are two approaches in the current ecosystem: C++ where absolute 100% maximal performance is the overriding primary objective, consequences be damned; for everything else, just use Typescript.
int_19h 1 hours ago [-]
It still doesn't have basics like tuples/records or pattern matching.
Something as simple as a map with two integers as a key, or case-insensitive string keys, requires jumping through hoops. Even Go and Python can do this.
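The hoop-jumping being described comes from JS `Map` comparing object keys by reference. A small sketch of the usual workaround, manually encoding composite or normalized keys into primitives:

```typescript
// JS Maps compare array/object keys by reference, so [1, 2] !== [1, 2]:
const naive = new Map<[number, number], string>();
naive.set([1, 2], "a");
console.log(naive.get([1, 2])); // undefined: a fresh array is a different key

// The usual workaround: encode the tuple into a primitive key by hand.
const encoded = new Map<string, string>();
const key = (x: number, y: number) => `${x},${y}`;
encoded.set(key(1, 2), "a");
console.log(encoded.get(key(1, 2))); // "a"

// Case-insensitive string keys need the same manual normalization:
const ci = new Map<string, number>();
ci.set("Hello".toLowerCase(), 1);
console.log(ci.get("hello".toLowerCase())); // 1
```

Languages with value-typed tuples or customizable key equality avoid this encoding step entirely.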
coffeebeqn 9 hours ago [-]
> amazing tooling
Oh I get it, you’re joking
diggan 10 hours ago [-]
> anyone why JavaScript and typescript are the Lingua Franca of software engineering.
I mean, it obviously isn't, although for web development I'd probably agree with you. But regardless, zealots who hold opinions like this, where there is "one language to rule them all", are why discussing with TS peeps is so annoying. In your world there is either C or TypeScript, but the rest of us tend to use different languages depending on what problem we're solving, as they all have different strengths and drawbacks. If you cannot see any drawbacks with TypeScript, it's probably not because there aren't any, but because you're currently blind to them.
> Why would anyone want to use anything else for general purpose coding?
Because you've tasted the world of having a REPL connected to your editor where you can edit running code live and execute forms. Just one example. There are so many languages available out there when you control your stack. I understand using JavaScript for frontend web stuff, because you have no other options, and I personally have nothing against JavaScript itself. But for the love of god, realize there is a world out there outside of your bubble, and some of those languages have benefits too.
Do you know what it means? A lingua franca is able to facilitate communication between many parties who do not have a common mother language. JavaScript most certainly does not fit that bill. You could argue C is the lingua franca from the side of the CPU since C runs everywhere, it is literally meant for that. A portable assembly.
mattlondon 6 hours ago [-]
C does not run everywhere - you need to compile it to a binary per-platform and per-architecture first, then your platform+arch specific binary only runs on that specific combination... And even then there might be dynamically linked libs to worry about.
You may as well call binary (i.e. 1s and 0s) the Lingua Franca in that case.
Let's not get started on C build chains (especially cross-compiling)... cmake vs cygwin vs msvc vs whatever else these days, with hacky-and-brittle ifdef conditionals everywhere just to make it work - chaos! JavaScript just runs on pretty much any modern computer you can sit down at or put in your pocket, and even on more exotic things that don't have it installed by default you are about 10 seconds away from installing the official package and you are off and running.
rowanG077 4 hours ago [-]
Obviously "Run everywhere" is after compilation. No language works without some kind of processing, and no machine code is not a language. Even if you'd call machine code a language it cannot be called a lingua franca by definition since it's designed to be architecture specific. I'm not sure why you are even start with linked libs, or even linking or libraries at all. That's far removed from the language itself, the C language standard does not prescribe how to link even. It's an implementation detail.
imicrowaveforks 6 hours ago [-]
[flagged]
Octoth0rpe 11 hours ago [-]
> why limit yourself to TypeScript which is a "Compile-to-JS" language? You control the stack, make another choice.
Because some of us _like_ typescript, or at a minimum, have invested a significant portion of our careers learning ts/js. We want an ROI, we just don't want node/npm.
diggan 11 hours ago [-]
> Because some of us _like_ typescript, or at a minimum, have invested a significant portion of our careers learning ts/js.
Right, makes sense. It also makes sense that most of those learnings are transferable; it's not like TypeScript is the only language with types, so your design/architecture skills can be used elsewhere too. Locking yourself into the ecosystem of one language, then asking other runtimes to adhere to your preference, sounds like a sure way of getting disappointed, instead of being pragmatic and flexible enough to choose the right tool for the problem.
rtpg 11 hours ago [-]
Typescript's type system is uniquely good at preventing bugs in "bog standard enterprise code". No other language comes close to it. Two words: untagged unions. And of course all the other utilities it provides.
It's extremely good! Shame about it being coupled to Javascript.
LoganDark 11 hours ago [-]
I haven't seen any true competition with TypeScript, at least for me. Go is unsound (Go slices...), Python is too high-level (even with mypyc), Rust is too low-level (in comparison to TS), and so on. It's not just "a language with types".
Also, when I was writing a frontend and backend both in TS, I could literally share the exact same type definitions between them. Then I could use a compiler plugin (`typescript-is`) to validate on the server that payloads match the appropriate type. It was amazing and worked quite well, and I can't really see that being nearly as easy and seamless with anything else.
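A rough sketch of the pattern described above, for readers who haven't seen it. All names here are hypothetical; `typescript-is` generated guards like this from the types at compile time, so a hand-written type guard stands in for the plugin:

```typescript
// Hypothetical shared module (e.g. shared/types.ts), compiled into both
// the browser bundle and the server -- possible only because both sides
// build from the same TypeScript source.
export interface CreateUserPayload {
  name: string;
  email: string;
}

// `typescript-is` derived guards like this from the type at compile time;
// a hand-written guard shows the same idea without the plugin.
export function isCreateUserPayload(x: unknown): x is CreateUserPayload {
  return (
    typeof x === "object" && x !== null &&
    typeof (x as { name?: unknown }).name === "string" &&
    typeof (x as { email?: unknown }).email === "string"
  );
}

// Server side: validate a parsed JSON body before trusting it.
export function handleBody(body: unknown): string {
  if (!isCreateUserPayload(body)) throw new Error("bad payload");
  return body.email; // `body` is narrowed to CreateUserPayload here
}
```

The point being argued: the client imports the same `CreateUserPayload`, so the payload type can never silently drift between frontend and backend.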
diggan 11 hours ago [-]
> I could literally share the exact same type definitions between them. Then I could use a compiler plugin (`typescript-is`) to validate on the server that payloads match the appropriate type. It was amazing and worked quite well, and I can't really see that being nearly as easy and seamless with anything else.
But isn't that benefit just because TypeScript does compile to JavaScript and is compatible with JavaScript? Remove that compatibility, and you wouldn't get that benefit anymore, right? And if you still can get that benefit, why wouldn't you be able to get that benefit with other languages too?
It's not like TypeScript gives you some inherent benefit that makes it easier to convert to JavaScript, besides the fact that it's literally a "Compile-to-JS" language.
LoganDark 9 hours ago [-]
> But isn't that benefit just because TypeScript does compile to JavaScript and is compatible with JavaScript? Remove that compatibility, and you wouldn't get that benefit anymore, right?
JavaScript does make it easier to target both the web browser and Node.js, sure. But TypeScript also has a fairly mature type system and ecosystem (flaws in `tsc` itself notwithstanding). Not to say that no novel approaches are worth exploring, though; I just haven't seen one that rivals my TS experience yet.
> And if you still can get that benefit, why wouldn't you be able to get that benefit with other languages too?
That depends. In many other programming languages (such as ones that compile to WASM) it's also possible to have common code shared between server and client, but it's usually pretty inconvenient to actually get the code running in both environments. It's also possible to have a common interface definition and generate types for server and client from that definition, but that's still more complicated.
Anyway I don't fault anyone for being disappointed that Deno fell into the Node.js compatibility trap. Pure TypeScript without that particular cruft is also something I was excited about. I also was excited to see what looked like some fair innovation (like their import mechanism and their sandboxing) but I don't know how that'll continue if Node.js compatibility ends up being too much of a time sink.
I don't have very strong opinions because I've never really used Deno and I probably won't even bother at this point, but I definitely would not agree that this is just a problem of needing to use another programming language instead.
skydhash 11 hours ago [-]
I don’t understand the need to share types between frontend and backend. It’s like a strong incentive to make the wrong decisions down the line instead of using the right data model for the domain.
LoganDark 10 hours ago [-]
I mean JSON payload types.
ChocolateGod 11 hours ago [-]
> But if you're on the backend, and don't need anything JS, why limit yourself to TypeScript
Why not use it? What high-level programming language would you suggest instead, with the same level of performance and ecosystem support?
liveoneggs 11 hours ago [-]
Java has (much) better performance and a bigger ecosystem. It has types and multiple alternative languages which target it (kotlin, scala, clojure, etc)
alabastervlog 9 hours ago [-]
Hell, last I checked PHP is still king of the server-side Web scripting languages, as far as real-world speed. Like, fast enough that you've gotta be pretty careful in Java or Go or whatever, or you'll end up slower than PHP.
The way you make a scripting language fast is by getting the hell out of it and into C or C++ as fast as possible, and PHP's library ecosystem embraces that harder than just about any other scripting language's - which is the reason, I think.
[EDIT] My point is mainly that Node's performance isn't really that impressive, in the field of "languages one might use to write server-side Web code". It beats absolute slugs like Python and Ruby handily, but past that... eh. And really, in its speed category you'd probably do just as well using either of those and paying a little more attention to things like network calls and database access patterns.
ChocolateGod 43 minutes ago [-]
Google has poured millions of $ and 100s of developers into the development of V8 to get every last inch of speed and optimisation. As far as comparing it to AOT/non-GC languages of course it will always be slower, but I would struggle to believe it's slower than PHP other than in specific situations.
Go. If by "same level of performance" you mean much better performance. The language even comes with an HTTP server built in, so you never have to deal with something like Node.
silverwind 1 hours ago [-]
Performance maybe, but you will suffer Go's rudimentary language features and its very limited type system.
koakuma-chan 11 hours ago [-]
Why would you use something other than a JS framework to build a web app back-end? You don't have to deal with OpenAPI or GraphQL, you can just use server actions.
williamdclt 10 hours ago [-]
"Server actions" seems to be a NextJS thing, not a JS thing? JS does not mean React and NextJS. The communication between frontend and backend is (almost) always HTTP, regardless of the languages and frameworks (server actions are HTTP). Having a wrapper around HTTP isn't really a compelling reason to choose a technology for the very large majority of people: probably the opposite, actually.
IDK what you mean by "deal with OpenAPI" - OpenAPI is a spec, not a technology like GraphQL.
In all honesty (and sorry for the directness), you don't really seem to understand these concepts and how relevant or not they are to this conversation
koakuma-chan 10 hours ago [-]
> server actions" seems to be a NextJS thing, not a JS thing?
It's a JS framework thing. Every mainstream JS framework has server actions or equivalent.
> Having a wrapper around HTTP isn't really a compelling reason to choose a technology for the very large majority of people: probably the opposite actually.
It is way more convenient to write a server action and be able to immediately use it in a client component than having to write an HTTP endpoint in a separate back-end project, and then regenerate your client via OpenAPI, or whatever else you use.
> IDK what you mean by "deal with OpenAPI"
I mean dealing with tooling to generate an HTTP client from OpenAPI schema.
> In all honesty (and sorry for the directness), you don't really seem to understand these concepts and how relevant or not they are to this conversation
Wrong
williamdclt 7 hours ago [-]
> Every mainstream JS framework has server actions or equivalent
No. "Server actions" are a React concept, it has little to do with backend technology (the backend still speaks HTTP). This concept is completely irrelevant to most big frameworks like Express, NestJS, Koa, Hapi. Next is barely a "backend" framework: it's rather a React server framework that has a few basic backend functionalities.
koakuma-chan 7 hours ago [-]
Okay, every full-stack JS framework.
dboreham 7 hours ago [-]
Raising my hand with the opinion that you're wrong. Making it confusing as to whether code is executing on server or client is imho an antipattern.
koakuma-chan 7 hours ago [-]
I’m wrong in what? Even if you’re confused whether server actions run on the server or on the client, it doesn’t take away benefits I listed before.
homebrewer 11 hours ago [-]
Better/scalable performance, actual runtime type checking without wrapping everything with third-party libraries and paying the associated overhead, talent pool in my area, better (or even existing) libraries for the task at hand, better observability instrumentation, personal preference... is this really an honest question?
koakuma-chan 11 hours ago [-]
> is this really an honest question?
Yes, there is nothing that works better than server actions. None of what you listed really makes sense to me. I have never had any runtime performance problems with TypeScript, and wasn't JavaScript the most popular programming language in the world (the talent pool argument)?
diggan 11 hours ago [-]
Because there are other options available that might be better? Personally I'd choose Clojure for anything I have a choice with. Cargo-culting a language like that does no one any favors.
koakuma-chan 11 hours ago [-]
> Personally I'd choose Clojure for anything I have a choice with.
And then you would have to solve the problem of how to communicate with the client.
diggan 11 hours ago [-]
I dunno, HTTP works pretty well for me as a transport layer to communicate with clients. Otherwise WebSockets. Not sure why you'd think that'd be a difficult thing?
koakuma-chan 11 hours ago [-]
> Not sure why you'd think that'd be a difficult thing?
You aren't suggesting to handwrite an HTTP API client, right? You would have to set up either OpenAPI which is a mess, or GraphQL which is also a mess. LMK if you have a better solution.
diggan 11 hours ago [-]
What exactly is the problem you're encountering when trying to interact with HTTP APIs? If you really want to use OpenAPI for whatever reason, there are plenty of ways of doing that even in Clojure/Java, so I'm sure it's possible in other languages as well. But it's not like using OpenAPI or GraphQL are the only two options, if you're of that opinion I'm afraid you've drank way too much of the koolaid.
koakuma-chan 11 hours ago [-]
> What exactly is the problem you're encountering when trying to interact with HTTP APIs?
The problem is that, unlike when using server actions, when using HTTP APIs, there is nothing that automatically generates bindings.
> If you really want to use OpenAPI for whatever reason
No, I don't. But people use OpenAPI to avoid having to handwrite an HTTP client. This is especially relevant if you are developing a public API.
> But it's not like using OpenAPI or GraphQL are the only two options
What are other options?
diggan 10 hours ago [-]
> when using HTTP APIs, there is nothing that automatically generates bindings.
Is that really the biggest problem you face when programming? How many endpoints do you have? Even with projects with ~30 endpoints, it doesn't seem problematic to me, but maybe people regularly work on projects with 100s of endpoints, then it kind of makes sense. But I'm not sure that's typical enough.
> No, I don't. But people use OpenAPI to avoid having to handwrite an HTTP client. This is especially relevant if you are developing a public API.
People do a lot of stuff for a lot of nonsense reasons, doesn't mean that's the best way to approach things. The JS/TS ecosystem seems extra easy to fall into cargo culting too.
floydnoel 10 hours ago [-]
"handwriting" an HTTP client is too much work for a developer? you need to import a library that does it for you? wow. abstraction at any cost, eh?
koakuma-chan 10 hours ago [-]
Yes, you have to waste time handwriting it and making sure that you actually wrote a proper binding. Then you also probably need to deal with API versioning. All of this goes away with server actions.
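For context on what's being argued over, here is a minimal sketch of the shape of a server action (Next.js-style; `createTodo`, the todo shape, and the fixed id are all made up for illustration - a real action would hit a database):

```typescript
// Sketch of the server-action idea (React/Next.js style). The client
// imports and calls this like a local async function; the framework
// generates the HTTP endpoint and the typed client binding that OpenAPI
// codegen would otherwise provide.
export async function createTodo(title: string) {
  "use server"; // directive marking this function body as server-only
  // In a real app: something like `await db.todos.insert({ title })`.
  // Stubbed here so the sketch stays self-contained.
  return { id: 1, title, done: false };
}
```

The claimed benefit is exactly that there is no separate client to hand-write or regenerate: the call site and the signature are checked by the same compiler pass.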
liveoneggs 11 hours ago [-]
nextjs or bust? This is a wild take
koakuma-chan 11 hours ago [-]
Yep, Next.js has the best support for vibe coding.
diggan 9 hours ago [-]
> vibe coding
"Vibe coding" as a concept is a fun joke, not a workflow you employ for doing serious engineering. It was a tiny experiment that somehow people thought was a suggested way of developing software, which obviously it isn't. Read the code yourself, otherwise it'll be really hard to call yourself any sort of engineer.
koakuma-chan 8 hours ago [-]
> "Vibe coding" as a concept is a fun joke, not a workflow you employ for doing serious engineering.
Well I guess making Next.js apps isn't really "serious engineering"
> Read the code yourself, otherwise it'll be really hard to call yourself any sort of engineer.
I do read the code but I barely write any code by hand.
diggan 8 hours ago [-]
> Well I guess making Next.js apps isn't really "serious engineering"
Where did I say that?
> I do read the code but I barely write any code by hand.
Right, so you use the words "vibe coding" yet you don't actually understand the concept? A lot of things make sense now. The description "vibe coding" is explicitly about "programming" with an LLM without reading or writing any code at all, for any purpose. If you read the code, you're not really vibe coding as originally described by Karpathy.
koakuma-chan 8 hours ago [-]
> Where did I say that?
You replied to a comment that says "Yep, Next.js has the best support for vibe coding."
> Right, so you use the words "vibe coding" yet you don't actually understand the concept? A lot of things make sense now.
You can stop arguing that glancing at the code means one is no longer vibe coding, because in practice, by looking at the code or even the LLM's thoughts, you can catch things you don't want early.
coolcase 11 hours ago [-]
You don't need to deal with those anyway. RPC it: fetch on one end, route on the other.
koakuma-chan 11 hours ago [-]
> RPC it: fetch on one end, route on the other.
What do you mean by this?
coolcase 4 hours ago [-]
JS fetch function in the browser.
Add a route on the back end.
RPC means just call it! Don't worry about REST, GQL, etc.
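The scheme described above can be hand-rolled in a few lines. A sketch, with the `/rpc/` path and the `add` handler invented purely for illustration:

```typescript
// One generic client helper that POSTs JSON to /rpc/<name>, plus a
// plain map of functions on the server. No OpenAPI, no GraphQL.
type FetchLike = (url: string, init: RequestInit) => Promise<Response>;

export async function rpc<T>(
  name: string,
  args: unknown,
  fetchImpl: FetchLike, // in the browser, pass the global fetch
): Promise<T> {
  const res = await fetchImpl(`/rpc/${name}`, {
    method: "POST",
    headers: { "content-type": "application/json" },
    body: JSON.stringify(args),
  });
  if (!res.ok) throw new Error(`rpc ${name} failed: ${res.status}`);
  return res.json() as Promise<T>;
}

// Server side, any framework: route POST /rpc/:name to one of these.
export const handlers: Record<string, (args: any) => unknown> = {
  add: ({ a, b }: { a: number; b: number }) => a + b,
};
```

The trade-off versus server actions is that the name-to-function mapping is a convention, not something the compiler checks across the wire.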
msie 11 hours ago [-]
Wow…
jgalt212 11 hours ago [-]
maybe because async Python is painful, and async TypeScript/JS is the default.
diggan 11 hours ago [-]
Use another language where async isn't painful then? Since it's the backend, you have 100% control over what stack to choose, unless some other requirement gets in the way. And no, async isn't "the default" in at least JavaScript, not sure where you'd get that from.
int_19h 33 minutes ago [-]
It is the default in a sense that most libraries and increasingly many stdlib functions and even core language features require it. E.g. dynamic import() is async-only.
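The `import()` example is easy to see in two lines (`node:path` is chosen here only as an arbitrary, always-available built-in):

```typescript
// import() has no synchronous form: it always returns a Promise, even
// for a module that is already in the loader cache.
const pending = import("node:path");
console.log(pending instanceof Promise); // true

// So any caller has to be async-aware, one way or another:
pending.then((path) => console.log(typeof path.join)); // "function"

export {}; // keep this file in module scope
```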
jgalt212 11 hours ago [-]
> And no, async isn't "the default" in at least JavaScript, not sure where you'd get that from.
My bad. I was conflating common idioms and actuality.
candiddevmike 12 hours ago [-]
Post is by the CEO and doesn't really address the criticisms around Deno, just seems to justify their own internal decisions (or his?). Seems like Deno products work really well for Deno though!
nchmy 11 hours ago [-]
What criticisms of deno do you think went unaddressed?
candiddevmike 11 hours ago [-]
They don't really address stability, and even go so far as to say they aren't chasing parity. The blog post gives off major "you're holding it wrong" vibes.
I guess we’ll see soon enough what Deploy will become since that's "imminent".
KV is dead if they've no desire to develop it out of beta and are working on something new. No reason to ever use it for a new project now.
Fresh is being refactored with an alpha in "late Q3 2025 (likely September)". It was a fairly basic framework to begin with. The no compilation/build step was the only interesting idea, and that's going away.
The runtime is actively developed but I find this statement amusing:
> We’re not chasing feature parity with other runtimes.
The release notes on Node/NPM compatibility would suggest otherwise.
pier25 8 hours ago [-]
> KV is dead
Yeah, this is a terrible move. Companies aren't relying on KV precisely because it's in beta, not because it was a bad idea. I use Cloudflare Workers KV a lot and I'm not interested in Durable Objects. I was really interested in Deno KV until now.
Plus the optics of announcing a product and abandoning it are not good. Ryan is a great technical guy but these decisions don't look good from a strategic perspective.
fallinditch 10 hours ago [-]
> KV is dead if they've no desire to develop it out of beta and are working on something new. No reason to ever use it for a new project now.
I think you're right, I was just about to use it for something but now I'm considering other options...
wyuenho 12 hours ago [-]
Most developers weren’t deploying simple stateless functions. They were building full-stack apps: apps that talk to a database, that almost always is located in a single region.
I wonder if this is true in general for most people on serverless these days. If so, I wonder whether this was the original intention of the movement, or whether these people just don't want to deal with Docker/k8s.
mosura 11 hours ago [-]
My gut feeling is that people want a modernized heroku. Managed RDBMS and an auto scaling set of servers that use it.
That covers a massive proportion of the companies that don’t need or want massive scale.
o_m 9 hours ago [-]
Most people and even most companies don't need horizontal scaling. Hardware has become much faster and cheaper since Heroku's heyday. Scaling vertically with 80+ cores on a single CPU and 256GB+ of RAM only costs a few hundred dollars a month these days. With caching, a server like that can handle a million requests a second, or tens of thousands a second for dynamic data from the database on the same server.
leptons 1 hours ago [-]
If Deno were supported on AWS Lambda I might think about using it. FaaS on a major infrastructure provider is what I need. I'm not putting a project that means anything to me on Deno's servers; they aren't really leading the industry and might be gone in the blink of an eye, the way the tech world is going lately.
quantadev 18 minutes ago [-]
When there's two competing technologies, and they're both solving the same problem, I always go with whatever's more popular, unless there's a compelling reason not to. This is mainly because of community support and having more companies/developers motivated to rapidly solve challenges that come up, and/or fix bugs.
But especially in the AI/LLM era it's even _more_ important to use what's most popular, because the LLM will know more about it and can pull information from vastly more resources, including its most important "source" of all: the model weights.
nicce 11 hours ago [-]
> One of the biggest questions we’ve been hearing is about Deno Deploy — specifically, the reduction in available regions. While we understand the optics of this scaling back, it isn’t for the reasons often feared or accused.
> Rather, reality is: most applications don’t need to run everywhere. They need to be fast, close to their data, easy to debug, and compliant with local regulations. We are optimizing for that.
Why does this sound very odd? I chose not to use Deno Deploy because the region was not close enough, and it would have just made everything slower than other options. (There are many ways to host data closer to end users, and some regulations also operate at the country level.)
k__ 10 hours ago [-]
It might just be my perception, but I had the impression Deno got its ass whooped by Bun and Node.js.
While some people whine about the Node.js compat, I'd assume it's the main point that kept Deno on life-support in the long run.
Bun did it right from the start and it seems people love it. Being quite a bit faster than Node.js (even with the compat APIs) and Deno obviously helps too. If they keep that going, they'd approach Go levels of performance.
ctz 2 hours ago [-]
It is instructive to compare Bun and Deno's issue tracker. Like, the five most recent issues for Bun at the time of writing are all crashes. Some of these are controlled panics or assertion failures, but others are like "we are now executing from address -1" or "we are trying to read from address 0x00000069." Recently written software simply should not have these classes of problem.
Sytten 12 minutes ago [-]
When you read the Bun codebase it is scary how they ignore edge cases. The codebase of Deno is actually legit; I used it as a reference for LLRT modules.
This is really a case where Rust shines compared to Zig.
nicce 20 minutes ago [-]
In Chrome, many of those bugs would be $5000 bounty and you don’t even need the exploit code.
eknkc 12 hours ago [-]
Whenever I read a blog post assuring me that something is not how it looks, it turns out to be exactly how it looks at the end.
BTW, I don't use deno and haven't been following any news whatsoever so this is simply a shitty statement from an outsider. It is interesting that I tested deno a couple of times but kept using node until bun came around and I basically switched to bun. I can't say why exactly.
CuriouslyC 11 hours ago [-]
Bun has high Node compatibility with lightning-fast testing and a good/fast built-in package manager. I'd use Bun for local dev even if I was deploying with Node.
bredren 10 hours ago [-]
Why not esbuild? It was ~fast enough first and free of capital entanglements.
CuriouslyC 8 hours ago [-]
Bun is still faster, and Bun's testing is insanely fast -- I had a test suite that would take 30 seconds with Jest that finished in 800ms with Bun. Plus Bun's networking performance is insane compared to Node, and you can have a lot more concurrent clients on a light VPS (think 1GB).
fluidcruft 11 hours ago [-]
I agree also as an outsider. These sorts of "meta" discussions always smell of spin aimed at investors and usually are not good news for customers. Customers generally care about things like product and long term reliability and stability. These meta things always have the tone of Monty Python's "Bring out your dead!" segment.
teucris 8 hours ago [-]
I’m seeing some debate on Deno’s decision to ensure Node compatibility, apparently as it gives up a core value prop of early Deno to try and hit the reset button.
Can someone help me understand what was lost here? Is there no longer a way to use Deno without using the Node ecosystem?
ffsm8 8 hours ago [-]
Mark Twain was born in 1835, made the quote in '97, and died in '10.
So the quote was made when he was around 60, and he perished roughly 1/4 of that time later.
Deno was released in 2018, and it has now quoted the statement, 7 yrs later.
I guess the next 2 years are gonna be interesting?
thunderbong 8 hours ago [-]
Is this a pattern seen elsewhere?
ffsm8 8 hours ago [-]
Of projects and companies saying "we're not going to close down" and then - shortly after - shutting down?
It's not rare, so kinda.
The fact that a project addresses these rumors at all means that they've noticed a trend and are worried about it.
Just like Meta isn't publishing articles about how React isn't going anywhere - they know it won't, despite the countless articles claiming otherwise.
What this kind of statement actually means is basically "we're not secure, but we can't admit to it, as that would cement it."
Which, funnily enough, applied to Twain too, as he did indeed suffer from the illness people were gossiping about. It was just a lot less imminently dangerous than the rumors claimed.
ecares 11 hours ago [-]
Deno is just a marketing company dressed as a software startup
rc00 10 hours ago [-]
And Rust is the perfect lingua franca to accomplish this. The demise of both of these trends cannot come soon enough.
jppope 11 hours ago [-]
Personally I love what Ryan and the Deno team have done. If there is anything to really say, it's that incumbency in languages and software ecosystems is really strong - and it's stronger the further down the stack you get.
I will say that I was disappointed when they added NPM into the project, I understand why they did it but I would have preferred they not do it.
With that said all of my blogs and client sites are all being happily built in lume with deno right now (hosted on cloudflare) and they have been great for years now. I am still very happy for having made that change.
rafram 11 hours ago [-]
I’m not sure why anyone would want their JS runtime to be their package manager, code formatter, compiler, bundler, web framework, KV store, and cloud provider(!) all at the same time. There’s just no way that they can ship the best product in all of those categories.
rglover 4 hours ago [-]
It's less likely (and really, depends on the quality of the team and their attention to detail—it's not a "law"), but certainly not a "no way." I'm far more concerned when the official answer to a feature missing is "just use a third-party!" That spoils a lot of tools for me as I immediately read that as "you're on your own, buddy, have fun with your Frankenstein app!"
Over time, unless the team building the thing is entirely tone deaf, I'd expect each individual tool to improve as demanded/necessary. Not only that, but knowing that those tools are being thought about as parts of a whole is deeply comforting (I trust them more than standalone tools as interdependency headaches have likely been solved).
One of the biggest headaches in JS is the tendency for tool builders to just eschew responsibility in favor of sending their community on a goose chase. I commend the Deno folks for taking this approach. We should have more, not less of this attitude in the ecosystem.
popcorncowboy 11 hours ago [-]
I get the "earnest, 'authentic', 'responsible' engagement by the CEO", but this post and title is lifted straight out of media-playbook-fails-101. Post a title like this at your peril. The content doesn't "own" the missteps, it writes the epitaph of Deno. All this article does is validate the "reports" of "demise" and unavoidably presents as "doth protest too much". If you insist on engaging with a negative narrative there are more constructive ways to frame it. Don't talk about "committing" to anything, just DO. Ryan, if you do nothing else, think about changing the title. But this entire post should ideally get rewritten. There are some really positive things you're doing. But they're covered in stink.
eqvinox 10 hours ago [-]
This is really OT, but if I don't ask now I might never get an answer…
Someone mentioned to me "Deno-style event loops" / "Deno-style main loops". I asked what that is but they were gone. I've tried to look it up, to no avail.
I do quite a bit of work on low level event loops. I'm continually interested in how different projects are doing it and what ideas and tricks they come up with. It bugs me to no end that I can't find anything on what this "Deno style loop" is supposed to be.
Anyone know what's meant / have a pointer or two?
skybrian 9 hours ago [-]
I’m guessing the use of Promises rather than Node-style network programming, which gets pretty hairy.
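If that guess is right, the contrast is roughly the following sketch. `Deno.serve` is the real Deno API that would mount such a handler, but the handler itself uses only web-standard Request/Response, so it is shown (and testable) standalone:

```typescript
// "Deno-style" here meaning: the server is an async function from a
// web-standard Request to a Response, and the runtime simply awaits
// the returned promise -- no req/res callback objects as in classic
// Node network code. Deno.serve(handler) would wire this up.
async function handler(req: Request): Promise<Response> {
  const url = new URL(req.url);
  return new Response(`hello ${url.pathname}`, { status: 200 });
}

export {}; // keep this file in module scope
```

Because the handler is a pure Request-to-Response function, it can be unit-tested without opening a socket at all.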
TiredOfLife 10 hours ago [-]
"Google Stadia is not shutting down" was posted by Stadia 2 months before being shut down.
v3ss0n 11 hours ago [-]
There are very few positive points in moving an existing Node project to Deno - and those are mostly frontend.
NHQ 9 hours ago [-]
Deno ought to become an engine for the innovative development of web browsers. That is what we need, and what it could very well offer: better permissions, protocol choices, simpler extensions, all-around more options and control.
Business-wise, turn their deploy system into a resource for the browser base: for instance an app store, flash compute/rendering, or agent hosting services.
afavour 12 hours ago [-]
I’m sure a bunch of the criticism of Deno is exaggerated. But there’s something fundamental holding me back from investing my time in Deno, or Bun for that matter: they’re both VC funded.
The post is a good illustration of why that matters. Very little of it is about Deno itself, instead it’s mostly about the paid-for services Deno Inc offers. They have to prioritise and chase that because their investors want to see financial growth.
It’s the same reason they abandoned the idea of Deno being a fresh start in the JS ecosystem (an idea I loved) and instead started patching in Node compatibility layers. They had to reduce friction, not add to it. But for me that compromised the reason for using it in the first place.
Node has many flaws. And it’s boring. But its existence doesn’t depend upon the whims of random investors who might decide to pull the plug at any moment. So I’m just going to stick with it.
nchmy 11 hours ago [-]
Could it be that they added Node compatibility because people wanted Node compatibility? If investors pushed for it as well, then they were just being sensible...
I started working with JS/TS just before Deno 2 came out and having, essentially, full node (and TypeScript) compatibility was the primary reason I switched to it. It is all just so simple in comparison to node.
But, I agree about the VC funding - it certainly gives cause for concern about Deno's direction and longevity. But what other option is there, really? Hopefully what was said in this post - that the reduction in Deno Deploy locations is a function of use rather than economics - is true.
afavour 10 hours ago [-]
> Could it be that they added node compatibility because people wanted node compatibility
I imagine that’s exactly the reason! But they outlined their reasoning for a clean break pretty well in their 1.0 announcement post[1] and they haven’t, to my knowledge, posted a follow up “here’s why we were wrong about all that” post.
All of which is to say I understand the business reasons why they did it, but to me it compromises the original technical promise of Deno. A rebooted, sensible JS ecosystem was the reason I was interested in Deno originally. I use Node every day and I'm mostly happy with it, but whenever I need to dive into a dependency to see what's going on, it's a five-layer-deep rat's nest of transpiled and sometimes even minified code using different module formats and platform-specific APIs. I'd love to be done with all that.
Sometimes it pays to be bold when you’re challenging an entrenched incumbent. Any non-Node JS platform has to pitch "don't use the status quo, take a risk, use me" and absent the original benefit I don’t see a good argument to use Deno, especially when factoring in the risk of VC-driven priorities. I’m not saying everyone has to agree with me on that but it’s my personal perspective.
A lot of the boldness stands though, even with the compatibility layer, and it's a "layer" more than a pivot for Deno. deno.json is still far simpler than package.json. Deno still takes a batteries included approach with smart defaults by default. Deno still pushes you toward a modern-standards "native" approach: ESM by default; ESM native libraries including a growing "standard library" on JSR; your dependency graph is still mostly an importmap you can also dump directly into a browser, too (even some of the compatibility shims with the npm ecosystem).
Deno still has a permissions model that is very different and far more opt-in than Node. This post makes a case for thinking of Deno's deep, native OpenTelemetry support as something very new and different from Node's approach, and clearly important to the future of application deployment and observability.
Technically Deno is still very interesting in technical promise, especially compared to Node, even with a larger compatibility layer.
nchmy 2 hours ago [-]
You nailed it. It's incredible how many dunces here are lamenting how Deno abandoned everything, when all they really did was add a compatibility layer.
afavour 2 hours ago [-]
Maybe next time instead of name calling you could consider that other people simply have different perspectives to you?
I and others are lamenting that the compatibility layer removed incentive to help create a new JS ecosystem that isn’t layers of garbage piled on top of each other. That new ecosystem is what I wanted and Deno is no longer the path to it. If that makes me a “dunce”, so be it.
bityard 8 hours ago [-]
I almost never write server-side JS/TS so I don't have a horse in this race, but it sounds like a good time and reason for a community fork that eschews legacy compatibility in order to focus on only modern features.
afavour 8 hours ago [-]
Announcing Oden...
zem 8 hours ago [-]
when it hits 1.0 they can feature freeze and call it "done"
nchmy 3 hours ago [-]
Evidently people didn't sufficiently value the technical promise of Deno and just wanted a (MUCH) better node. But, it also has plenty of new, bold, extra things (a sibling comment elaborates on it much better than I can). I, for one, am quite happy with it.
Given that you still use Node, you might want to try Deno 2 out... It'll likely solve a lot of your headaches.
afavour 1 hours ago [-]
I’ve tried it, I like it, but the positives for me personally aren’t worth investing in a VC backed product. I don’t have many headaches with Node itself these days.
vegadw 3 hours ago [-]
> "But what other option is there, really?"
Not taking VC funding, having slow organic growth, making a good product, and having pride in your work?
Like, maybe I'm missing something, but why does the end goal always have to be VC funding and acquisition? Is it too much to ask to stay independent and just make something you take pride in and enjoy the craft over many years of a successful, but not self-canibalizing, business?
I dunno man, I just keep seeing every smaller business's end goal be to get acquired or to turn into a money-pumping SaaS, and it's just depressing. Let's make good things, enjoy delivering a product that people like, and spread good. Keep yourself (and your employees, if you have them) making a good living, and be happy?
nchmy 3 hours ago [-]
Have you considered that it's a much larger job than someone can just bootstrap? Also, Node itself was initially sponsored by a corporate entity, and that led to considerable problems as well. Now it has backing in many other ways.
Also, are you aware that Deno is being built by literally the creator of Node? This isn't some get-rich-quick scheme - it's something that he deeply wants to see exist. He's also leading the charge against Oracle (a genuine parasite) over the JavaScript trademark.
pier25 8 hours ago [-]
> If investors pushed for it as well, then they were just being sensible...
Not really.
The biggest issue with Node is the dependence on the fragile NPM ecosystem. Strategically, fixing this is the thing that would distinguish Deno and make it more valuable.
And Node is already adding TS and other features that were initially the reason to leave for Deno.
nchmy 3 hours ago [-]
Deno created JSR, and it already has TS (node may take a while to implement that) and plenty more.
pier25 2 hours ago [-]
JSR is essentially NPM with a couple of extra improvements.
And it's only a matter of time until Node has full TS support.
wink 10 hours ago [-]
Yes, of course - more people probably wanted Node compat than explicitly didn't want it (like the person you replied to, and me too).
You have to decide where to go, and apparently not being a niche product was one of the reasons; that's fine - but now they have to live with at least two unhappy (ex?) users.
Lerc 9 hours ago [-]
>it’s mostly about the paid-for services Deno Inc offers.
In a way I think that's a good thing. Their plan for making money is to provide those services. That goal is enhanced by Deno being healthy. I would be more concerned if Deno was the product they were wanting to sell.
As long as Deno itself is FOSS, then I think I'm ok with it.
WorldMaker 7 hours ago [-]
Also, I've been using Deno Deploy for hobby projects and it is a delight to work with so far. In terms of finding a product that is a good complement to the open source Deno, they seem to have good ideas. Though I'm still in the VC-subsidized freeloader category in my hobby usage today, so I haven't experienced it yet as a paid product.
lolinder 11 hours ago [-]
Yep. Honestly, the pivot to Node/NPM compatibility was the moment I lost interest in Deno. I know why they did it, and as you say from a financial perspective it makes complete sense, but they had the chance to be a fresh start to the whole ecosystem and they gave that up.
I really like coding in TypeScript and think that most of people's irritation with JavaScript isn't actually related to the language so much as the ecosystem of NPM. The exponentially growing web of dependencies and the constant churn of deprecations are exhausting, detracting from a core language that is now pretty solid.
Deno set out to change that and be something new, but they squandered that chance because it was too risky for their investors. And again, that's totally fair—resetting an ecosystem is risky and probably wouldn't have yielded the return they needed! But giving up on that was giving up on what made Deno different and interesting. If I'm going to use NPM anyway why not stick with Node while I'm at it?
the_gipsy 10 hours ago [-]
They basically said they were taking a risk (node/npm incompatible) for a big long-term benefit. They gave up on that forever, for some short-term growth. How many more times would they back-pedal on any risk they announced taking?
skybrian 10 hours ago [-]
They did say something like that, but I don’t remember what the big long-term benefit was supposed to be. What specifically did they give up? Maybe it wasn’t that important after all.
afavour 10 hours ago [-]
Their original announcement post covers it pretty well:
I haven’t done all that much network programming in Deno, but I think it’s still fairly easy to get “promises all the way down” by sticking with Deno API’s? What’s your experience with this?
afavour 9 hours ago [-]
My complaint is not with Deno APIs. From what I’ve seen they’re great. My problem arrives the moment you install a dependency, because it isn’t using Deno APIs. And digging into exactly what any dependency is doing is often an odyssey through transpiled-to-ES3 JavaScript, outdated APIs and so on.
The original promise of Deno was a consistent ecosystem. Absent that it doesn’t matter to me all that much how great Deno is within itself, the case for using it simply isn’t compelling enough. These days the newer, standards-compliant Node APIs are pretty good too!
monooso 6 hours ago [-]
Surely you can choose to only install "Deno native" dependencies?
It may sometimes be difficult to find such an option, but that was always going to be the case without Node compatibility.
Now, in theory at least, you have the option of sticking with Deno native dependencies, and an escape hatch when none are available.
That seems like the most pragmatic solution to the ideology vs adoption dilemma.
the_gipsy 8 hours ago [-]
They gave up on a quality Deno ecosystem. And being relevant.
dkarl 10 hours ago [-]
> Deno set out to change that and be something new, but they squandered that chance because it was too risky for their investors
Maybe it was too risky for users? The people with the most appetite for a new start and a new way of doing things are people who are suffering from their existing investment in Node. Making a halfway jump to a new platform with no path to completing the migration would leave their customers running on two platforms indefinitely. It's the worst-case outcome for a migration, going halfway and getting stuck with your feet in two canoes.
By supporting Node, Deno lets customers switch to something new and better and bring their legacy baggage along as well.
lolinder 10 hours ago [-]
It was always going to be too risky for a subset of users, from the moment they announced it. That would not have stopped a project that was not VC funded—a smaller project with less at stake could easily have stuck to their guns and just appealed to the people who were actually interested in pioneering a new ecosystem.
skydhash 11 hours ago [-]
Yeah. JavaScript is fine if you're dealing with the DOM and vendor libraries, or if you're using it in some scripting environment like GNOME. Node was OK too, but it failed to develop a standard library like Go's or Python's. Addressing that failure would go a long way towards a better JS ecosystem.
skybrian 10 hours ago [-]
Deno’s standard libraries [1] seem pretty nice, but to be honest I never used them much, because lots of stuff on MDN works fine in Deno.
I stumbled upon Deno when I needed to spin up a simple API to add to/update a CSV file, and really the only thing I found was deno-csv library, and it worked great. I was pleased with how easy it was with Deno, had it going in under an hour.
1vuio0pswjnm7 6 hours ago [-]
"But there's something fundamental holding me back from investing my time in Deno, or Bun for that matter: they're both VC funded."
For me it is the lack of support for musl. Perhaps there is a connection between inattention to certain details and being VC-funded.
bredren 11 hours ago [-]
Yes. And this is also why Poetry remains good enough in the face of PDM.
That said, next.js achieved widespread adoption and displaced create react app. And however you feel about the framework, react itself, are possibly reasons to believe.
What others are out there?
ramesh31 10 hours ago [-]
>And it’s boring.
This is a feature. Once upon a time, Node was the new hotness that all the cutting edge hackers were excited to play around with, and needed a hard sell to management. It has since graduated to IBM status - i.e. "no one ever got fired for...". And thank god for that. It's the most mature possible ecosystem choice at this point for the niche it fills, and we are able to build rock solid maintainable systems with it (and hire people who know it deeply). That didn't come cheaply or easily (IO.js drama anyone?), and anything that wants to take its place will need to make it through the same process.
redwood 11 hours ago [-]
Anyone looking at Mastra on top of Deno for at-scale concurrent agent orchestration?
Today, Deno feels more complex than Node does because it contains both approaches. And now there are lots of edge cases where a Node package ought to work, but doesn’t because of one unimplemented API or option or a bug that exists only in Deno. My favorite testing framework, AVA, still isn’t supported.
I used to just ignore the npm compatibility layer and target Deno itself, but that's become more cumbersome to do over time. For example, look at `deno run --help` and how many command-line options and env vars there are. It's exploded in the past few years. A lot of that is for npm interoperability. For me, it's just a lot of noise.
The one area of Node compatibility that I want the most is support for ESLint configs in the Deno linter. Yet they don’t seem to want to do that.
I really want Deno to succeed, if for no other reason than because it’s pushing Node to do things that they should’ve done years ago, such as adding a permission system. I just don’t think the current vision for Deno is very coherent or consistent with its original purpose.
Have you checked recently? The docs (https://docs.deno.com/runtime/fundamentals/testing/) specifically mention AVA as being supported. Then again, I'd assume that most devs using Deno just use the built-in `deno test` instead of a third-party testing framework.
> The one area of Node compatibility that I want the most is support for ESLint configs in the Deno linter.
Again, have you checked recently? According to the docs this is supported: "Deno's built-in linter, `deno lint`, supports recommended set of rules from ESLint to provide comprehensive feedback on your code. (...) You can specify custom rules, plugins, and settings to tailor the linting process to your needs." (https://docs.deno.com/runtime/fundamentals/linting_and_forma...)
I've been using Deno for 6 years now. And I'm actually quite happy that most Deno projects don't have a custom testing and linting setup.
I feel similarly. The standard configurations (e.g. tsconfig, linting, formatting) and batteries-included tooling (test, lint, fmt, etc.) are what make Deno so great for developers.
I've started using Deno in my spare time for various projects - and it just _feels_ more productive. I go from idea to testing TypeScript in minutes - which never happened in Node land.
It’s like they looked at what Vercel did with introducing a deployment platform after their initial NextJS work and wanted to follow suit.
I had thought a lot of what Deno was setting out to do was cool beans for a time but parity was faster to come from js/node than expected.
The whole selling point for me was that deno was node without the bullshit and baggage, but they dropped that and basically just turned it into node with built in typescript support and a few other minor things like the permissions.
Similar story with bun.sh - node backwards compatibility (although not using V8).
Does anyone know of a server-side typescript scripting engine that is not trying to be backwards compatible with node?
If you want the developer experience of using something that’s not Node, you can still get it from Deno.
But it turns out that few people care that much about purity, so it’s fortunate that they’re not relying on that.
The answer is obvious in the programming language case: for those who do not want async, the addition of async/await begins to poison the ecosystem. Now they have a growing list of libraries that they cannot use if they want to avoid async, so the effort involved in picking a library goes up and the odds get increasingly high that they're locked out of some of the key tools in the ecosystem because new libraries without async become harder and harder to find.
For those who really hate colored functions, the addition of async is the removal of a feature: colorless functions are replaced with colored functions.
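The coloring mechanics being described fit in a few lines of TypeScript - a toy sketch with made-up function names: once one function in a chain is async, every caller that needs its value must either become async itself or stay trapped in Promise-land.

```typescript
// A "red" (async) function; stands in for any network or disk call.
async function fetchCount(): Promise<number> {
  return 41;
}

// A "blue" (sync) caller can pass the Promise along, but can never
// unwrap the number itself - it only ever holds a Promise<number>.
function syncCaller(): Promise<number> {
  return fetchCount().then((n) => n + 1);
}

// To actually hold the value, the caller must turn red too - and then
// *its* callers face the same choice, which is how the color spreads.
async function asyncCaller(): Promise<number> {
  const n = await fetchCount();
  return n + 1;
}
```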
The same can be said of NPM compatibility. Sure, I can try to avoid it and stick to Deno imports and inspect each library that I use for NPM dependencies. But it gets harder and harder as time goes on, because a key feature of Deno has been removed: it's no longer an ecosystem reset.
So it reminds me more of trying to avoid CGO in Go or avoid “unsafe” in Rust.
It would be worse if Node-specific types started appearing as function parameters or return types, but that seems fairly rare even in npm libraries, so it seems easy to avoid.
Node API yes, NPM library no. If you add a dependency on a library that uses NPM you now depend on an entire web of transitive NPM dependencies, with all of the problems that entails. People don't dislike NPM because it's aesthetically displeasing—you can't just abstract away the problems with NPM. The ecosystem causes real problems with real software, and Deno initially recognized those real problems and set out to reset the ecosystem.
The only way in which NPM-compat is different than colored functions is that there's no static compiler feature telling you when you've added a dependency on a bunch of NPM libraries.
Though, it is nicer if it’s on jsr.io because you’ll see Typescript source code in the debugger.
There’s nothing about starting over that prevents ending up with a whole new rat’s nest of dependencies, if you’re not careful.
In Rust, you can call async functions from normal ones by spawning them on the executor. The .await syntax isn't as painful as dealing with callbacks and closures in JavaScript. Plus, if you call an async function incorrectly, Rust's compiler will catch it and give you a clear error message, unlike JavaScript, where bad function calls could lead to unpredictable behavior. The premises of the article don't apply, so Rust's async/await doesn't introduce the same "colored function" issues.
(See also https://without.boats/blog/let-futures-be-futures/ )
JavaScript itself has come a long way towards making coloring less painful. TypeScript+ESLint solves the weird unpredictable behavior issues with JS and async/await solves the syntax issue. Promises in general give well-defined semantics to calling an async function from a sync function. But all that only undoes some of the arguments about function coloring, not all of them. Fundamentally the same question applies: do you make async-ness part of the type system or do you instead build a system like green threads that doesn't put it in the type system?
I happen to think that coloring functions according to their async-ness is actually the right move (with the right ergonomic improvements), but plenty of people don't agree with me there even with all the ergonomic improvements Rust and TypeScript have made to the model.
If you're gonna argue that fragmentation is a problem in the node ecosystem (which I agree with), you can't convince me that a plethora of approaches to asynchronous code is preferable to async/await and promises.
1) The original essay that coined this term was looking at it from a language design perspective. The argument is a fair one if that design question is up for debate, but that isn't the case for Javascript.
(In general, I do agree that "you don't have to use it" is not a strong argument.)
The problem is that if you think statically, you can say "oh, just use the 'clean' subset". But the world is not static. If you think dynamically, you can see the full Node ecosystem as a fairly powerful attractor; why would I write a deno library that only works on deno when I can write a node library that works on both? Well, if I'm writing in the Node ecosystem, why not use the whole thing?
This is a general effect; it is very hard for people to set up long-term ecosystems that are "too close" to existing ecosystems. Generally the new thing will either get pulled in (as in this case) or ignored (as in the many cases of 'hey Typescript seems to have worked pretty well, let me do a Typescript-like gloss of this other language', which generally just get ignored). There are successes, like Typescript (JS is in general a special case because being the only language in the browser for so long it was both a language and a compile target; most other attempts to "Typescriptify" a language flounder on the fact that few if any other languages have that situation), Elixir (managed to avoid just being absorbed by Erlang, IMHO kind of a similar situation where the 'base language' for the ecosystem was not really the best), and the occasional Lisp variant that bubbles up (though like Clojure, usually with a story about where it can be run), but in general this is very hard to pull off, harder in some ways than simply starting a brand new language ecosystem, which is itself no picnic.
Also, promises already color functions just like callbacks do. Async/await just changes the syntax by which that coloring is expressed. The real problem people have with async is that they prefer green threads as a solution to concurrency, not that they don't like the syntax.
Of course, in (browser-compatible) Javascript, some things can not be done synchronously, but that's not up for debate.
Because you are losing something: a better ecosystem. Standardizing around… standards is a good thing. When I dive into the dependencies of any given Node app it’s a mess of transpiled code, sometimes minified even, there’s no guarantee what API it’ll be using and whether it’ll run outside Node (is it using the W3C streams API or the Node streams API?). But inertia is a powerful force. People will just use what’s there if they can. So the ecosystem never gets made.
> But it turns out that few people care that much about purity, so it’s fortunate that they’re not relying on that.
By that logic we never build anything newer or better. Python 3 is better than Python 2 and sets the language up for a better future. Transitioning between the two was absolutely torturous and if they just prioritised popularity it would never have happened.
Because they are losing something.
All the time and money they are investing into node compat could have been used towards a Deno first party ecosystem. It's not like they have hundreds millions to spare. Deno is a small company with limited resources.
People kept complaining that they couldn't use Deno with NPM packages so Deno ended up focusing in providing faster horses.
I’m looking forward to whatever they’re going to do instead of KV, which I tried and is too limited, even for a KV store. (64k values are too small.) Something like Cloudflare’s Durable Objects might be nice.
You can’t “force” maintainers of old libraries to do anything. Deno never had that power. For people who are interested in supporting multiple alternate platforms, jsr.io seems pretty nice?
> You can’t “force” maintainers of old libraries to do anything. Deno never had that power. For people who are interested in supporting multiple alternate platforms, jsr.io seems pretty nice?
If enough people find Deno useful enough to skip some old libraries, maintainers are "forced", even though Deno is not forcing anyone. If they do not adapt, then someone will just create a new library with better practices. In both cases there is pressure for better JS/TS evolution.
At least that's the theory. To be honest I don't see Deno's value add. The runtime is like... I mean node works fine at this point? And the capabilities system is both too pedantic and too simplistic, so it's not actually useful.
I don't understand the value add of Bun much either. "Native" Typescript support but at the end of the day I need a bundler that does more than what these tools tend to offer.
Now if one of these basically had "esbuild but built in"....
Although now that Node itself has basic TS support with type-stripping, this substantially improves matter. But that's a fairly recent thing, both Deno and Bun predate it by a long time.
Also Bun has a built-in bundler? I'm not sure how it compares with esbuild tho.
Cloudflare Workers workerd comes to mind but it's fundamentally a different thing.
https://github.com/cloudflare/workerd
It's not meant to be a generalist backend runtime and it provides almost zero batteries.
I'm sure you can find other projects that are going to fail, but why do you want to?
Node has lots of problems (I am basing this statement on the fact that it's a major tech project). None of them are sufficient to prevent it from being extremely widely used.
To fix those problems in a product that will be used, it is not sufficient to provide something sort of like Node but without those problems. You either have to:
1. Provide a tool that requires a major migration, but has some incredible upside. This can attract greenfield projects, and sometimes edge out the existing tool.
2. Provide a tool with minimal migration cost, and without the same problems. Maybe this tool can replace the existing one. Ideally there will be other carrots (performance, reliability, ease of use). Such a tool can get people to migrate if there are enough carrots, and the migration cost is low enough.
Deno was a classic example of messing this up. It's not #1 or #2, it has the worst of both worlds. The upside was that it did things "the right way", and the downside was that you couldn't run most code that worked on Node. This is the kind of approach that only attracts zealots and hobbyists.
What's the point? If you're in love with static types, but have to do JavaScript because you're targeting the browser, I kind of understand why'd you go for TypeScript. But if you're on the backend, and don't need anything JS, why limit yourself to TypeScript which is a "Compile-to-JS" language? You control the stack, make another choice.
People like to sneer at TypeScript, but let's be honest: people like to sneer at anything that's popular enough. The fact is that no language that I enjoy better than TypeScript (which is already not a very long list) stands any chance of adoption in an average workplace.
It also interops nicely with F#, so you can write a pure functional library in F# and call it from a C# program in the "functional core, imperative shell" style.
It has an incredibly solid runtime, and a good type system that balances letting you do what you want without being overly fussy.
That doesn't mean there's anything wrong with it and I've often thought to give it another shot, but it's not a viable option right now for me because it's been too hard to get started.
I realize Microsoft is terrible at naming things, but for .NET/C# it's really not that hard these days. If you want to use the new, cross platform .NET on Linux then just install .NET 8 or 9.
New versions come out every year, with the version number increasing by one each year. Even numbered versions are LTS, odd numbered releases are only supported for about a year. This naming scheme for the cross-platform version of .NET has been used since .NET 5, almost 5 years ago, it's really not too complicated.
These days you just add the Microsoft package repo for your distro and then do `apt install dotnet-sdk-9.0` or whatever.
It's also been spreading into the official distro repos. Nix, Arch, and Homebrew all have it.
Any legacy .NET projects are made with .NET Framework 4.x (4.8.1 is the latest). So if it's 4.x, or called .NET Framework instead of just .NET, it's referring to the old one.
.NET Core is no longer used as a name, and hasn't been since .NET Core 3.1. They skipped version 4 completely (to avoid confusion with the old .NET Framework 4.x, but I think they caused confusion with this decision instead) and dropped the "Core" for .NET 5. Some people will still call .NET 5+ ".NET Core" (including several of my coworkers), which I'm sure doesn't help matters.
Mono isn't 100% completely dead yet, but you'll have little if any reason to use it (directly). I think the Mono Common Language Runtime is still used behind the scenes by the newer .NET when publishing on platforms that don't allow JIT (like iOS). They've started adding AOT compilation options in the newest versions of .NET so I expect Mono will be dropped completely at some point. Unless you want to run C# on platforms like BSD or Solaris or other exotic options that Mono supports but the newer .NET doesn't.
I might feel differently if I worked with a large number of people who I didn't trust, but on small to medium teams composed of very senior people using Go feels like coding with one hand tied behind my back.
At this stage, I don't think anyone needs to try and persuade anyone why JavaScript and typescript are the Lingua Franca of software engineering.
Performant, expressive, amazing tooling (not including node/npm), natively cross-platform.
An absolute joy to code with. Why would anyone want to use anything else for general purpose coding?
In my mind there are two alternative approaches in the current ecosystem: C++ where absolute 100% maximal performance is the overriding primary objective and be damned with the consequences, then for everything else just use Typescript
Something as simple as a map with two integers as a key, or case-insensitive string keys, requires jumping through hoops. Even Go and Python can do this.
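To make the complaint concrete, here is a quick sketch of the usual hoop-jumping, since JavaScript's Map compares non-primitive keys by reference rather than by value:

```typescript
// Map compares object keys by reference, so a fresh [1, 2] is a new key.
const naive = new Map<[number, number], string>();
naive.set([1, 2], "a");
console.log(naive.get([1, 2])); // undefined - different array, different key

// The usual workaround: serialize the tuple into a string key.
const byPair = new Map<string, string>();
const key = (x: number, y: number) => `${x},${y}`;
byPair.set(key(1, 2), "a");
console.log(byPair.get(key(1, 2))); // "a"

// Case-insensitive string keys need a similar normalization step on
// every read and write; nothing in Map does this for you.
const ci = new Map<string, number>();
ci.set("Hello".toLowerCase(), 1);
console.log(ci.get("HELLO".toLowerCase())); // 1
```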
Oh I get it, you’re joking
I mean, it obviously isn't, although for web development, I'd probably agree with you. But regardless, zealots who hold opinions like this, where there is "one language to rule them all" is why discussing with TS peeps is so annoying. In your world, there is either C or TypeScript, but for the rest of us, we tend to use different languages depending on what problem we're solving, as they all have different strengths and drawbacks. If you cannot see any drawbacks with TypeScript, it's probably not because there aren't any, but you're currently blind to them.
> Why would anyone want to use anything else for general purpose coding?
Because you've tasted the world of having a REPL connected to your editor where you can edit running code live and execute forms. Just one example. There are so many languages available out there when you control your stack. I understand using JavaScript for frontend web stuff, because you have no other options, and I personally have nothing against JavaScript itself. But for the love of god, realize there is a world out there outside of your bubble, and some of those languages have benefits too.
https://survey.stackoverflow.co/2024/technology#most-popular... ... JS is the most popular language in the world, per Stack Overflow.
You may as well call binary (i.e. 1s and 0s) the Lingua Franca in that case.
Let's not get started on C build chains (especially cross-compiling)... cmake vs cygwin vs msvc vs whatever else these days, with hacky-and-brittle ifdef conditionals everywhere just to make it work - chaos! JavaScript just runs on pretty much any modern computer you can sit down at or put in your pocket, and even on more exotic things that don't have it installed by default you are about 10 seconds away from installing the official package and you are off and running.
Because some of us _like_ typescript, or at a minimum, have invested a significant portion of our careers learning ts/js. We want an ROI, we just don't want node/npm.
Right, makes sense. It also makes sense that most of those learnings are transferable; it's not like TypeScript is the only language with types, so your design/architecture skills can be used elsewhere too. Locking yourself into the ecosystem of one language, then asking other runtimes to adhere to your preference, sounds like a sure way of getting disappointed, instead of being pragmatic and flexible enough to choose the right tool for the problem.
It's extremely good! Shame about it being coupled to Javascript.
Also, when I was writing a frontend and backend both in TS, I could literally share the exact same type definitions between them. Then I could use a compiler plugin (`typescript-is`) to validate on the server that payloads match the appropriate type. It was amazing and worked quite well, and I can't really see that being nearly as easy and seamless with anything else.
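A rough sketch of that setup (the file layout and names are invented for illustration; typescript-is generates guards like the hand-written one below directly from the type):

```typescript
// In a real project this interface would live in e.g. shared/types.ts,
// imported by both the browser bundle and the server.
interface User {
  id: number;
  name: string;
}

// Hand-rolled runtime guard; typescript-is can derive this from the
// interface automatically so the check and the type never drift apart.
function isUser(value: unknown): value is User {
  if (typeof value !== "object" || value === null) return false;
  const v = value as Record<string, unknown>;
  return typeof v.id === "number" && typeof v.name === "string";
}

// Server-side use: reject payloads that don't match the shared type.
function handlePayload(body: unknown): User {
  if (!isUser(body)) throw new Error("payload is not a User");
  return body; // narrowed to User by the type guard
}
```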
But isn't that benefit just because TypeScript does compile to JavaScript and is compatible with JavaScript? Remove that compatibility, and you wouldn't get that benefit anymore, right? And if you still can get that benefit, why wouldn't you be able to get that benefit with other languages too?
It's not like TypeScript gives you some inherent benefit that makes it easier to convert to JavaScript, besides the fact that it's literally a "Compile-to-JS" language.
JavaScript does make it easier to target both the web browser and Node.js, sure. But TypeScript also has a fairly mature type system and ecosystem (flaws in `tsc` itself notwithstanding). Not to say that no novel approaches are worth exploring, though; I just haven't seen one that rivals my TS experience yet.
> And if you still can get that benefit, why wouldn't you be able to get that benefit with other languages too?
That depends. In many other programming languages (such as ones that compile to WASM) it's also possible to have common code shared between server and client, but it's usually pretty inconvenient to actually get the code running in both environments. It's also possible to have a common interface definition and generate types for server and client from that definition, but that's still more complicated.
Anyway I don't fault anyone for being disappointed that Deno fell into the Node.js compatibility trap. Pure TypeScript without that particular cruft is also something I was excited about. I also was excited to see what looked like some fair innovation (like their import mechanism and their sandboxing) but I don't know how that'll continue if Node.js compatibility ends up being too much of a time sink.
I don't have very strong opinions because I've never really used Deno and I probably won't even bother at this point, but I definitely would not agree that this is just a problem of needing to use another programming language instead.
Why not use it? What high-level programming language would you suggest instead with the same level of performance and ecosystem support?
The way you make a scripting language fast is by getting the hell out of it and into C or C++ as fast as possible, and PHP's library ecosystem embraces that harder than just about any other scripting language. That's the reason, I think.
[EDIT] My point is mainly that Node's performance isn't really that impressive, in the field of "languages one might use to write server-side Web code". It beats absolute slugs like Python and Ruby handily, but past that... eh. And really, in its speed category you'd probably do just as well using either of those and paying a little more attention to things like network calls and database access patterns.
Just one benchmark I could find https://benchmarksgame-team.pages.debian.net/benchmarksgame/...
IDK what you mean by "deal with OpenAPI", OpenAPI is a spec not a technology like graphql.
In all honesty (and sorry for the directness), you don't really seem to understand these concepts and how relevant or not they are to this conversation
It's a JS framework thing. Every mainstream JS framework has server actions or equivalent.
> Having a wrapper around HTTP isn't really a compelling reason to choose a technology for the very large majority of people: probably the opposite actually.
It is way more convenient to write a server action and be able to immediately use it in a client component than having to write an HTTP endpoint in a separate back-end project, and then regenerate your client via OpenAPI, or whatever else you use.
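An illustrative sketch (not any real framework's API) of why that feels convenient: the "action" is just an async function whose parameter and return types flow straight to the caller, with no OpenAPI schema or client regeneration step in between. Here the call is simulated in-process; a real framework would turn it into an HTTP request behind the scenes:

```typescript
type Todo = { id: number; title: string; done: boolean };

// "Server action": in a real framework this body runs on the server.
async function addTodo(title: string): Promise<Todo> {
  return { id: 1, title, done: false };
}

// "Client" usage: renaming `addTodo` or changing `Todo` on the server is an
// immediate compile error here, not a stale generated client.
async function component() {
  const todo = await addTodo("write release notes");
  return todo.title.toUpperCase();
}

component().then(console.log);
```

The contrast with the OpenAPI route is that the type information never leaves the language: there is no schema artifact to regenerate and no window where client and server disagree.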
> IDK what you mean by "deal with OpenAPI"
I mean dealing with tooling to generate an HTTP client from OpenAPI schema.
> In all honesty (and sorry for the directness), you don't really seem to understand these concepts and how relevant or not they are to this conversation
Wrong
No. "Server actions" are a React concept, it has little to do with backend technology (the backend still speaks HTTP). This concept is completely irrelevant to most big frameworks like Express, NestJS, Koa, Hapi. Next is barely a "backend" framework: it's rather a React server framework that has a few basic backend functionalities.
Yes, there is nothing that works better than server actions. None of what you listed really makes sense to me. I have never had any runtime performance problems with TypeScript, and wasn't JavaScript the most popular programming language in the world (the talent-pool argument)?
And then you would have to solve the problem of how to communicate with the client.
You aren't suggesting to handwrite an HTTP API client, right? You would have to set up either OpenAPI which is a mess, or GraphQL which is also a mess. LMK if you have a better solution.
The problem is that, unlike when using server actions, when using HTTP APIs, there is nothing that automatically generates bindings.
> If you really want to use OpenAPI for whatever reason
No, I don't. But people use OpenAPI to avoid having to handwrite an HTTP client. This is especially relevant if you are developing a public API.
> But it's not like using OpenAPI or GraphQL are the only two options
What are other options?
Is that really the biggest problem you face when programming? How many endpoints do you have? Even with projects with ~30 endpoints, it doesn't seem problematic to me, but maybe people regularly work on projects with 100s of endpoints, then it kind of makes sense. But I'm not sure that's typical enough.
> No, I don't. But people use OpenAPI to avoid having to handwrite an HTTP client. This is especially relevant if you are developing a public API.
People do a lot of stuff for a lot of nonsense reasons, doesn't mean that's the best way to approach things. The JS/TS ecosystem seems extra easy to fall into cargo culting too.
"Vibe coding" as a concept is a fun joke, not a workflow you employ for doing serious engineering. It was a tiny experiment that somehow people thought was a suggested way of developing software, which obviously it isn't. Read the code yourself, otherwise it'll be really hard to call yourself any sort of engineer.
Well I guess making Next.js apps isn't really "serious engineering"
> Read the code yourself, otherwise it'll be really hard to call yourself any sort of engineer.
I do read the code but I barely write any code by hand.
Where did I say that?
> I do read the code but I barely write any code by hand.
Right, so you use the words "vibe coding" yet you don't actually understand the concept? A lot of things make sense now. The description "vibe coding" is explicitly about "programming" with a LLM without reading or writing any code at all, for any purpose. If you read the code, you're not really vibe coding as originally described by Karpathy.
You replied to a comment that says "Yep, Next.js has the best support for vibe coding."
> Right, so you use the words "vibe coding" yet you don't actually understand the concept? A lot of things make sense now.
You can stop arguing that glancing at the code means one is no longer vibe coding, because in practice, by looking at the code or even the LLM's thoughts, you can catch things you don't want early.
What do you mean by this?
Add a route on the back end.
RPC means you just call it! Don't worry about REST, GraphQL, etc.
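A minimal hand-rolled sketch of the "just call it" idea (illustrative names; libraries such as tRPC do this with far more machinery): the server exposes a plain object of async functions, and the client calls them by name with the server's types inferred, with no REST routes or GraphQL schema in sight. The transport is stubbed in-process to keep the sketch runnable:

```typescript
// Server side: the RPC surface is just an object of async functions.
const api = {
  add: async (a: number, b: number) => a + b,
  greet: async (name: string) => `hello, ${name}`,
};
type Api = typeof api;

// Transport stub: in reality this would serialize the call over HTTP;
// here it dispatches directly so the example runs standalone.
async function call<K extends keyof Api>(
  method: K,
  ...args: Parameters<Api[K]>
): Promise<Awaited<ReturnType<Api[K]>>> {
  // The cast sidesteps TS's limitation on unions of function signatures.
  return (api[method] as any)(...args);
}

// Client side: "just call it", fully typed end to end.
call("add", 2, 3).then((sum) => console.log(sum)); // 5
```

Wrong method names or argument types fail at compile time, which is the whole pitch versus hand-maintained HTTP clients.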
My bad. I was conflating common idioms and actuality.
I guess we’ll see soon enough what Deploy will become since that's "imminent".
KV is dead if they've no desire to develop it out of beta and are working on something new. No reason to ever use it for a new project now.
Fresh is being refactored, with an alpha in "late Q3 2025 (likely September)". It was a fairly basic framework to begin with. The no-compilation/build-step approach was the only interesting idea, and that's going away.
The runtime is actively developed but I find this statement amusing:
> We’re not chasing feature parity with other runtimes.
The release notes on Node/NPM compatibility would suggest otherwise.
Yeah, this is a terrible move. Companies aren't relying on KV precisely because it's in beta, not because it was a bad idea. I use Cloudflare Workers KV a lot and I'm not interested in Durable Objects. I was really interested in Deno KV until now.
Plus the optics of announcing a product and abandoning it are not good. Ryan is a great technical guy but these decisions don't look good from a strategic perspective.
I think you're right, I was just about to use it for something but now I'm considering other options...
I wonder if this is true in general for most people on serverless these days. If so, I wonder whether this was the original intention of the movement, or whether these people just don't want to deal with Docker/k8s.
That covers a massive proportion of the companies that don’t need or want massive scale.
But especially in the AI/LLM era it's even _more_ important to use what's most popular, because the LLM will know more about it and can pull information from vastly more resources, including its most important "source" of all: the model weights.
> Rather, reality is: most applications don’t need to run everywhere. They need to be fast, close to their data, easy to debug, and compliant with local regulations. We are optimizing for that.
Why does this sound so odd? I chose not to use Deno Deploy because the region was not close enough, and it would have just made everything slower than using other means. (Because there are many options to host data closer to overall end users, and some regulations also happen at the country level.)
While some people whine about the Node.js compat, I'd assume it's the main point that kept Deno on life-support in the long run.
Bun did it right from the start and it seems people love it. Being quite a bit faster than Node.js (even with the compat APIs) and Deno obviously helps too. If they keep that going, they'd enter Go level of performance.
This is really a case where Rust will shine compared to Zig.
BTW, I don't use deno and haven't been following any news whatsoever so this is simply a shitty statement from an outsider. It is interesting that I tested deno a couple of times but kept using node until bun came around and I basically switched to bun. I can't say why exactly.
Can someone help me understand what was lost here? Is there no longer a way to use Deno without using the Node ecosystem?
So the quote was made when Twain was around 60 years old, and he died roughly a quarter of that time later.
Deno was released in 2018, and it has now quoted the statement, 7 years later. I guess the next 2 years are gonna be interesting?
It's not rare, so kinda.
The fact that a project addresses these rumors at all means that they've noticed a trend and are worried about it.
Just like Meta isn't publishing articles about how React isn't going anywhere: they know it won't, despite the countless articles claiming otherwise.
What this kind of statement actually means is basically "we're not secure, but we can't admit it, as that would cement it." Which, funnily enough, applied to Twain too, as he did indeed suffer from the illness people were gossiping about. It was just a lot less imminently dangerous than the rumors claimed.
I will say that I was disappointed when they added NPM into the project, I understand why they did it but I would have preferred they not do it.
With that said all of my blogs and client sites are all being happily built in lume with deno right now (hosted on cloudflare) and they have been great for years now. I am still very happy for having made that change.
Over time, unless the team building the thing is entirely tone deaf, I'd expect each individual tool to improve as demanded/necessary. Not only that, but knowing that those tools are being thought about as parts of a whole is deeply comforting (I trust them more than standalone tools as interdependency headaches have likely been solved).
One of the biggest headaches in JS is the tendency for tool builders to just eschew responsibility in favor of sending their community on a goose chase. I commend the Deno folks for taking this approach. We should have more, not less of this attitude in the ecosystem.
Someone mentioned to me "Deno-style event loops" / "Deno-style main loops". I asked what that is but they were gone. I've tried to look it up, to no avail.
I do quite a bit of work on low level event loops. I'm continually interested in how different projects are doing it and what ideas and tricks they come up with. It bugs me to no end that I can't find anything on what this "Deno style loop" is supposed to be.
Anyone know what's meant / have a pointer or two?
Business-wise, they could turn their deploy system into a resource for the browser base: for instance an app store, flash compute/rendering, or agent hosting services.
The post is a good illustration of why that matters. Very little of it is about Deno itself, instead it’s mostly about the paid-for services Deno Inc offers. They have to prioritise and chase that because their investors want to see financial growth.
It’s the same reason they abandoned the idea of Deno being a fresh start in the JS ecosystem (an idea I loved) and instead started patching in Node compatibility layers. They had to reduce friction, not add to it. But for me that compromised the reason for using it in the first place.
Node has many flaws. And it’s boring. But its existence doesn’t depend upon the whims of random investors who might decide to pull the plug at any moment. So I’m just going to stick with it.
I started working with JS/TS just before Deno 2 came out and having, essentially, full node (and TypeScript) compatibility was the primary reason I switched to it. It is all just so simple in comparison to node.
But I agree about the VC funding: it certainly gives cause for concern about Deno's direction and longevity. But what other option is there, really? Hopefully what was said in this post, about the reduction of Deno Deploy locations being a function of use rather than economics, is true.
I imagine that’s exactly the reason! But they outlined their reasoning for a clean break pretty well in their 1.0 announcement post[1] and they haven’t, to my knowledge, posted a follow up “here’s why we were wrong about all that” post.
All of which is to say I understand the business reasons why they did it, but to me it compromises the original technical promise of Deno. A rebooted, sensible JS ecosystem was the reason I was interested in Deno originally. I use Node every day and I'm mostly happy with it, but whenever I need to dive into a dependency to see what's going on, it's a five-layer-deep rat's nest of transpiled and sometimes even minified code using different module formats and platform-specific APIs. I'd love to be done with all that.
Sometimes it pays to be bold when you’re challenging an entrenched incumbent. Any non-Node JS platform has to pitch "don't use the status quo, take a risk, use me" and absent the original benefit I don’t see a good argument to use Deno, especially when factoring in the risk of VC-driven priorities. I’m not saying everyone has to agree with me on that but it’s my personal perspective.
[1] https://deno.com/blog/v1
Deno still has a permissions model that is very different and far more opt-in than Node. This post makes a case for thinking of Deno's deep, native OpenTelemetry support as something very new and different from Node's approach, and clearly important to the future of application deployment and observability.
Technically Deno is still very interesting in technical promise, especially compared to Node, even with a larger compatibility layer.
I and others are lamenting that the compatibility layer removed incentive to help create a new JS ecosystem that isn’t layers of garbage piled on top of each other. That new ecosystem is what I wanted and Deno is no longer the path to it. If that makes me a “dunce”, so be it.
Given that you still use Node, you might want to try Deno 2 out... It'll likely solve a lot of your headaches.
Not taking VC funding, having slow organic growth, making a good product, and having pride in your work?
Like, maybe I'm missing something, but why does the end goal always have to be VC funding and acquisition? Is it too much to ask to stay independent and just make something you take pride in and enjoy the craft over many years of a successful, but not self-canibalizing, business?
I dunno man, I just keep seeing every smaller business's end goal be to get acquired or turn into a money-pumping SaaS, and it's just depressing. Let's make good things, enjoy delivering a product that people like, and spread good. Keep yourself, and your employees if you have them, making a good living, and be happy?
Also, are you aware that Deno is being built by literally the creator of Node? This isn't some get-rich-quick scheme; it's something that he deeply wants to see exist. He's also leading the charge against Oracle (a genuine parasite) for the copyright/trademark of JavaScript.
Not really.
The biggest issue with Node is the dependence on the fragile NPM ecosystem. Strategically, fixing this is the thing that would distinguish Deno and make it more valuable.
And Node is already adding TS and other features that were initially the reason to leave for Deno.
And it's only a matter of time until Node has full TS support.
You have to decide where to go, and apparently not being a niche product was one of the reasons. That's fine, but now they have to live with at least 2+ unhappy (ex?) users.
In a way I think that's a good thing. Their plan for making money is to provide those services. That goal is enhanced by Deno being healthy. I would be more concerned if Deno was the product they were wanting to sell.
As long as Deno itself is FOSS, then I think I'm ok with it.
I really like coding in TypeScript and think that most of people's irritation with JavaScript isn't actually related to the language so much as the ecosystem of NPM. The exponentially growing web of dependencies and the constant churn of deprecations are exhausting, detracting from a core language that is now pretty solid.
Deno set out to change that and be something new, but they squandered that chance because it was too risky for their investors. And again, that's totally fair—resetting an ecosystem is risky and probably wouldn't have yielded the return they needed! But giving up on that was giving up on what made Deno different and interesting. If I'm going to use NPM anyway why not stick with Node while I'm at it?
https://deno.com/blog/v1
IMO their logic still holds up. Dahl had a whole talk about the mistakes made with Node:
https://www.youtube.com/watch?v=M3BM9TB-8yA
The original promise of Deno was a consistent ecosystem. Absent that it doesn’t matter to me all that much how great Deno is within itself, the case for using it simply isn’t compelling enough. These days the newer, standards-compliant Node APIs are pretty good too!
It may sometimes be difficult to find such an option, but that was always going to be the case without Node compatibility.
Now, in theory at least, you have the option of sticking with Deno native dependencies, and an escape hatch when none are available.
That seems like the most pragmatic solution to the ideology vs adoption dilemma.
Maybe it was too risky for users? The people with the most appetite for a new start and a new way of doing things are people who are suffering from their existing investment in Node. Making a halfway jump to a new platform with no path to completing the migration would leave their customers running on two platforms indefinitely. It's the worst-case outcome for a migration, going halfway and getting stuck with your feet in two canoes.
By supporting Node, Deno lets customers switch to something new and better and bring their legacy baggage along as well.
[1] https://jsr.io/@std
For me it is the lack of support for musl. Perhaps there is a connection between inattention to certain details and being VC-funded.
That said, Next.js achieved widespread adoption and displaced Create React App. And however you feel about the framework, or about React itself, those are possibly reasons to believe.
What others are out there?
This is a feature. Once upon a time, Node was the new hotness that all the cutting edge hackers were excited to play around with, and needed a hard sell to management. It has since graduated to IBM status - i.e. "no one ever got fired for...". And thank god for that. It's the most mature possible ecosystem choice at this point for the niche it fills, and we are able to build rock solid maintainable systems with it (and hire people who know it deeply). That didn't come cheaply or easily (IO.js drama anyone?), and anything that wants to take its place will need to make it through the same process.