Deno 1.33: Deno 2 is coming (deno.com)
213 points by mephju on April 29, 2023 | hide | past | favorite | 116 comments


Building a KV store into the language is kind of nuts. I love how it abstracts away the local SQLite and deployed FoundationDB behind one interface. Testing would be super easy, as there is no question of "do I spin up an entire db instance or mock the db interface?": SQLite is relatively lightweight, and the burden of keeping both cases consistent falls to the language instead.

That being said, I was wondering whether you can take advantage of this without using Deno Deploy, or if you'd be locked in by using Deno KV. Also wondering how many bugs will only show up when deployed, due to the different backends.


I dunno.

Is having a database that happens to be a product you sell as part of your runtime good, or are you creating some mixed incentives here?

Are you a database vendor now?

If not, don’t build a database.

If this is a mature product that someone else is looking after, and it's good and free and well maintained like redis or SQLite, sure.

…but it seems like building cloud databases is not the core competency of a group building a javascript runtime.

Once money is involved, it’ll either be a distraction from their core mission, or become their core mission.

A KV db you can use for quick hacks? Sure. Awesome!

A scalable production database you’re selling to people? O_o why are you doing that?


Deno Inc. has two core products: The free Deno runtime and the for-profit Deno Deploy, a Deno hosting service with 35 locations around the world. The question that often popped up was where to store data. Deno Inc. provided several guides to connect to different cloud services. But they want the friction reduced to a simple `await Deno.openKv()`.
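For reference, the entire happy path looks something like this (a minimal sketch against the KV API as announced; the feature was still marked unstable at this point):

    // Opens the default store: SQLite locally, FoundationDB on Deploy.
    const kv = await Deno.openKv();

    // Keys are arrays of parts; values are arbitrary structured data.
    await kv.set(["users", "alice"], { name: "Alice", plan: "free" });
    const entry = await kv.get(["users", "alice"]);
    console.log(entry.value, entry.versionstamp);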

Deno Inc. has enough expertise in running a global service that two other companies rely on their work to offer edge functions to their customers (Netlify and Supabase). Adding a database to the service makes sense. And to be clear, they aren't developing a brand new database. They build atop SQLite and FoundationDB.


> The question that often popped up was where to store data. Deno Inc. provided several guides to connect to different cloud services.

Sure.

> But they want the friction reduced to a simple `await Deno.openKv()`.

Do “they”?

If so, who's using it to solve that problem? It seems the big users of deno deploy are not using it, which is fine, but it's pretty unclear who the "they" is in this circumstance.

Still, if it's a thin layer over FoundationDB or some other established database product and this is just part of the lock-in for their cloud offering, fair enough.

It’s not like others (eg firebase) don’t do the same thing.

The messaging “we’re building a database” and “we’re offering a hosted database service based on existing mature reliable technology” are different things though.

The latter all cloud vendors do.

The former is ridiculous, and it really, really wasn't clear that that wasn't what was happening.


"They" in OP's post is referring to Deno Land Inc.


I think this kind of "batteries included" approach is interesting. My feeling is that in many cases we spend way too much time building our platforms from various bits and pieces. Some apps and businesses would get a huge productivity boost from using a more opinionated platform.

Not saying this is for everything and everyone, but there's so much competition in this space that I can understand why somebody would like to try something different.


They need this because Deno Deploy (their Edge platform) is not a normal/traditional deployment target.

If Deno were nothing but "Deno CLI" (the equivalent of the 'node' executable, that runs as a typical server process on a Unix box) then you would be right. But there are two separate Deno runtimes (CLI and Deploy).


It's a lock-in strategy. There is zero reason for a language runtime to be opinionated about the database. All the good programming advice will tell you such a coupling is a bad idea. What if you want to move parts of your application to a different language in the future? Do you need to redo the entire database? That is just nuts!

Database providers, cloud application platforms should be agnostic of the language their users are using. There are many such decisions deno is making where they want to be the end-to-end service. That is not a good sign. It creates perverse incentives for the company to devise lock-in mechanisms at each stage and discourage compatibility with the rest of the programming ecosystem, perhaps even undermine the rest of the ecosystem. I would stay away for this reason.


> There is zero reason for a language runtime to be opinionated about the database.

The API is extremely simple. It's little more than a hash map. That's hardly opinionated.

> All the good programming advice will tell you such a coupling is a bad idea.

That advice is outdated. Virtually any modern application will want a database, and this API can serve as a foundation for it – a foundation that conveniently already has many of the features (esp. global replication and consistency) that you will want and that are incredibly difficult to get right if you build them yourself. Think of this as the application's "file system". PaaS today usually doesn't provide direct disk access, so this is the low-level abstraction for persistent storage that is available. An abstraction that, in many cases, will in fact be all your application ever needs.
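To make the "more than a hash map" point concrete: every entry carries a versionstamp, which enables optimistic, transactional updates. A sketch against the announced API (unstable at the time, so details may shift):

    const kv = await Deno.openKv();
    const cur = await kv.get<number>(["counter"]);
    const res = await kv.atomic()
      .check(cur) // fails if the entry changed since we read it
      .set(["counter"], (cur.value ?? 0) + 1)
      .commit();
    if (!res.ok) {
      // a concurrent writer won; re-read and retry
    }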


I'm not familiar with the offering but find your comment confusing. It seems to suggest there isn't lock in because it's just a hashmap, but also that it is globally replicated and consistent.


If you don't deploy to deno cloud it's just SQLite and not distributed.

If you do deploy to deno cloud it's trivial to migrate because there are many competing offerings of kv stores.


If no claim of compatibility with other backends is provided, I would be skeptical that migrating will necessarily be trivial.

I'm not that familiar with either technology, but as an example, deno kv claims strong consistency as the default, while dynamodb requires you to explicitly ask for strong consistency in queries and does not support strongly consistent reads on secondary indexes.
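For what it's worth, the launch docs expose that knob per read: strong consistency is the default, and you opt out call by call (again, an unstable API at the time):

    // Potentially faster read that may return slightly stale data.
    const entry = await kv.get(["config"], { consistency: "eventual" });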


When not on Deploy, you'd be stuck with a non-distributed SQLite database for each instance your application is hosted on. There currently isn't a way to swap the backend for the deno kv API, so you would still need to update your code to use a new kv store with a different API. If you used redis instead, you'd easily be able to switch providers by changing configuration options rather than having to update your code wherever it is used.
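To be fair, you could insulate call sites behind a thin facade of your own. A purely hypothetical sketch (none of these names come from Deno's docs):

    // Hypothetical adapter interface so call sites never touch Deno.Kv
    // or a Redis client directly.
    interface SimpleKv {
      get(key: string[]): Promise<unknown>;
      set(key: string[], value: unknown): Promise<void>;
    }

    function denoKvAdapter(kv: Deno.Kv): SimpleKv {
      return {
        get: async (key) => (await kv.get(key)).value,
        set: async (key, value) => {
          await kv.set(key, value);
        },
      };
    }

Swapping backends would then mean writing one more adapter, not touching every call site.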


> The API is extremely simple. It's little more than a hash map. That's hardly opinionated.

In that case, why does Deno have to provide this? Can't users simply create their own hashmap?


> There is zero reason for a language runtime to be opinionated about the database

* The Erlang runtime has Mnesia and ETS built in. Batteries included. It's awesome.

* Python has a built-in persistent key-value store, dbm.

* Ruby has one as well, pstore.

Now, you can always decide to use something else! 99% of the time I'm not going to use dbm or pstore, but they are nice to have. (Mnesia and ETS rock though, and I use those all the time.)


Uh oh, better tell Python to get rid of the pickle and sqlite3 modules from its standard lib. Seems too opinionated...


This KV thing is not meant to replace your main database. I see it more for caching at the edge.

Yeah maybe you'd prefer to do that with an agnostic solution like Redis (which afaik is possible in Deno) but everything has pros and cons. An e2e solution is coupled but otoh it has many objective advantages. And it's not like Deno will force you to use their cloud service after all.


> What if you want to move parts of your application to a different language in the future, do you need to redo the entire database? That is just nuts!

How often does that happen?


Either way, it's not like you can't export your data and shove it into something else like redis.


iOS apps have an opinion about which DB to use, and it's Core Data.


There is a good little overview of Deno KV here (by @simonw): https://til.simonwillison.net/deno/deno-kv

So it's backed by SQLite in the local version.

I'm intrigued if there will be a way to swap out the backend for other cloud providers.


I think what's nuts is that database data structures are not built into the language/stdlibs. I can make an array, but if I want to add an index so I can access it fast, then I have to jump through some inordinate hoops. I can make an in-memory hashmap, but if I want to persist it and keep accessing it fast directly from persistent storage, suddenly things get unnecessarily complicated.

I think fast searchable file backed data structures should be primitives of any computer language and runtime.
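Deno KV's ordered keys plus prefix scans are roughly that primitive: the "index" is the key ordering itself. A sketch against the announced API:

    const kv = await Deno.openKv();
    // Range scan by key prefix, served straight from persistent storage.
    for await (const entry of kv.list({ prefix: ["users"] })) {
      console.log(entry.key, entry.value);
    }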


To me this is quite similar to the Erlang VM's ETS in-memory store. There's a lot of usefulness in ETS itself, especially how it distributes inside clusters. Having an easy to use, efficient, and optimized store as part of the language can be a good enabler. Only time will tell if it is/isn't.


It's not really built into the language though? It's just a function on the global Deno object, similar to how there's a global 'document' object in browsers (and that's not 'built into the language' either).


Baked into the runtime is probably what your parent meant.


> Building a KV store into the language is kind of nuts.

Isn't it part of the standard library rather than the language? Anyways, Erlang has mnesia [0] and ETS [1]. ETS is practically a KV store, which can also be persisted to disk.

[0] https://www.erlang.org/doc/man/mnesia.html [1] https://www.erlang.org/doc/man/ets.html


I think almost all languages should have a database built in. Almost all programs need a way of storing and querying data, but the drama required to hook up an external db is excessive for little programs. Basic DB functionality should be as available in language standard libraries as file access.


What do you mean by drama to hook up an external db? It's almost always just a one-liner: let conn = db.newConn(host, port, …).


One line _after_ you figure out the myriad of database libraries available, half of which are abandoned, a quarter of which are little toys, and, if you're lucky, one of which has a core dev who is being paid real money to support and maintain the library.


Ah yes, that's fair. Finding a good library to rely on can be really time-consuming.


Deno.dev (deploy) uses a different, proprietary runtime than Deno. I believe Ry mentioned at a recent talk that on deno.dev, the kv will be synced across all instances. Obviously this can’t be the case for stock Deno.


KV storage is SQLite, and SQLite replication and syncing using an out-of-process daemon is becoming a solved problem.

For that application, at least, no changes to the runtime would be necessary.


Building a KV store into the runtime that has more features when using deno deploy is just that: vendor lock-in. It's just one more way to force you into paying for the service once you have locked yourself in and can't use an alternative store.

Deno is not and has never been an "alternative runtime" to node.


I am confused. In my book, vendor lock-in refers to a situation where a user or customer is unable to switch vendors, or can only do so at prohibitively large cost, often under the notion that they were unknowingly led into that situation.

So first of all, _you_ can use whatever ORM or database solution you like in general, so _you_ specifically choose which technology you want to invest in. If you want to hand-craft your kubernetes cluster for your database nodes and put them on GCP or Azure, then go for it. You want redis? Go for it. If you want to use supabase, firebase, planetscale or whatever, use that. If you want to choose a KV store that's built into your runtime, then that's great. Believe it or not, people use lowdb in production and their database is literally a json file.

I would argue that whichever you choose, switching from one hosting provider, database solution, or managed or unmanaged platform to another will always incur cost, and how high that is will depend on your depth of integration and familiarity with it. I am most likely going to use this feature and I am happy it exists.

Also, let's not ignore that there is a large breadth of different needs that developers have. Would I build a massive modular corporate CRM system with BI features with it? Probably not. Am I going to build my next D&D game tool, habit tracker or Tinder for sneaker trading with it? Maybe!

I sincerely wonder what specifically you lose from this feature simply existing.


If they provide a compelling product offering, and the people understand what's available where, and choose to use it, then I guess that can be called vendor lock-in. But it's hardly nefarious.


It is nefarious when you build up your entire product for years, spamming it everywhere as "node but better", yelling "it's so open source too!" from every roof, only to sneakily slide in lock-in features. It's worse than nefarious: you most likely had this plan in mind all along, and actively lied to get there.

Watch as the same happens with Bun.


What's stopping you sticking with Deno CLI, using it for free forever, and completely ignoring anything related to Deno Deploy?

In this relationship of 3 years and counting, they are the positive contributor because they give you something useful for free and you presumably haven't given them anything.


In this specific case I think it's super reasonable.

The KV store has two backends: local sqlite and distributed deno cloud.

If you choose the deno cloud backend you're signing up for a closed source cloud service from the start. There's no trickery.


Why not just serialize objects to disk? I don't see how this is different from a JavaScript object, other than being disk-based. Unless I'm missing a detail?

Also, local KV obviously won't work for compute lambdas, as the KV isn't shared across instances.


The MUMPS language did this in the '60s.


I still don't understand the true goal of Deno.

It started as a "better node.js", with better security and modern tooling. Better security was not achieved, imo: it's not flexible enough, yet painful enough that you want to allow everything straight away. The module system suffers the same issues as npm, and since everything is loaded from the same domain, it's hard to tell what is official or not.

They have a built-in server, linter, testing, benchmarking, a hosting platform, and now a KV store.

I feel like they are getting away from being a language and heading toward being a framework. Why not, but it will probably not help adoption, imo.


> The module system suffers the same issues as of npm

Tough to interpret this since NPM isn't a module system, but:

- Requiring the standards-compliant ES module system and getting rid of CommonJS is definitely a huge improvement for the future of the ecosystem

- If by "the same issues as npm" you mean "makes adding dependencies so easy that a lot of them get added", then sure I guess, though I'd argue that isn't an issue

> since everything is loaded from the same domain

It's not though; Deno can import from any URL. Personally I like to import directly from github for a lot of things. But anyone can set up their own repository or CDN if they'd like to, because all you need is an HTTP URL

> its hard to tell what is official or not

The official standard library is under deno.land/std/

> I feel like they are getting away from being a language and heading toward being a framework

Well, they were never a language, because the language part is (almost) identical to what was already out there. Whether or not they're a "framework" (or attempting to be one) is debatable, though they definitely take an "all the things most people need have official solutions (which work together smoothly)" approach, which is one of my favorite things about Deno. Also, a major use case (their core business) is serverless scripts, so having all the important stuff included is even more beneficial there

I've stopped using Node for any non-UI projects. Deno has been a godsend for me


Wild take, but I agree with your first sentence.

1.) it was never a language

2.) indisputably better security any time you take advantage of any of the features to disable reading of environment variables, using the network, reading files, etc.


I see close to no case where you would not enable all access rights


Well they might not be your use cases, but surely you can imagine them, right?

A build tool, that reads input files and writes output files, and nothing else.

A CLI that interacts with the bug tracker, and needs to read the environment and do networking, but doesn't need to read the filesystem, launch subprocesses, etc.

A serverless function that doesn't need anything but networking.

Even when you need to allow whatever you're building to read the filesystem, just specifying which files/folders can be read is in and of itself a huge win.

Thinking on it a little more, I wonder what kind of gigantic monolith use cases are there that need to read/write completely arbitrary filesystem locations, and need to spawn subprocesses that can't be known ahead of time, and need to do networking, and need to read arbitrary environment variables that can't be specified ahead of time, and also need to load dynamic libraries, and also need that other permission I'm forgetting?

I mean, I am sure there are some, but that certainly isn't the default I'd choose.


> A build tool, that reads input files and writes output files, and nothing else.

If it wants to upload artefacts to a CDN it will also require network access.

> A CLI that interacts with the bug tracker, and needs to read the environment variables and do networking, but doesn't need to read the filesystem, launch subprocesses, etc.

Chances are the CLI needs to read ENV variables or a configuration file to authenticate, and maybe even spawn subprocesses to set up parallel processing.

> A serverless function that doesn't need anything but networking.

A serverless function will probably read environment variables as well, or access a package.json file to read some context about the name of the package and its version.

My point is that most programs will require all access enabled.


I know that's your point, but I don't understand why you think that.

Even with your own theoretical additions to what the programs above need, none of the programs above actually need "all" access enabled.

And even when you do need to allow environment variables, subprocesses, and filesystem access, you can specify which ones. E.g. this program can read this specific package.json file and no other files. Or these three specific environment variables, and no others.

Which is still significantly different than "all".
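Concretely, that looks like this (file and variable names made up for illustration):

    # Read only package.json, expose only two env vars, nothing else.
    deno run --allow-read=package.json --allow-env=TOKEN,API_URL cli.ts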


I found it to really be the perfect shell-script replacement. People usually use python for more complicated shell scripts, but deno is just awesome at this. Just run the file, and the dependencies are cached for the next run. No node_modules or pip install, etc. Also, you can use typescript without any pre-bundling.

The security model works well for this use case also. If your shell script is only supposed to read a few environment variables and not make network calls, you are covered.
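A sketch of the pattern (assuming a coreutils env that supports -S, and pinning std yourself):

    #!/usr/bin/env -S deno run --allow-env=HOME
    // Dependencies are plain URL imports, cached on first run.
    import { parse } from "https://deno.land/std@0.185.0/flags/mod.ts";

    const args = parse(Deno.args);
    console.log(args, Deno.env.get("HOME"));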


Deno is really useful in edge computing, like serverless lambda functions in things like NextJS/Vercel or Supabase.


Why? Performance? Security?


Is it considered a stable version? The amount of unnecessary breaking changes is really frightening.

I mean, why would one ever need to change standard module names or the deno.json structure?

Maybe it looks a bit better, but I'm sure it causes a lot more pain for all developers, who need to watch out carefully with every release.

I consider this a JavaScript curse to be honest. Every webpack version breaks apis in random ways, react native is notoriously hard to upgrade because of this etc

Why isn’t it considered a problem in the community?


I think Deno feels a fire under their ass. The JS ecosystem has not really embraced Deno in terms of tooling, libraries or frameworks and the serverless ecosystem has not embraced Deno as a runtime (aws/gcp/azure functions), so they're iterating rapidly to see if they can get something to really stick. And then there's Bun.

I love Deno for system scripting but the friction with the rest of the JS ecosystem has prevented me from adopting it further.


Their main issue IMO is that they are not really bringing anything major to the table in exchange for the very high cost of breaking compatibility with nodejs. If you are going to create a competing ecosystem it has to be somewhat revolutionary for the cost of investing/migrating to a whole new ecosystem to be worth it.

I like Typescript quite well, but if I'm told I'm going to throw away my nodejs/ts code base and start from scratch, then there are many alternatives with better (for many definitions of "better") languages to consider, which at least will actually be different from what I have just thrown away.


> Their main issue IMO is that they are not really bringing anything major to the table in exchange for the very high cost of breaking compatibility with nodejs.

This is why my money is on Bun, just like a safe bet on Carbon. Both are designed to be drop-in-capable in large codebases where the bean counters aren't going to budget a total rewrite that redelivers the same functionality.


In regards to C++ evolution, Circle is a much better bet than something that is a basic AST interpreter.


Agreed. The way to fix that fire, however, is focusing on giving people the closest thing to a solution -- not the tools to build the closest thing to a solution.

For deno that would be fresh (the framework). And the speed of development is just not there, compared to the competitors (but everything has trouble keeping up with next right now)


Netlify edge workers use only Deno


[flagged]


I would rather they have money and a business plan. That way it’s more likely it will get the resources needed to develop it, build a community around it, and I won’t end up with a mothballed open source project that’s not maintained anymore.

Nodejs is an open source alternative you can stick with, if you don’t like commercial products.


Disgraceful?! I'd much rather OSS authors have an additional SaaS offering to support their work and work on it full-time rather than get them to work nights and weekends for free while AWS collects the checks for deploying their work.


I like to get paid for my work. Most people do. There is nothing disgraceful about that.

I don't see any difference between deno as a company and your SaaS bernard.app.


My SaaS is not an open source project nor a language runtime. How would you feel if Go, Elixir or Rust (just to name a few) were run by a company that depends on their success, and so aggressively pushes new features and mindshare, often with disastrous results?

Are Go, Elixir or Rust developers working for free? There is a difference between getting paid to work on open source, and structuring an open source company as a VC-backed startup that needs to grow fast or die.

I guess everyone has already forgotten about Docker, Inc.


This is a bad analogy; it is more like if the Tokio runtime were built by a VC-backed business, which is really not that bad.

Honestly, the anti-business sentiment on HN is just becoming annoying. Get shamed when not building OSS, get shamed if you take VC money, etc. People are supposed to be saints who work for free so you can then take that and build your own proprietary stuff? Seriously...


Precisely. So much of the negativity in this thread (and elsewhere) weirdly conflates Deno with a language.

Deno is an open source runtime for TypeScript (and JavaScript and WASM). One that has been delightful, in my experience, for system scripting, build tooling, and CLI apps.

Deno Deploy is a hosted runtime that is very similar, but not open source.

The complaining in this thread reads to me like people complaining that the company that makes AWS Lambda also makes DynamoDB.

This new Deno KV is a fantastic addition to both of their runtimes. Dead simple, no need to sign up for some other service, works out of the box in either case.

But nobody is forcing you to use it. You don't have to opt out of it. If you don't like it, then you can go use all the other database layers that you could before this announcement.


A bit of a tangent, but I looked at your link to Bernard. Aren't there a lot of broken-link-checking services out there already? Curious to learn what your USP is, and/or your approach to competing in a saturated market.


None that seem vaguely close to my vision. And one killer feature I would like to focus on (though it's on the roadmap for post-launch) is the ability to notify you if a URL of yours stops being accessible, i.e. you forgot, after a redesign, to set up a redirect from the old URL to the new one, which is a nightmare for both users and your SEO.

I've seen that happen countless times, and no service I've tested was able to monitor for that use case satisfactorily and/or at that price point.

In other words, my goal is to create a service to uphold Berners-Lee's "cool URIs don't change" mantra.

https://www.w3.org/Provider/Style/URI.html


The standard library is separate from the Deno runtime and is currently pre-1.0 (0.185.0).

As the blog post touches on, upgrading Deno itself does not force you onto the latest version of the stdlib. You will need to change the import URLs to get it.
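In other words, something like this keeps behaving identically across Deno upgrades until you bump the URL yourself:

    // Pinned to std 0.185.0; upgrading the deno binary doesn't touch this.
    import { serve } from "https://deno.land/std@0.185.0/http/server.ts";

    serve(() => new Response("hello"));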


Deno has a very cool architecture where almost all of the standard library is imported from outside of the runtime. Which means you can pin specific versions of unstable APIs, and never get caught by surprise even as they're being iterated on rapidly. They also say in the post that, e.g., the deno.json file changes are backward-compatible

As far as I know, Deno 1.x runtime upgrades are always backwards-compatible


The json structure change is not a breaking change. You can use both the old and the new structure


Why would you ever add the complexity of having to support two different formats then?


The two formats aren't that different, and the old version is now deprecated. I doubt it adds that much complexity.


Because the new format is easier for their (new) users but removing the old one would be a pita for their (existing) users.

You can't get things 100% right all the times so you have to decide what to do when you realize there is a better way. You have a few choices:

a. You don't improve the interface and instead you offset the issue by adding more documentation, helper scripts, templates/examples etc.

b. You improve the interface, possibly in a backwards compatible way.

So, (b) presents some maintenance costs, but (a) too has some costs! Some organizations tend to prefer (a) because it's easier to just make it somebody else's problem (e.g. some other team will do documentation, some other team will do the developer advocacy).

So it's a locally optimal solution and hence it's often chosen despite (b) being a globally optimal solution, since the product will be easier to use.


Old one will be deprecated in a major version change afaik.


I love the direction deno is going with a batteries included standard library for server side JS. It makes so much sense to me to consider it for simple stuff so I could have backend and frontend code all in modern JS or typescript. Things like internal tools, dev tools, etc. that just need to pop some HTML, rendered markdown, collect a few forms, run some business processes, etc. No need to wrestle the node ecosystem, no need to context switch between backend and frontend code with other languages like python.

I really think this has a strong future in enterprise as dedicated frontend, backend and devops roles all merge into one. Everyone just learns and uses typescript for everything. Everyone is a 'full stack' engineer now. I hate to say it, but it's almost like the dream of Java all over again, except the language and tooling aren't painful.


Something that still stops me from using Deno is the incompatibility between Deno and other tooling regarding the import of TypeScript files. Namely, it is not possible to import a `.ts` file via a `.js` import [0].

[0] https://github.com/denoland/deno/discussions/18293


Yeah, this is an annoying problem, because it prevents the same code from working in both Deno and everywhere else.

Most frustrating is that (last I checked) there isn’t even an agreed upon plan to resolve the inconsistency, at least for libraries (import maps, though tedious, at least can solve the problem for non-library code).


Which other tooling allows you to import a `.ts` file from a `.js` import? Not knowing much about the issue, this seems like a rather strange decision to me: If I'm importing a `.js` file, and it does not exist, I'd prefer an error message rather than the sort of magic that you're suggesting: "Oh, but I found a `.ts` file with the same name, so let me just go ahead, transpile the file to JavaScript and let you import the result!" That's rather unexpected IMHO.


This is literally how esm support in node works today unless you turn on an experimental feature flag.

https://www.typescriptlang.org/docs/handbook/esm-node.html


Literally most other tools: TSC, esbuild, bun, swc, babeljs, ... It is even the recommended way in TSC, esbuild and others to import a .ts file in order to emit esm-valid code. I understand the point of Deno. However, it is just too hard to adopt a tool that cannot be used in tandem with others.

I do not know who downvoted my comment: it is not healthy behavior.


I'm not sure you understand the point of Deno if you say that the Node ecosystem does something strange so Deno should do it, too... Magic file resolution was one of the regrets explicitly mentioned in Ryan's first talk introducing Deno.


I did not say that. I think it is a good thing to do things properly. File resolution is pretty complex (too complex) in Node/TSC. I would like other tools to support native .ts imports and translate them automatically to .js on code emit. However, it is not going to happen in the short or medium term.

Using a fallback to .ts files when importing .js files is a minor and straightforward addition to file resolution. And it is a big gain in terms of interoperability with other tools.
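For anyone lost, the clash in one place (a sketch assuming a hypothetical local module ./foo.ts exporting x):

    // tsc/esbuild convention: you write the name of the emitted file and
    // the tool maps "./foo.js" back to ./foo.ts at build time:
    //   import { x } from "./foo.js";

    // Deno convention: imports are fully specified, no rewriting:
    import { x } from "./foo.ts";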


I know this doesn't directly address your specific scenario, with the '.js' import. But there is a similar situation that got a lot better with TypeScript 5.

For a long time I was annoyed by the similar problem of Deno's requirement for fully-specified import paths like 'foo/bar.ts' making our Deno code not easily interoperable with our "plain-ass TypeScript" code.

We have an Nx monorepo with a large amount of "library" code, which is really just TypeScript code that uses standard TypeScript path aliases[1]. This code is used by various teams for various things, like Node programs, Svelte/Angular/FrameworkOfTheWeek apps, local utilities and built tooling, etc).

However, until recently, none of these libs worked with Deno, because internally they had to import things like 'foo/bar' and not 'foo/bar.ts'. Changing the imports to 'foo/bar.ts' would make them work in Deno, but break them for everything else.

But TypeScript 5 really changed that with the addition of a new config option (discussed here recently[2]):

    "compilerOptions": {
        "allowImportingTsExtensions": true,

This lets me take a plain-ass TypeScript Nx monorepo library, and add it to my import_map.json, and use it from Deno code as well. (Nx recently released a new plugin to help with this.)

It's still not ideal, because you have to convert all the files to start importing and exporting using the ".ts" extension, and there are probably edge cases where it breaks something. However, I have been using it since the TypeScript 5 betas and I have not hit any problems yet. (Would be very curious to hear any problems other people have had, though.)

[1]: the entries in the "paths" section of tsconfig.json, which let you map your own import names to filesystem locations, so you can

    import { Hoge } from '@acme/hoge';
instead of:

    import { Hoge } from '../../../../libs/util/hoge/src/index.ts';
[2]: Previous discussion: https://news.ycombinator.com/item?id=35185069#35186448


For the Windmill [1] project, we wrap deno to make it the backbone of a self-hostable infra to run scripts (like aws lambda), workflows (like temporal/n8n) and UIs (like retool). This release is amazing for a few reasons. We leverage enormously the fact that you can define a script with its dependencies in the same file, and we provide a monaco editor backed by the deno lsp through websockets. So without deno, windmill would not be possible. We also support python, but it's much slower as a language and makes more sense for slow, long-running processes like ML.

> Improvements to npm and Node compatibility

Every little improvement to npm compatibility actually unblocks a very wide range of npm packages that now 'just work'™. I would have been interested to get numbers on incompatible libraries, but `npm:crypto` has been the biggest blocker for our users.

[1]: https://github.com/windmill-labs/windmill


While it is interesting to see the work going on this, unless something really messes up node, I don't see any big adoption ever taking off, other than how egcs forced GCC to come back on track.


The only true benefit I’m seeing as a potential Deno user is the faster startup time to reduce cold boots in a serverless setting - everything else is more in the category of “nice to haves” like ease of setup (I’m used to the workflows with Node so while it’s obtuse it’s something I know and is well documented on the web) and a tighter security model (I haven’t personally ran into any issues with this in Node but can see the benefits).

But nothing's stopping Node devs from prioritizing cold boot time, or upstarts like Bun (with more of a focus on maintaining Node compat) or platforms built on V8 isolates from giving Deno a solid run for its money.


I was unaware of information on Deno's superior startup time and couldn't find references from a quick search. Would you be able to provide more information?

Snapshots having landed in Node (https://nodejs.org/en/blog/release/v18.8.0), is Deno still superior here?


Maybe I'm thinking more of Deno Deploy, but they claim sub-400ms cold starts / 40ms warm starts, so yeah, pretty fast. The only other option I've seen in that ballpark is Cloudflare Workers.


I'm a little thrown by the reasoning for shuffling the std/encoding/* modules around to std/*

Seems like it would just clutter the std/* "namespace" and cause churn in dependencies?


Yeah, this is a very stupid change. std/encoding/* is actually the right place; why pollute the top-level namespace with encodings?


How's it namespace pollution? It won't cause any clashes in either std or the user's code. It's just a few directories moving slightly higher up in a directory structure (which as an aside, is even deeper than you might think, because it also includes the deno.land domain, remember).


It just unnecessarily pollutes the drop-down list in auto-complete when someone is coding stuff that has nothing to do with encodings. There is a reason for categorization, and this change breaks it.


Bun, deno, now supabase, they’re all trying to build custom runtimes with everything included from database to SSR react island based frameworks.

I think it’s a cool idea generally, but let’s not mistake deno for replacing a generalized runtime. Deno, bun, etc. are proprietary infrastructure in a box that are being designed to integrate tightly with the rest of their deployment platforms


Deno's major adoption point could be its native TypeScript support, but without an easy way to bring an npm package into it, developing its community will be tough. So I can totally see why they are polyfilling `node:crypto`, since many cloud SDKs use it in their packages.


You can import npm packages now. They added it a few releases ago.

https://deno.com/manual/node
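The syntax is an npm: specifier in the import itself, e.g. (package choice here is just an example):

    // Fetched from the npm registry on first run; no package.json needed.
    import chalk from "npm:chalk@5";

    console.log(chalk.green("hi from deno"));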


The polyfills are not complete yet, as this update points out.


Nice to see ongoing development and a view ahead.

For work I would need a good interface for SQL. In pet projects I am using sqlx for Rust and sqlc for Python. In both, you can write SQL directly and get query validation and parsing into structs / Pydantic models for free. Is there something like this for Deno?


I was interested in looking up sqlc for Python, but it seems to be for Go? Is there a different version than this? https://github.com/kyleconroy/sqlc


They appear to have beta grade[0] support for Python and Kotlin. (TypeScript on the roadmap as well)

[0]: https://docs.sqlc.dev/en/latest/reference/language-support.h...


They put some work into the Python adoption, and it is enough for smaller projects, but I did not try it for too-complex use cases. You can see it under examples. What is nice is that you can set the option to parse into Pydantic rather than dataclasses.


There's pgtyped, which I believe does almost the same as sqlc

https://github.com/adelsz/pgtyped


Didn't take too long to start the 2.0 version. I really like how Rust does not do this.


I think the difference is that Deno is some kind of commercial startup, while Rust feels like a community project backed by a few huge corporations. So Deno needs to generate hype, while Rust isn't hurrying anywhere.


Are you familiar with the concept of no-2.0 for Rust? They do this on purpose so the users (the software engineers) never need to convert a version-x project to a version y, unlike, for example, Python (2 -> 3). I think this is amazing.


I wonder if they considered an edition like system for Deno versions. I would say that it’s likely too much upkeep for a commercial project, yet I somehow doubt that the Rust project has more capacity, whether that’s funding or engineering resources.


Not the goal of Deno, but right now it’s my favorite way to write single-file CLI scripts paired with zx. All the dependencies are defined at the top of the file and auto installed, TS for static type checking and familiarity, good ESM support, easy to shell out when things are easier with Bash, colors and arg parsing. Not sure I would pick it for anything in production but pretty close to my ideal for quick and dirty things.


Last time I checked, OpenTelemetry was a no-go for Deno. https://github.com/open-telemetry/opentelemetry-js/issues/22...

Is it planned for v2?


Anyone using Demo in production? If yes, what has your experience been compared to, say, Next.js?


I am using Supabase in production (small DAU app for a large company), which uses Deno Deploy as their function provider. So far no issues and it works well. It’s also quite easy coming from a nodejs / browser js background to pick up, as it’s just JavaScript after all.


I wrote a tiny script which reloads kube deployments when GitHub CI finishes. I liked the fact that I did it with zero dependencies. The standard library is good enough, and I even had to use some crypto. Zero-effort configuration is superb; I hate when I need to configure node for all the things, it easily takes a few hours.

I disliked the fact that this script constantly eats a few percent of CPU despite being idle 99.999% of the time. Not a big deal, but a sign of bad engineering.


The equivalent of Next.js would be Deno + their "Fresh" web framework.

https://fresh.deno.dev/

My impression of Fresh is that it's neat technology but basically a barebones MVP at this stage. They haven't done much development on it since v1 last year, presumably because the team has been busy with other stuff like what's described in the article.


Yeah, I tried it out, but it's clearly just a demo and not at all production ready. Maybe they should be prioritizing this as a killer app that would bring people over, or else achieving full compat with the industry incumbents.


It uses Preact under the hood, which often has edge cases vs React and its ecosystem, which makes adoption harder.


Seems like Deno is having an SEO problem with all the people autocorrecting its name to Demo lol


deno seems to follow a very similar line to php: you can do a lot with php without dependencies, but if you are not careful, there is a possibility that you will create garbage. I'm sure this will take deno further, similar to php.


Couldn't you also make this argument for JS in general? I think many folks looking from the outside in at JS would argue that the amount of "JS garbage" has already surpassed PHP.

I know you specified _without dependencies_, but I don't know that it's a fair comparison. The standard libraries for PHP and JS seem fairly similar to me; I work with both languages daily. Using NPM packages is much easier imo for greenfield projects.

PHP before Composer still had a ton of 3rd party script usage, but with explicit imports instead of autoloading.


Did they describe the changes going into Deno 2.0 vs. this minor release? ("Minor" as in semantic versioning; not to downplay the great work that went into it.)


> allowing any code editor with LSP support to work great with Deno projects.

Really? And why did he switch to vscode to get autocompletions for deno kv?



