Almost nobody is going to use Deno to serve a basic HTML site with fewer than a dozen pages. They are going to use it to build a monolith application that gets content from a database or a service, generates server-rendered pages, and hosts an API from the same codebase, or something similar.
Setting up a monolith application means using something like Ruby-on-Rails, Django, Spring, ASP.net, or rolling your own with a Node.js backend serving an SSR front-end or hydrating React views.
If you haven't experienced this already, you will come away with one of two conclusions.
1. Wow, this is so much fun, as long as I stay on the happy path of ... Rails ... Django ... ASP.net.
2. Wow, setting up all these moving parts really sucks; I'm mostly writing configuration files, and then Googling the error messages when I inevitably screw something up.
What I think Deno is trying to do is make it possible to get a server-side rendered application with modern tooling running with minimal ceremony, while still letting the developer customize the system and build things that are not covered by the documentation guides. In addition, their solution can be hosted on edge servers more easily than most other options out there.
I'm glad they are doing it, because it's a sorely lacking area of modern web development. The only other people taking this on in a meaningful way that I am aware of are Remix. I would be happy for there to be more entrants into this field.
Best of luck to everyone out there.
3. Wow, this actually works. I am so productive and I can actually get my work done quickly, the same way as hundreds of thousands of other normal developers out there. And then I can have free time and a mind free of stressing over what's cool or not.
It's about what you value. If it is your time and happiness, if you mainly want to just build stuff to solve real world problems, then choosing a good old monolith application will get you there safe and fast in 90% of situations.
If you value being hip and cool, then Deno, hydrating whatever, edge servers, and sorely lacking areas of modern web development will 100% get you there.
The reasons I have for justifying use of React, JSX, GraphQL, etc. have nothing to do with being "hip and cool" and everything to do with happiness and productivity. Using modern tools is both more enjoyable and more productive in my experience, as someone who used Ruby-on-Rails with templated pages for four years.
The accounting software I use, written in 2004, evolved over the decades starting from 1980s software. It all works (hence, I’m using it now), but comparing the user experience and source code of various features developed at various times reveals a lot about what “modern” means.
Here’s a few things:
Standardisation: Modern software tends to use proven designs, such as SQL over a bespoke query language.
Performance: The strictness of ABIs and APIs makes it difficult to restructure for the sake of optimisation without introducing undesirable breaking changes. Old codebases may be essentially “frozen” and stuck on old versions of libraries. Fresh software can use the latest versions of libraries, with their new APIs & ABIs and whatever optimisations come along.
Fundamental coding improvements: Old languages don’t have ergonomic, performant closures (see lambda functions in Python). New languages do (see arrow functions in ES6). Replace “closures” with any core language feature, like null coalescing, match statements, object & array destructuring (or even better: pattern matching assignment), or hygienic macros. (Let’s not discuss async/await as that doesn’t lie in the “unquestionably better” column.)
Better error messages & debugging: Over time, we collectively as developers have figured out what helps and what hinders when trying to track down troubles.
——
I also build stuff that solves real world problems. I once used Python and Django for everything. Now, I’ve moved to Node.js, TypeScript, React, and generally that ecosystem. It works far better than Python+Django. My development speed is blazingly fast in comparison, and the results stand up — they’re shippable.
Once I was familiar with Python & Django, they got the job done.
Now, I’m familiar with TypeScript & Node, they get the job done better.
Familiarity is that invisible force that, in its absence, prevents us from distinguishing things that don’t work from things that do work but are just different from what we know.
I seek out new technologies because I want to be _more_ productive than the average engineer. I don't believe there are hundreds of thousands of developers who are more productive than someone who makes an effort to learn stuff like Deno, Vite, unjs, Fauna, etc. as soon as possible.
The novel thing, for this, in my mind, is the edge hosting.
Next.js puts you squarely back in monolith server-side territory in that regard, it's just a different flavor. Of course it has its upsides if you're doing a purely React application. But you've now got all the complexity of a server-side monolith and a frontend framework, with the added bonus complexity of components that have to switch between client and server-side contexts, then state management and hydration across that boundary as well.
Now people are going to say "Ha, idiot, you just simply..." but that's just part of the learning, so you're in no better place with Next.js/SSR in general unless you want to build an interactive application. This isn't against Next.js by the way, I did enjoy it when I needed it, it's just that there's no avoiding some of the inherent domain problems regardless of which framework you pick.
https://github.com/dodyg/practical-aspnetcore/blob/net6.0/pr...
Assumes facts not in evidence. People are already building JavaScript monstrosities to serve entirely static blog content.
why do you have to sprinkle that snark + negativity in there? it implies a toxic role of "you are superior" and "people who use JavaScript to serve static blog content" are inferior.
why can't we all just get along? especially in this tight-knit programming community that is supposed to be full of love and collaboration. in today's modern day of inclusion and emphasis on mental health, you're spreading hate towards people who use JavaScript to serve static blog content and talking down to them. not very 2022 of you, in my opinion.
let's work to get rid of the culture where you imply somebody else's code project doesn't meet your standards, and that since they wrote it, they are dumber than you for making poor design decisions in your opinion.
I think we just have to accept that that's how websites are built now. It drove me nuts for a while, too. But modern JS engines are blindingly fast, and 2-3mb of JS download (that will be cached aggressively) is a non-issue for the vast majority of users.
I started talking to a junior developer the other day about server side rendering in the days of Rails/PHP/etc. and he looked at me like I was crazy. Couldn't even grasp the concept. I think for better or worse this is where we are headed.
The fact that it is in Deno rather than PHP, Ruby or Python is the point of the article.
Not a web developer. How is this different from CGI or a regular web server? This an honest question - I don't understand the significance.
CGI is replaced by JS
Apache/nginx is replaced by CDN
web server is replaced by edge nodes (aka, glorified runtimes automagically spread across many endpoints)
The only semi-interesting thing here is that this demo pulls dependencies from 3rd party registries via HTTP without an explicit install step. It's really not that different than doing regular Node.js development with a committed node_modules (hi, Google), except that if node.land or crux.land go down, you've lost your reproducibility.
The thing about "familiar/modern technologies" seems like superficial vanity. A vanilla Node.js equivalent might look something like this:
import {createServer} from 'http'
import {parse} from 'url'

const route = path => {
  switch (path) {
    case '/': return home()
    case '/about': return about()
    default: return error()
  }
}

const home = () => `Hello world`
const about = () => `About`
const error = () => `Not found`
// etc...

createServer((req, res) => {
  // route on the pathname, not the full parsed URL object
  res.write(route(parse(req.url).pathname))
  res.end()
}).listen(80)
Which is really not anything to write home about, nor an intimidating monstrosity by any measure. Serving cacheable HTML is really not rocket science; it simply does not require "the latest and greatest" anything.

> except that if node.land or crux.land go down, you've lost your reproducibility.
Dependencies are cached. This is no different from if npm would go down.
> The only semi-interesting thing here is that this demo pulls dependencies from 3rd party registries via HTTP without an explicit install step
Given that this seems interesting to you, it seems you haven't heard of Deno (https://deno.land). It is not related to Node in terms of environment; it's a completely separate runtime.
In regards to your Node example, this is fairly different: the dependency pulled in from deno.land is a wrapper around the built-in HTTP server, which does various error handling for you and simplifies the usage. The router isn't a simple switch statement either; it's a URLPattern-based (the web's version of path-to-regexp) minimal router. Comparing these to the Node built-ins isn't exactly a fair comparison, I would say.
Also on top of this, with node you need a configuration to get typescript working, then you need a package.json, etc etc.
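To make that concrete, here's roughly the minimal scaffolding Node typically wants before a TS/JSX file runs. File names are the conventional ones and the contents are illustrative only; a tsconfig.json with a "jsx" setting would sit alongside it:

```json
{
  "type": "module",
  "scripts": { "build": "tsc", "start": "node dist/server.js" },
  "devDependencies": { "typescript": "^4.5.0" }
}
```

With Deno, neither file is required; the runtime reads TypeScript directly.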
Caching dependencies is very different from general reproducibility. Committing node_modules guarantees that the app works even if the NPM registry were to implode. Try to deploy your deno thing from a cold state (e.g. maybe you're moving to a different AWS region or a different provider or whatever) while there's a deno.land outage and it will blow up. I'm actually curious what this caching story looks like for large cloud fleet deployments. Hopefully you don't have every single machine individually and simultaneously trying to warm up their own caches by calling out to domains on the internet, because that's a recipe for network flake outs. At least w/ something like yarn PNP, you can control exactly how dep caches get shuttled in and out of tightly controlled storage systems in e.g. a cloud CI/CD setup using AWS spot instances to save money.
These deno discussions frankly feel like trying too hard to justify themselves. It's always like, hey look Typescript out of the box. Um, sure, CRA does that too, and it does HMR out of the box to boot. But so what? There's a bunch of streamlined devexp setups out there, from Svelte to Next.js to vite-* boilerplates. To me, deno is just another option in that sea of streamlined DX options, but it isn't (yet) compatible with much of the larger JS ecosystem. </two-cents>
This is what's called an "analogy".
But your other points are valid.
Deployment for a vanilla node.js thing is as simple as adding `node index` as the entry point in your favorite provider (because they all have node.js images these days), I've had such a thing humming along for years. Again, it's really not rocket science.
We used to call those script tags back in the olden days...
I wouldn’t say you lost it, I’d say you never had it in the first place.
[1] https://remix.run/ [2] https://github.com/BuilderIO/qwik [3] https://markojs.com/
1. Get some data from a database/source.
2. Pass that data to a template/component.
3. Convert that template/component to HTML (using the given framework's version of renderToHTML()).
4. Return the HTML to your HTTP request.
For example, here's the SSR for my framework: https://github.com/cheatcode/joystick/blob/development/node/.... It blew my mind when I wrote it because I was under the impression it would be more difficult.
HTML. You're serving HTML.
Doesn't really matter that the server-side language is JS, PHP, or BASIC.
And unlike a pure static site, you can add API or form routes
I'm a huge fan of runtimes that reduce boilerplate and configuration, so that's what makes me most interested in Deno. What I'm most concerned about is that we're pushing the idea that Deno's approach to third party imports solves all the problems of npm et al. If we teach developers to think of third party and native libraries as equivalent, I think we're hiding a lot of problems rather than solving them, which could be even worse.
Interesting. I checked the docs on this and it’s not quite clear to me why this is needed in this case, or what the benefit is in taking this approach. Is this strictly a build time optimization, or is it necessary in this example?
I've been dabbling in Deno for a while now. Standard lib is there. Testing is there. All the packages I'd ever want are there. Linting, a strong style guide, and a documentation generator too.
And unlike other beasts, it feels so minimal and out of the way.
Deno is making JS development fun again. Major props. I hope Deno Deploy is a commercial success for the team.
Technically if you were doing this in Node, you would need at least a package.json and would have to configure your TS/JSX transpile, etc...
The goal was to showcase simple yet intuitive JSX + tailwind at edge, we didn't elaborate on more advanced use-cases like authenticated pages, API endpoints/forms, dynamic pages (location, etc...) or parametric routes.
The one I downloaded weighs in at 85MB. That is smaller than some popular smartphone apps.
As I understand it, deno is designed to be somewhat safer than nodejs.
I can edit and compile deno much easier than I can compile a popular web browser. Some popular web browsers are not meant to be edited or compiled by users. If there are things that users dislike about these programs they are powerless to change them.
The "web browser" is created and controlled by companies in the business of user data/metadata collection and advertising. AFAIK, the people behind deno are not engaged in such practices.
The "modern" web browser has become a Trojan Horse of sorts. Instead of a program for displaying and reading HTML, it has become a vehicle by which users indiscriminately/involuntarily run other peoples' programs, often written in Javascript, in a context where a third party, the commercially-oriented browser vendor, can quietly collect user data/metadata, mediate and manipulate user interaction.
Deno takes the Javascript interpreter out of the browser.
My test case was, basically reproducing something like
<?php echo $_REQUEST . "\n" . $_SERVER; ?>
and I was a little surprised how much convenience was baked into it and how you wouldn't have access to all that in other libs. That someone created an issue[1] makes me think it's not just me failing to look hard enough, and that it's actually tedious.

Also wondering if this can be done serverless-ly or requires something always on?
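For comparison, a rough Node equivalent of that PHP one-liner might look like this. The exact fields echoed are a judgment call, since `$_SERVER` has no one-to-one JS analogue; here the query string stands in for `$_REQUEST` and the request headers for `$_SERVER`:

```javascript
import { createServer } from 'http';

createServer((req, res) => {
  // Parse the query string relative to the request's own host
  const query = new URL(req.url, `http://${req.headers.host}`).searchParams;
  res.end(
    JSON.stringify(Object.fromEntries(query)) + '\n' +
    JSON.stringify(req.headers)
  );
}).listen(8080);
```

It is more code than the PHP version, which is arguably the point being made above: PHP bakes that convenience into the language itself.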
Why can’t we just run the example Deno program to generate snapshots of html?
It seems like some of us think pure static html is a good goal for some things, so why not use this Deno program to create the same html responses in generated files?
It’s probably the same amount of code because instead of writing a http response you write a file.
Of course you lose some functionality this way, but your app, your rules imo.
Meanwhile, I just created a JavaScript-free website.
Never have to worry about broken NPM, cookies, trackings, API, or JS-based malware.
And I use my iPhone/Android to edit/create web pages in Markdown, then my CI will build it and post it for me.
Look at the snazzy 3-D CSS, also JS-free.
Did I mention that I have a no-nonsense Privacy Policy?
3-D web page. https://egbert.net/blog/index.html
In iOS Safari, click the “CLI” link at the top, then swipe the page to the right to go back. If you do it slowly it works, but the first time I tried I did a regular flick-swipe from about the height of the page where the version number is. I was trapped in deno.land and couldn’t go back.
(Maybe that’s a bug in deno.land though, not deno.com?)
I guess Node.js could learn a lesson here.
Also note that Deno is an anagram of node :)
My man, let me introduce you to ... HTML. It has "time to interactive" at 0.0 seconds and content paints instantly!
As other posters have pointed out, why not do it in HTML from the start? It's more simple and efficient than this -or any- framework. Just drop the ol HTML file on your server and away you go!
I understand that the supposed "real" utility in this would be when you want to do JS-y things in HTML (auth, API, hand state, etc), but they don't show any of that on their showcase site...so...yeah.
I'm not sure, but isn't Deno (the service, not Deno the language) actually proposing a model - similar to Cloudflare, for example - where you have your infra somewhere and they only host the "edge", in a CDN, or spread around the world?
The browser still has to fetch and render the HTML. JS-heavy sites do tend to be slower, but no site has a 0s TTI/FCP.
How are the cache control headers with this set-up - is there a varnish or similar cdn/cache doing useful work (I'm assuming not, more importantly I'm worried pointing something like fastly at this will fail in caching static pages?).
http://mkws.sh/ uses the standard UNIX tools to generate `HTML` pages, featuring a powerful sh-based templating engine, https://adi.onl/pp.html. Dependencies and complexity are kept to a minimum. It does the minimum required, generating HTML pages and keeping duplication low, in approx. 400 SLOC.
"Many [devs] are more familiar with the Chrome DevTools console than they are with a Unix command-line prompt. More familiar with WebSockets than BSD sockets, MDN than man pages. (...) Many developers, we think, prefer web-first abstraction layers." -- https://deno.com/blog/the-deno-company
I've been doing JavaScript myself for about 15 years, unfamiliar with the UNIX philosophy (to be doubted, as any doctrine should be). I started doing web development using plain HTML the "old school" way; I personally dare to say the _normal_ way.
Before the rise of SPAs I very much agreed with the idea of progressive enhancement which is coming again into attention with the likes of https://turbo.hotwired.dev/.
While doing SPAs I always felt that stuff constantly didn't fit, that we were constantly using unfit tech, doing hacks for benefits of using a single (unfit?) language both on the server and the client, partial loads (faster loading times) and having a single codebase for all OSes. Stuff felt hacky most of the times and we were hiding those complex hacks under what _seemed_ as elegant and simple abstractions. But I believe most experienced JavaScript developers agree that the elegance and simplicity is mostly on the surface. I constantly felt dissatisfied with the code I wrote. I refuse to go on a full rant regarding SPAs and JavaScript but that's the gist of it.
While configuring my dev environment I stumbled upon the https://suckless.org/ guys. Their code embodies the UNIX philosophy well, although some people, including me, might say some of their stuff is too simple. Simplicity for the sake of simplicity is not a good idea (nothing for the sake of anything is a good idea, to be honest); it should rather be a consequence of your understanding of what's not really needed.
While investigating the UNIX world further, discovering OpenBSD and using it as a daily driver, things started to fit and make sense.
Now, regarding how mkws fits generating static sites, it mimics building a small C project except the Makefile is replaced by a shell script, so all the principles fit and are well established. pp is the compiler, .upphtml files are the sources, html files are the output binaries. Everything integrates and fits well. I feel satisfied about how everything works.
Code is small and simple, abstractions are kept to a minimum. I, as a single person, am able to investigate, understand and change every part of the generating process. Can't say the same thing about a JavaScript static site generator; you don't really need the v8 engine to generate a few HTML files, that's complex, and most of us agree that, as an industry best practice, simple is relatively good and complex relatively bad.
Regarding SPAs, I believe they were a quick solution until we properly solve the problems they solve via progressive enhancement.