React and its peers are simply tools for moving logic to the client. They are useful for apps that can run almost entirely without the server, or that need to handle offline situations.
All the related tools are optional extras, each there to improve something:
• transpilation for older browsers
• Tailwind CSS to manage styles (if you are a large team)
• Redux or similar to manage application state
For other kinds of applications, you can keep using the same pattern where the server has the main responsibility of generating the entire view.
In my case, I’m quite happy using htmx and avoiding frontend frameworks because usually I don’t develop offline apps.
This is text-book idealism.
Chicken, meet egg
1) Some front-end needs and/or frameworks were developed by companies whose scale and talent pool justified new tools tailored to their use cases (an extreme audience means addressing many user specificities, and a pool of the best brains on the planet means they will invent something, which doesn't help in coming up with a trivial solution). React, Flutter, AMP.
2) During the ZIRP era, coming up with some great tech was also a way to attract talent. Using such tech was (is) a way to be part of that group too. There is a trend effect. I wonder whether complex front-end tech stacks are justified when you can't hire the best talent and when you need to work faster to stay in business.
3) Micro-servicing all the things leads to using APIs everywhere, which leads to having consumers everywhere, including your front end. Coming back from this could mean using simpler apps and dropping some of the front-end complexity.
Probably more :)
Recall that what got investors excited about the Web in the 90's was inline images. They didn't respond to hypertext, but they were looking for a next thing after "multimedia" and saw opportunities. Next thing you know, they were writing thought-leader articles about "push content", imagining the TV-ification of the Web.
no universal import system: Practically all other languages have a universal import system.
minification, uglification, and transpilation: Many other languages are not just minified, but actually compiled to machine code or at least VM bytecode, and still handle source mapping, debugging and code references in stack traces better.
different environments: This point is the only one that partially applies, because the front-end is fixed to JS due to browsers. Non-browser front-ends exist though, e.g. native apps, and have no problems sharing code through libraries.
file structure: littering the root folder with config files is annoying but hardly a real issue.
Configuration hell: Works fine in other languages (not all of them, though)
Development parity: Works fine in other languages (not all of them, though)
Most backend applications are stateless and state management is outsourced to a database which does the heavy lifting. So the complexities are in scaling. Maintaining a complex frontend application is akin to maintaining a complex caching layer in front of your database.
The tooling hell doesn't help of course, but I wouldn't say it is the main reason.
And the .Net forms application would be immeasurably simpler in terms of complexity and as a bonus, would have the backend thrown in almost for free as well.
I deliberately picked .Net forms because, despite being much simpler than today's front-end stacks, it was still, much like any MS product, an overengineered, corporate-driven MS tech.
Something like Ruby on Rails, Laravel, etc. shows that the front end is not inherently complex.
Sweet.
When I was a RoR developer, RailsCasts would tell you to do basically the same thing as HTMX: return partial HTML and use a tiny bit of JS to update the appropriate part of the DOM.
It’s a good fit for simple experiences, but it breaks down when you need the result to update more than one place (say, a counter by the cart icon, or a set of options in a select in the sidebar). Then the hypermedia approach has forced you to parse your HTML snippet to pull out the relevant information, rather than getting it in a nice structured format.
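The multi-place-update case is easier to see in code. A rough sketch of the structured-data alternative: one JSON response drives updates in several spots at once. The endpoint, payload shape, and element ids here are all hypothetical.

```javascript
// Sketch: one action must update several places at once (the cart badge
// and a sidebar <select>). A JSON response is easier to consume than
// scraping data back out of a returned HTML fragment.
async function addToCart(itemId) {
  const res = await fetch(`/cart/items/${itemId}`, { method: 'POST' });
  const data = await res.json(); // e.g. { count: 3, shippingOptions: [...] }

  // Update place #1: the counter by the cart icon.
  document.querySelector('#cart-count').textContent = String(data.count);

  // Update place #2: the options in the sidebar select.
  const select = document.querySelector('#shipping-options');
  select.replaceChildren(...data.shippingOptions.map(opt => {
    const el = document.createElement('option');
    el.value = opt.id;
    el.textContent = opt.label;
    return el;
  }));
}
```

(For fairness: htmx does have mechanisms for updating multiple targets from one response, but then the server has to know about all of those targets.)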
Many applications are much more complex than a few simple forms.
I will say that some of the complications he mentions are from web apps needing to compile down to a single executable, on a platform that only really supports one interpreted language. Perhaps WASM will help here, over time.
Modern clients are complicated because state management is complex.
Ironically, that's what the early web largely got rid of - for example this web page I'm using now has very little state to manage.
Old-school desktop applications didn't really have deep linking; even today it is quite uncommon (except to trigger some action in the app, like opening a file, as opposed to navigation).
Once you start using compile-to-JS and get out of the JS ecosystem mess, the developer experience suddenly feels much less complicated.
The easiest approach I've ever worked with is vanilla JS. But of course, building complicated stateful apps without a view layer like React is its own complication.
E.g. supporting printers is just not an issue there. But maybe I misunderstood what you meant.
Because it's always morphing. Once upon a time, it was only desktop. Then it became mobile. Then the browser added "features" and websites wanted those "features". Video. Animation. Accessibility. Wasm. Then, where do we host websites? It used to be a desktop in a closet; then it became dedicated, containerized, serverless, server-side, client-side, monolith, micro-architecture. Information morphs, too. Tracking, telemetry, A/B testing, search, session state, data storage, AI.
Where do we go next?
Computers and computer networks were designed in a high-trust environment to facilitate free communication. They were pioneered by academics and military organizations, where only highly credentialed people ever touched anything. When the web went commercial in 1993, I think we started a Cambrian explosion of diversity in computation. I guess it wouldn't have gone as far without two decades of zero-interest money and VC-backed 'growth hacking'.
https://htmx.org/essays/how-did-rest-come-to-mean-the-opposi...
""" Some pushed through to Level 3 by incorporating hypermedia controls in their responses, but nearly all these APIs still needed to publish documentation, indicating that the “Glory of REST” was not being achieved.
JSON taking over as the response format should have been a strong hint as well: JSON is obviously not a hypertext. You can impose hypermedia controls on top of it, but it isn’t natural. """
Personally, the section "The Crux of REST: The Uniform Interface & HATEOAS" says it all.
Things have been quite brittle, requiring constant updating, or at least more than seems reasonable. Many devs don’t know a better or simpler time.
There seem to be some emerging options, whether it’s the Livewire-type technologies, or the most recent directions that libraries like Svelte, Flutter, Alpine.js and others have taken, providing most of the bang with less overhead.
We tried to move past this by going pedal to the metal with JavaScript, and now we are slowly realizing that, while we can do this, there are major scaling issues.
And so now the hot new thing is things like Svelte.
Rinse and repeat. And honestly, at this point I'm leaning more and more towards dart + flutter being the future, or maybe something else with strict unified standards that is designed with apps and multimedia in mind (like what Flash partially was).
Whatever happened to the interest in static site generators a few years back? It seemed like we were finally going to move away from heavy javascript just to show some text and images, and yet the javascript seems to be getting ever heavier.
I'm still interested... I forked the pug templating language in order to write my own tools since the existing ones didn't do what I want. Have been working on it on my free time.
1. Encapsulated GUI components are a non-negotiable requirement for most projects, even apparently "static" stuff like technical manuals (e.g. search widgets), but there is/was no widely accepted standard for server side UI components. The closest was things like JSP taglets, but that is now considered legacy technology, gone in favor of React SSR. Why? Well, because components are most useful when you can instantiate and mutate them, which means they're most useful on the client, but server side tag libraries lost all the componentization when crossing the wire leaving you with a "tag soup".
2. HTML and CSS, being as they are committee driven and implemented multiple times in security sensitive C++, evolve very slowly. In practice people's ideas about how to present content, even so-called "static" content, change faster than the standards can keep up, so people fall back to JS to cross the gap. But then you get the impedance mismatch that comes from mixing several different programming technologies together (HTML, JS/TS, CSS, C++), and the DX goes to hell.
3. Why is it a "mess", well that's mostly an unarticulated social preference. Developers seem to prefer open source bazaars on the frontend, even if it means a horrific DX in which solutions have to be stitched together out of lots of tiny half-abandoned libraries. This is probably a legacy of the churn of the 2000-2010 era in which many large, well thought out proprietary frontend app frameworks ended up being abandoned by their owners or experiencing severe strategic product management errors. Delphi, VB6, Flash, Silverlight, .NET WinForms, .NET WPF, Java Swing, JavaFX, GTK, (to some extent) Cocoa and Qt and so on ... these all provided much more seamless and coherent platforms for writing apps but ended up leaving stranded user bases behind after the backing companies didn't execute properly on sandboxing/security/deployment/cross platform support.
Or to put it crudely, a part of the reason people target the web is the lack of product managers involved in defining the platform. Chrome has them in theory but in practice a lot of stuff they add to the platform is deliberately unambitious incrementalism, and the web's enormous base of valuable but unmaintained content means they aren't able to break backwards compatibility on the core tech despite having huge budgets. The downside is that anything not provided by the base platform experiences the opposite effect where the gaps get filled by enthusiastic volunteers who don't plan together or even stick around very long.
There's nothing fundamental about this choice, as the wholesale adoption of iOS and cloud tech shows. Devs will buy into fully proprietary platforms in a heartbeat if it's convenient to do so, but this is partly because those vendors have proven to be quite good about backwards compatibility and incremental development: there has never been an "AWS 2" effort and Cocoa's evolution has been quite smooth from the NeXTStep days.
One way to escape the complexity of front-end development is to write for non-HTML platforms instead. Historically this was quite painful because most work gets done on desktops, but whilst desktop development could be quite pleasant desktop distribution was extremely painful. I've spent a couple of years working on that problem and it's now way easier than it once was (check out [2]) so deployment complexity for devs is increasingly no longer a concern. If you want to write in Jetpack Compose or JavaFX or Flutter, or indeed Electron, then you can do that and the whole deploy/update story has got nice and easy. The big gap that remains is certificate cost, but we're looking at fixing that by signing for you if your app fits inside a sandbox. We're scouting around to understand demand at the moment.
[1] https://docs.google.com/document/d/1oDBw4fWyRNug3_f5mXWdlgDI...
Whereas the platform (the web) and the languages (HTML, JS and CSS) do not offer a cohesive answer to those expectations. It's all just bits and pieces of improvement here and there.
Often the frameworks that aim to solve this use abstractions over these languages and APIs (e.g. the DOM, routing). And these abstractions (TS, JSX, React, CSS-in-JS, Tailwind, ...) are not a cohesive unit, and they bring back the same friction, or more, that's inherent to the web (three languages trying to play with each other) ...
... But this time with even more "parts" and abstractions.
Web technologies weren't designed to build web apps. And since a redesign/rewrite is off the table, we're content with small improvements that help but also increase friction.
We do need to rethink this.
JS, along with other things, was bolted on as an afterthought, because HTML had too much momentum for people to stop and consider a proper way to have sandboxed, portable applications, which is what WASM seems to be, after all this time.
And when there's a quirky platform underneath (HTML + JS), people invent a million opinionated ways to achieve the same thing, because there's no single right way to do it. And each comes with its own quirks.
"No universal import system" - Rust has its own module system, and Cargo is used for managing dependencies, so there's no need to worry about different module systems.
"Layers of minification, uglification, and transpilation." Just compile Rust to a WASM file for the browser, the same as any other compile target.
"Wildly different environments." Something you'll still need to deal with. Some runtime dependencies are system-specific: code running in the browser usually needs access to Web APIs and JavaScript, while code running on the server can't access Web APIs but can access the system clock and filesystem. Sometimes separate libraries or separate runtime configs are needed (e.g. a configurable time source).
"Overemphasis on file structure." Not a problem for imports, but you may still have file-structure dependencies for things like CSS, image resources, etc.
"Configuration hell." Pretty much non-existent once you have your Rust compiler set up locally.
"Development parity." Just use trunk: https://trunkrs.dev/, to watch, build and serve, config is minimal.
I don’t know which, if any, nodejs alternative will succeed it, but if say Deno were to do so, the stack would be immeasurably simpler.
Right now, hopping between 2 different JS projects, both of which do the exact same thing, means you may have to learn completely different build processes, completely different linting rules, completely different TypeScript compilers, completely different module import syntaxes/formats/configurations, completely different test runners and test-description languages, completely different standard libs (one may have lodash while the other imports individual functions from npm), etc.
Heck, even your npm may not be npm, but could be yarn, pnpm, etc.
I believe Node.js’s decision to essentially outsource all basic functionality while the JS ecosystem figured itself out was a huge reason for its success, but now that many things are more established, it’s causing a lot of unnecessary complexity.
this.parent.aList.push(item)

or

globalState.customersList.push(item)

Ideally I'd write it in JavaScript instead of using an API. I would accept

globalState(customerList, "push", item)

if calling a library function is the only way to trigger the UI-update code.

Edit: I'd add an extra nuisance that could be solved by a better, more straightforward syntax.
I have to work with code like this:

store.js:

    import * as model from './modules/model'

component.vue:

    import { mapState } from 'vuex'

    computed: {
      ...mapState('model', ['model']),
    }

    this.$store.dispatch('model/method', {...})

which calls "method" defined in "model", and which is very convoluted compared to what we are used to in other languages:

    import Model from `store/model`
    Model.update(args)
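For what it's worth, the direct style the comment above wishes for — mutate state, have the UI notified — can be sketched with a Proxy. This is illustrative only, not the API of Vuex or any particular library:

```javascript
// Minimal reactive store sketch: direct mutations on `state` notify
// subscribers, no dispatch strings or mapState wiring needed.
function createStore(initial) {
  const listeners = new Set();
  const handler = {
    get(target, key) {
      const value = target[key];
      // Wrap nested objects/arrays so deep mutations (e.g. push) are seen.
      if (value !== null && typeof value === 'object') {
        return new Proxy(value, handler);
      }
      return value;
    },
    set(target, key, value) {
      target[key] = value;
      listeners.forEach(fn => fn()); // trigger the UI-update code
      return true;
    },
  };
  return {
    state: new Proxy(initial, handler),
    subscribe(fn) { listeners.add(fn); },
  };
}

// Usage: exactly the syntax asked for above.
const globalState = createStore({ customersList: [] });
globalState.subscribe(() => { /* re-render here */ });
globalState.state.customersList.push({ id: 1 });
```

Real libraries (Vue 3's reactivity, MobX) are built on the same trap mechanism, with much more care around caching and batching.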
I often think that framework and library authors don't try hard enough to build tools with simple interfaces. That reminds me of Erlang/Elixir's handle_call/handle_cast.

Some of it is to optimize code delivery, so you're sending just the bare minimum source code and not wasting the user's bandwidth.
Of course, if you had just one newest browser, you could do away with most of it, but at the end of the day you have to make sure your frontend can run everywhere, including mobile devices; hence the complexity.
The amazing part is that ever since Vite has been on the scene, a lot of it has been abstracted away. There is no need to even compile anything during dev, which has been a game changer.
I wish I could describe my most recent attempt at migrating our app to the new Next 13 app router for an audience, on camera, on stage. The levels of confusion and dead ends, and configuration, and error screens, and the need for truly expert-level knowledge just to get things working as one would expect made me realize there's just no way this can survive as it currently stands. It's all an abomination. React is dead. FE is dead.
Please just give me back a simple React.renderToString mounted into an Express wildcard route, hooked into React Router. All of these perf concerns are for the .0001% of people who even notice this shit, or need things to run so ideologically fast that they're willing to throw out every bit of common sense in service of an abstraction that is DOA as soon as you use it to do anything complicated at all, or apply it to an existing codebase.
Not only are they apps, they’re networked apps with the necessary remote process calls (APIs) and all the added complexity that entails.
That said, one of the major problems is that there is no clear explanation of these boundaries in the tooling and code. Yes, you can `npm install lib`, and unless it is documented in the readme, it won't be obvious whether it will run in Node only or in the web too.
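Lacking anything declarative, libraries usually fall back to runtime checks to make that boundary explicit. A sketch of the common pattern (names are illustrative; package.json "exports" conditions are the more declarative fix):

```javascript
// Detect where we are running. Neither global exists in the other runtime.
const isNode =
  typeof process !== 'undefined' && !!(process.versions && process.versions.node);
const isBrowser =
  typeof window !== 'undefined' && typeof window.document !== 'undefined';

// Pick a capability per environment instead of assuming one exists.
function storageBackend() {
  if (isNode) return 'fs';           // safe to touch the filesystem
  if (isBrowser) return 'localStorage';
  return 'memory';                   // workers, exotic runtimes, etc.
}
```

It works, but it is exactly the kind of invisible contract the comment above complains about: nothing in `npm install` tells you which branch you'll end up in.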
And as developers, we tend to like the next new thing, and complexity.
Then browsers improve but now you’ve got all that legacy stuff in there.
Plus, of course, some silly corporate is still on Internet Explorer 3.0 or whatever.
It would be interesting if those studying biological evolution could see how much of their techniques, theories, and predictive abilities could apply in this realm.
What I did is remove all linters, like eslint, prettier, etc., because they disagreed about syntax. The point is for each commit to be a minimal change that fixes something or adds a feature, not for most commits to be about fixing syntax.
I also merged all branches into one main branch, because merging between branches took a lot of time.
What if there were no build step? DHH has some nice ideas about that:
https://world.hey.com/dhh/you-can-t-get-faster-than-no-build...
Let's make progress towards less complicated stacks.