If I understand it correctly, it allows you to achieve reactive data flow in a single page app without any boilerplate.
Meaning - you update the database on the server and all the relevant UI(s) will automatically receive the updated data and re-render only the parts of the UI that display that data.
This would require a ton of PHP and Javascript dealing with networking, websockets, routing, serializing data and so on.
Haven't tried it yet, but very curious to see if it works.
> This would require a ton of PHP and Javascript dealing with networking, websockets, routing, serializing data and so on.
I don’t know how it was implemented, but Quora had this back in its early days (~2012). There was some mechanism that "remembered" which table rows were used to generate which UI component, and when that data changed you got a live reload in your browser. That was really cool to see at the time; I’d love more background on its implementation.
I really don't like that idea. It seems to me inefficient and error prone.
Let's say you're updating a database. You add some records to one table, and you update some records in some other tables.
If the program automagically updates the UI, then on the first change it will attempt to update the UI (causing lots of processing), then on the second change to the database it will update the UI again, and on and on for each change.
Wouldn't it be better to make all your changes to the database and only then run an updatePage() function that updates the web page?
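For what it's worth, reactive frameworks usually solve exactly this with batched notification: writes inside a transaction are coalesced and observers fire once. A minimal sketch of the idea in Python (all names invented):

```python
# Hypothetical sketch: batching several writes into one transaction so
# observers are notified once, not once per change.

class Store:
    def __init__(self):
        self.rows = {}
        self._listeners = []
        self._depth = 0          # nesting level of open batches
        self._dirty = False

    def subscribe(self, fn):
        self._listeners.append(fn)

    def write(self, key, value):
        self.rows[key] = value
        self._dirty = True
        if self._depth == 0:     # unbatched write -> notify immediately
            self._flush()

    def batch(self):
        return _Batch(self)

    def _flush(self):
        if self._dirty:
            self._dirty = False
            for fn in self._listeners:
                fn(self.rows)    # one "updatePage()"-style callback

class _Batch:
    def __init__(self, store):
        self.store = store
    def __enter__(self):
        self.store._depth += 1
        return self.store
    def __exit__(self, *exc):
        self.store._depth -= 1
        if self.store._depth == 0:
            self.store._flush()

calls = []
s = Store()
s.subscribe(lambda rows: calls.append(len(rows)))
with s.batch():
    s.write("a", 1)
    s.write("b", 2)
    s.write("c", 3)
# the listener ran exactly once, after all three writes
```

React, re-frame, and most signal libraries do this kind of batching automatically, which is why the per-write re-render usually doesn't happen in practice.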
"Sounds really slow and chatty, right? Actually, NO!
This is not RPC or ORM. The key is to make the language, compiler and runtime in charge of the network, like the JVM owns the heap. Idealized client/server network IO (better than could ever be coded by hand) is an explicit design goal.
How does it work? Functional programming:
- `photon/defn` is a macro that compiles Clojure syntax (s-expressions) to a dataflow signal graph (DAG).
- The DAG is lifted and compiled into Missionary reactive signals. Missionary manages reactive execution (incremental maintenance such that a small adjustment to inputs results in a small adjustment to outputs)."
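A toy illustration of the incremental-maintenance idea in Python (this is not Missionary's API, just the general shape of a dataflow DAG in which a small input change triggers a small recomputation):

```python
# Invented names throughout: each node in the graph recomputes only when
# one of its inputs actually changes, and propagation stops as soon as a
# node's output is unchanged.

class Signal:
    def __init__(self, value=None):
        self.value = value
        self.dependents = []

    def set(self, value):
        if value != self.value:          # unchanged writes propagate nothing
            self.value = value
            for d in self.dependents:
                d.recompute()

class Derived(Signal):
    def __init__(self, fn, *inputs):
        super().__init__()
        self.fn, self.inputs = fn, inputs
        self.evaluations = 0             # count recomputations for inspection
        for i in inputs:
            i.dependents.append(self)
        self.recompute()

    def recompute(self):
        self.evaluations += 1
        out = self.fn(*(i.value for i in self.inputs))
        self.set(out)                    # propagates only if the value changed

a = Signal(1)
b = Signal(2)
total = Derived(lambda x, y: x + y, a, b)    # total = a + b
doubled = Derived(lambda t: t * 2, total)    # doubled = total * 2

a.set(3)        # small input change: each downstream node updates once
b.set(2)        # no-op write: nothing recomputes
```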
This sounds absolutely terrifying from a security perspective...
What's to stop a malicious client from broadcasting code that deletes my entire database?
> code
Some overlap, but these are essentially two different things.
> What's to stop a malicious client from broadcasting code that deletes my entire database?
Your backend.
Unfortunately (or fortunately), it's also a cornerstone of computing in general.
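Concretely, "your backend" here means the server never evaluates arbitrary client input; it only resolves requests against operations it explicitly exposes. A hedged sketch of that allowlist pattern (names invented):

```python
# Whatever the client sends over the wire, the server looks it up in an
# explicit allowlist of operations. Code arriving from a client is data to
# be matched, never something to be evaluated.

ALLOWED_OPS = {
    "todo/create": lambda db, text: db.setdefault("todos", []).append(text),
    "todo/list":   lambda db: list(db.get("todos", [])),
}

def handle(db, op, *args):
    if op not in ALLOWED_OPS:
        raise PermissionError(f"operation not allowed: {op}")
    return ALLOWED_OPS[op](db, *args)

db = {}
handle(db, "todo/create", "write docs")
rejected = False
try:
    handle(db, "db/drop-all")       # malicious request
except PermissionError:
    rejected = True                 # never reaches the database
```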
For those working in Typescript, Blitz.js seems to do a great job at drastically decreasing the plumbing you have to write to shuttle data between Postgres and React. There’s also a ton of goodies like auth built in. From my first impression it’s the closest the JS community has ever come to a Django, and that’s very high praise in my book.
I wish there was somewhere to follow updates that wasn't Twitter though.
This project looks very cool! I like the focus on composition, Meteor was lacking that (and really, most other frameworks do as well).
The LiveView-led resurgence in server-side rendering is exciting. Does anyone have any insight as to why ShareDB never really took off?
Then, specifically for Derby/ShareJS, they didn't put enough resources into the project to make it good enough compared to the alternatives.
It's really too bad they bound it so tightly to the framework, as I think there's a chance it could have succeeded as a language in itself. But these reactive shared-code things never seem to work out.
I think what we are seeing with these tools to make data synchronization in the frontend more invisible will continue to proliferate.
I am looking forward to the next, rich landscape of interactivity on the web powered by WASM, WebGL, etc.
All of which will likely be a broken mess on iOS, thanks to Apple.
However I think it’s missing the point of de-coupling. Security would be very hard to reason about, as would handling of intermittent network connections when the real structure of the client and server are abstracted away from you.
Ultimately I think GraphQL with live queries is the best model for this type of reactive work. You get a decoupled client/server, reactivity, support for mobile clients as you have an API, as well as full type-safety on the client.
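A stripped-down illustration of the live-query model in Python, with invented names (a real GraphQL live-query setup involves a schema, resolvers, and a transport, none of which are shown here):

```python
# A "live query" re-runs on the server whenever a mutation lands, and the
# new result is pushed to the subscriber only when it actually differs.

class LiveQueryServer:
    def __init__(self, data):
        self.data = data
        self.subs = []                  # [query_fn, callback, last_result]

    def subscribe(self, query_fn, callback):
        result = query_fn(self.data)
        callback(result)                # initial result, like a normal query
        self.subs.append([query_fn, callback, result])

    def mutate(self, fn):
        fn(self.data)
        for sub in self.subs:
            query_fn, callback, last = sub
            result = query_fn(self.data)
            if result != last:          # push only real changes
                sub[2] = result
                callback(result)

pushed = []
server = LiveQueryServer({"users": [{"name": "ada", "active": True}]})
server.subscribe(
    lambda d: [u["name"] for u in d["users"] if u["active"]],
    pushed.append,
)
server.mutate(lambda d: d["users"].append({"name": "grace", "active": True}))
server.mutate(lambda d: None)           # irrelevant mutation: nothing pushed
```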
Nonetheless I applaud the creativity on display here and I hope I’m proven wrong. Maybe this will be the next paradigm shift? Who knows
At first I tried Macromedia ColdFusion, later acquired by Adobe. Now that I've checked on it, it seems to be going strong, to my surprise. It was too hard for me, and it was closed software: there was no way for me to learn it without spending money on it. And it wasn't what I was really looking for.
But I just needed something simple. Something to tinker with. So I found PHP. It was exactly what I needed at the time. Later I also found MySQL.
The amount of garbage required to build a single website is enormous. So enormous that we have come full circle and people have started using static site generators to create pure HTML sites, because of speed and a few other reasons.
So I got to thinking: why is it that we're building webshops with all these open source technologies, with a huge amount of overhead and "bloat", when all you really need is a few simple things?
Well, as the creator said, there are a lot of unknowns, a huge learning curve, etc. But that's how Linux got started: as a tinkering platform. I really think this is the right path to take: making an open source web programming language that handles all those needs directly, built in. I totally agree with the philosophy and, if you will, the proposed abstraction of the problem at hand.
But it also makes me shiver to look at the code and not understand it. So much to learn. But it gives me hope to see that others have come to the same conclusions. Looking forward to hearing more!
Once that lightbulb of the power of a Lisp goes off, there's no turning back. As Eric S. Raymond said, "Lisp is worth learning for the profound enlightenment experience you will have when you finally get it; that experience will make you a better programmer for the rest of your days, even if you never actually use Lisp itself a lot."
There are concrete similarities to PHP, which is effectively what this server/client macro setup gives you at a superficial level. However, from what I'm seeing, this will behave more like a LiveView or Hotwire, with the focus being on optimizing network requests in an automated manner.
My big question with the tool is this: if you're passing environments back and forth, how secure can this feasibly be? Is there an automated limit on what will be considered based on the generated code? How do you handle malicious environments?
FoxPro (& dBase) is a realization of the concept: To deal with databases, you need a database language.
There are a lot of minor things that our supposedly "general-purpose" languages lack the moment you need to deal with certain niches: from very small stuff such as missing decimal, date, currency, and unit types, to the lack of a simple way to transform data, ad-hoc queries, ad-hoc data validations, relationship modeling, etc.
Even if you say "LINQ! ActiveRecord! functional! Lisp! pandas!" and the rest, all of them are a shadow of what the dBase family provides.
How far did it go? I never had to worry about all the stuff everyone worries about today (injection? ORMs? impedance mismatch? reactivity? <- an over-complicated patch on top of languages unfit for it, so kudos for this idea!). That is what made me put some time aside to build a language in that spirit, because there are so many details that just aren't available if the language isn't designed with data (of the business kind) in mind.
So, in short, most languages, even PHP, Python, Ruby, etc., are not that good for web programming (and are worse for database programming!); they're just not that terrible, either.
These are problems with any framework, but the more all-in-one a framework attempts to be the harder it is to get in between the joints with your glue gun to fix things up.
That said, this is Clojure and usually you have pretty easy access to all the intermediate bits and bobs and macros so maybe it’ll be great.
- A system that starts with our database schema
- a language in the front-end that abstracts away server connection and db access
- this imaginary language should allow defining react-like components but treat the db as a local datastore
- Most clients are UI structure interpolated with that user’s data queried from the central db. So this imaginary front-end language should allow querying user-level views of the db
Most of what I read in the post looks like a realization of this dream.
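A rough sketch of that "db as local datastore" idea in Python (every name here is invented): a component issues what looks like a local query, and keeping the replica in sync with the central db is the runtime's problem, not the component's:

```python
# The component below only ever touches a local replica; in the imagined
# framework, the runtime would keep this replica synced with the server
# and scope it to the current user.

local_replica = {
    "orders": [
        {"id": 1, "user": "ada",   "total": 30},
        {"id": 2, "user": "grace", "total": 99},
        {"id": 3, "user": "ada",   "total": 12},
    ]
}

def query(db, table, **where):
    """Tiny local query helper: rows matching all given column values."""
    return [row for row in db[table]
            if all(row[k] == v for k, v in where.items())]

def orders_component(user):
    # reads look local and synchronous; no fetch/await/endpoint in sight
    rows = query(local_replica, "orders", user=user)
    return [f"order #{r['id']}: ${r['total']}" for r in rows]
```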
> "a language in the front-end that abstracts away server connection and db access"
Optimal data layouts for storage and for processing/presentation can be quite different, and I don't think mapping one to the other can be implemented efficiently in an automatic fashion. I've tried different frameworks that claim to achieve it, but at some point you always hit a wall. As a result, I long ago abandoned those attempts and now do manual transformations in code, optimized for my particular situations.
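The kind of manual transformation described above might look like this: storage-friendly normalized rows reshaped into the nested structure a UI actually renders (illustrative Python with invented data):

```python
# Storage layout: two flat, normalized tables joined by author_id.
authors = [{"id": 1, "name": "ada"}]
posts = [
    {"id": 10, "author_id": 1, "title": "Dataflow"},
    {"id": 11, "author_id": 1, "title": "Signals"},
]

def for_presentation(authors, posts):
    """Reshape normalized rows into the nested tree the UI renders."""
    by_author = {}
    for p in posts:
        by_author.setdefault(p["author_id"], []).append(p["title"])
    return [{"name": a["name"], "posts": by_author.get(a["id"], [])}
            for a in authors]

tree = for_presentation(authors, posts)
```

The point of doing this by hand is that you control exactly which join runs where and when, which is what automatic mappers tend to take away.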
Clojure and Clojurescript are the same language (more or less). They just target the JVM or Javascript.
The author is mixing the frontend (Clojurescript and Reagent) code with the backend (Clojure and Datomic) code in the same expression. Then through their magical system and the beauty of lisp, they pull the frontend and backend parts out to serve them separately.
I would say that I am very happy with the FE stack of reagent / reframe at the core. I have long chased the dragon of co-located queries ala GraphQL instead of basic re-frame subscriptions, or redux.connect and pulling fields off a map. Obviously having the ability to be more expressive with data queries is great, but in reality I have come to settle on basic subscriptions into maps, syncing data into my db via events. It's not super pretty but it scales!
This seems like it's trying to push the needle, and I'm willing it on.
Well done, @dustingetz!
Build systems are increasingly doing the heavy lifting of telling the backend how to deliver a pre-hydrated frontend, and telling the frontend how to speak the backend's language.
It would be nicer if this were just types and schemas, so you didn't have to use a full-stack framework to get full-stack accelerations and linting.
The runtime figures out an efficient and reactive way of parceling out work to the server as needed, and refreshing it only when necessary.
From https://book.fulcrologic.com:
"The core ideas are as follows:
Graphs and Graph Queries are a great way to generalize data models.
UI trees are directed graphs that can easily be "fed" from graph queries.
User-driven operations are modeled as transactions whose values are simply data (that look like calls).
Arbitrary graphs of data from the server need to be normalized (as in database normalization):
UI trees often repeat parts of the graph.
Local Manipulation of data obtained from a graph needs to be de-duped.
Composition is King. Seamless composition is a key component of software sustainability."
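A sketch of the normalization idea in Python (not Fulcro's actual implementation): a nested UI tree is flattened into tables keyed by ident, so an entity repeated across the tree is stored exactly once and local edits are automatically de-duped:

```python
# Invented names: "ident" here is a (table, id) pair, and the only nested
# relation handled is "author", enough to show the shape of the technique.

def normalize(entity, table, tables):
    """Store entity in tables[table] and return its ident."""
    ident = (table, entity["id"])
    flat = dict(entity)                          # don't mutate the input tree
    if "author" in flat:
        flat["author"] = normalize(flat["author"], "person", tables)
    tables.setdefault(table, {})[entity["id"]] = flat
    return ident

ada = {"id": 1, "name": "ada"}
ui_tree = [
    {"id": 10, "title": "Dataflow", "author": ada},
    {"id": 11, "title": "Signals",  "author": ada},   # same author repeated
]

tables = {}
roots = [normalize(post, "post", tables) for post in ui_tree]
# both posts now point at one shared person entry via its ident
```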
[1] https://imba.io
So now there's a React hook (useDeno) that takes a callback that is only executed on the server-side, and the returned value is sent back to the client side transparently.
To my taste, all of that is way too overcomplicated. I don't know why we need to make writing and maintaining web pages more complex with every year that goes by. To my mind, this industry looks completely derailed.
This lets you write a function where some of the code runs on the client, some of the code runs on the server, and the compiler figures out which and emits the network RPC calls for you.
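A crude Python analogue of that split, with invented names: mark a function as server-side and route calls to it through a simulated network hop, while the calling code stays oblivious. Photon does the equivalent at compile time over a dataflow graph rather than with a runtime decorator:

```python
# Pretend the decorated function only exists on the server; the client
# calls it like a plain function and a simulated RPC happens underneath.

rpc_log = []

def server(fn):
    def rpc(*args):
        rpc_log.append((fn.__name__, args))   # the "network" call
        return fn(*args)                      # executed "server-side"
    return rpc

@server
def query_users(min_age):
    data = [("ada", 36), ("grace", 45), ("joe", 12)]   # stands in for the db
    return [name for name, age in data if age >= min_age]

def render(min_age):                          # "client-side" code
    names = query_users(min_age)              # looks like a local call
    return "\n".join(f"<li>{n}</li>" for n in names)

html = render(18)
```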
I'm thinking about an autocomplete that, on new user input (needle='ad'), filters the previous result from the server (needle='a') on the client, before the server returns a new response for the new input (needle='ad').
Essentially, can inner parts of an expression update even when they are somewhat dependent on reactive data from the server that comes from their parent expression?
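One way to picture the scenario, as a hedged Python sketch (invented names): when the new needle merely extends the old one, the client can narrow the cached result locally while the authoritative response is still in flight:

```python
# Narrowing is sound only when the new needle extends the old one: every
# true match for "ad" is necessarily among the server's matches for "a".

def local_narrow(previous_results, needle):
    return [r for r in previous_results if needle in r]

cached = {"needle": "a", "results": ["ada", "adder", "alan", "grace ada"]}

# user types 'd'; show an interim, locally-filtered result immediately
interim = local_narrow(cached["results"], "ad")

# ...later, the authoritative server response for "ad" replaces the guess
server_response = ["ada", "adder", "grace ada"]
cached = {"needle": "ad", "results": server_response}
```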
That's not the kind of apps I want to build. I want workspaces where I can make and edit and work freely. I don't care to be online to do it, and conserving bandwidth is not a constraint that should define how I use it.
The DAG goes from me to me.
We kinda have that with Livewire[1] and Inertia[2], and as awesome as they are (no separate API, etc.), they also suffer from the “magic”.
[1] https://laravel-livewire.com/ [2] https://inertiajs.com/
https://en.wikipedia.org/wiki/Fallacies_of_distributed_compu...
I think more and more the lines are blurred between open source projects, and products. This is good and bad.
It's good because people making money out of open source projects probably means more open source projects, more support available, and a healthier tech industry.
It's bad because as someone with no intention of turning a few open source libraries into a full time job, there's still an expectation of a certain level of polish to them that makes more sense for products. Open source projects with clever names, logos, mission statements, a domain name and marketing site/landing page, marketing copy, flashy documentation, a live preview environment, etc. These are all a lot of work for an open source project, but the stakes are raised to this level by the productised open source projects that can afford to fund this sort of thing.
This is true, but I also think it's worth evaluating whether the line along which we've been decoupling applications is the right one. Typically, the line of demarcation has been the client/server boundary, for a bunch of reasons: security considerations (it's also a trust boundary) and a different computational environment. This split has reinforced itself with the organization of companies into frontend and backend teams.
But there are a bunch of things that it makes a pain-in-the-ass. I've encountered this most with data-heavy apps, where I want to do some analysis on the server (e.g. Python/Pandas), but I want low-latency recalculations on the client in cases where the data is small enough. Doing it “right” requires implementing the same data-level logic in both Python and Javascript. Nobody has time for that, so we end up in our current world of laggy janky SaaS that needs to run off to the server every time you click a button.
Which is to say, I'm excited that people are looking at alternatives (not just this; LiveView was on HN yesterday as well).
Do we build the client-side on the server, and render? Why, no, that would be PHP.
Let's build the client-side in the browser, and rig a series of complex code-generating primitives disguised by the beauty of a language, to AJAX our way to a presumably good-enough solution.
I think it's a serious question whether apps built with this are more performant, easier to use, and so on, than the equivalent PHP approach.
Maybe I'm misunderstanding this project but it seems like abstractions on top of abstractions and has little to do with being a "web language". That was PHP, for better or for worse.
That's amazing!
The model is this: run-once-and-die + build-it-on-the-server.
Those two ideas are extremely powerful, a little like immutability, in that they rule out a huge number of issues. The issue with PHP wasn't this model; the model just became associated with the morass of amateurs using the language. A shame.
Case in point: PHP worked well, but frameworks like Rails and Django became very attractive because they offered more and used languages that could be good outside of their niche.
And what did the PHP community do to stay relevant? They developed great frameworks and improved the language's non-web capabilities.
Turns out the web moves fast, and coding a web app is more than web programming anyway.
I don’t recall Clojure being a web centric language like PHP, ASP, etc.
Clojure (JVM target) can do back-end web as well as Java can, as well as anything else that Java can.
Clojure is a general-purpose programming language, so it can do pretty much anything.
)))))))))))
Maybe I’ve been in Node.js land too long, but I don’t get why this is better for my productivity or my ability to create efficient web apps.
Also, I can’t edit it right now, but would like to apologize for saying it’s “insanely hard to read”. That was rather harsh and uncalled for.