I would convince my boss to pay for a service that provided easy-to-use React components and implemented all the flows for inviting users to a workspace, sharing by email or link, listing who can access a resource, sending invitation emails, etc., so I could focus on the damn app!
Doing this again, and again, and again....
...truly sucks.
After not working on a codebase for a year or two, I usually need to set everything up again -- and often run into very stupid/pointless issues that waste an hour, and often much more, just to get to the point where I can start actually coding.
This is pretty much why I've stopped coding. I don't have the time to waste any more.
Nix, NPM, Brew, Pip, etc. all have basically the same blind-trust security posture and thus should not be trusted. I generally suggest Debian in a container: a dual-use dev/compile container made of signed, vetted, reproducibly built dev/build/debug dependencies.
Edit: And if you know, how does it differ from devbox?
I cd to a project and run "make shell" and I am back in a Docker container with every tool, the current working directory mounted with permissions mapped 1:1, scripts in PATH, etc.
The only tools I need on a dev machine are an editor, a browser, a container runtime, and make.
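For anyone curious what such a target could look like, here's a minimal sketch of a Makefile rule under some assumptions of mine: the image name `myproject-dev` is hypothetical, and the UID/GID mapping is one common way to get the 1:1 permissions mentioned above.

```make
# Hypothetical image name -- build it from a Dockerfile kept in the repo.
IMAGE ?= myproject-dev

# Drop into the container with the working directory mounted and the
# host UID/GID mapped so file permissions line up 1:1.
shell:
	docker run --rm -it \
		--user "$$(id -u):$$(id -g)" \
		-v "$(PWD)":/work -w /work \
		$(IMAGE) bash
```

Putting a `bin/` directory from the mounted project on PATH inside the image covers the "scripts in path" part.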
That was done using VM images that he owned and maintained, which was not optimal for me, but there was also documentation laying out the steps needed to create the environment so it was possible to maintain a bare-metal clone of the official environment and use that. Good times!
Everything is in there. Every library and piece of software, every config and preference. User creation (minus any sensitive data). Everything.
Made a bootstrap script I run on a fresh install, and then I can push the Ansible config onto it from a Raspberry Pi Ansible server on my LAN.
Wanted a different file system in December. Wiped the system, changed filesystems, set everything back up. Took less than an hour.
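The flow described above can be sketched roughly like this -- the playbook and inventory names are my own placeholders, not the commenter's actual setup:

```shell
# On the freshly installed machine: just enough for Ansible to reach it.
sudo apt install -y openssh-server python3

# Then, from the Raspberry Pi control node, push the whole config:
ansible-playbook -i inventory.ini site.yml --ask-become-pass
```

Everything else (packages, configs, user creation) lives in the playbook, which is why the rebuild is so fast.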
Seems like you never got on board the declarative infra-as-code boat. It's actually rekindling my software flame: with things like docker-compose and NixOS, everything you ever work on can be stored as code and set up with zero effort.
Edit: Last attempt https://M3O.com
Trivial example: libfftw3 has --enable-threads and --enable-float configure/compile time options. Your project may need/want both, one or neither of these options (it may, in fact, want two different versions of the library).
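To make the pain concrete: if I remember FFTW's build correctly, float and double precision are separate builds, so getting both variants means two full configure/make passes over the same source tree (the library names in the comments are what FFTW conventionally produces):

```shell
# Pass 1: double precision, with threads.
./configure --enable-threads && make && make install
# -> libfftw3, libfftw3_threads

# Pass 2: single precision, with threads -- a completely separate build.
make distclean
./configure --enable-float --enable-threads && make && make install
# -> libfftw3f, libfftw3f_threads
```

Now multiply that by every compiler, stdlib, and flag combination your project and its dependents care about.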
Rust "proposes" to deal with this by essentially compiling its libraries as a part of building your project (ie. "vendoring" every 3rd party library dependency).
Most C++ developers don't consider vendoring in this way to be particularly sane, even though many of us do it anyway.
Crabs in a bucket.
I have a sea of documents, both physical and electronic, and it's always a struggle to scan/organize/find them. I'd pay good money for a software/service that manages my documents, from scanning to archiving.
This project is not dead! Rather, this project is now maintained as a community effort at https://github.com/paperless-ngx/paperless-ngx, a direct continuation of this project and drop-in replacement. Please see their project page and the documentation at https://docs.paperless-ngx.com/ for details.
I really like Paperless-ngx and have been using it for some months now, mostly for scanned paper mail. For scanning I use [0]
[0]: https://apps.apple.com/in/app/scan4paperless/id1629964055
Countless inane layers of overengineering no one can reason about anymore.
Fast forward to the present: after noticing that the project is still going strong, I did a deep dive into it and got really impressed. I think I will be moving my personal projects from the unmaintained Ansible mess I created to Dokku.
All while not knowing a lot about devops at all.
So much hassle to support it when that shit breaks.
Also, to add to your list: Railway and Vercel are super great too.
So you've found a problem -- now build a solution. :)
In the ancient times known as "the 90s" you could find PC technicians who would run anti-malware, dig into the registry, and generally tune your computer while leaving everything more or less as it was.
Those technicians are long gone, replaced by people who make easy money by wiping your computer clean and putting your old files in a folder called "Backup".
I'd gladly pay someone to do it right, but I just can't find anyone. So whenever I visit my parents I know there will be a parade of slow devices waiting for me to tune them.
There were fairly cheap commercial options even then, but that shop had a bad "don't buy what you can build" attitude combined with a maddening shyness about making any other use of the things they'd built. Having written our own presort software, we couldn't then sell it or even open-source it, because that would be "a burden distracting from the core mission of the business."
I read this in 1999
https://philip.greenspun.com/panda/
and came to the conclusion that the basic need for a "web framework" was a system of authentication that did what most commercial sites do: let people create new self-service accounts with email verification and all of that. That was the essence of the Tcl-based framework Phil Greenspun was pushing, but I didn't like Tcl, so I wrote something in PHP that was meant to integrate with "best-of-breed" PHP applications (install the authentication system, then modify various applications to use your authentication instead of their own), as opposed to the "PHPNuke" approach popular in the industry, which was "install some portal with worst-of-breed implementations of most of the functionality you think you need".
What I found baffling was that nobody cared about authentication frameworks until they became something that worked "as a service" about 10 years later, which is silly for so many reasons, not least that a company offering a service like that is going to either run out of money and shut the service down or get acqui-hired and shut the service down.
- 3 or more input directories which have specific roles like "main archive, prioritize", "temp folder, remove from here first"
- multiple levels of equivalence test, including file name, exif tags, checksum, perhaps perceptual hash (e.g. for flagging downscaled images to be deleted)
I usually value two things: one, which channel first made the user aware of the product (could be organic, FB, SEM, Instagram, Reddit, Hacker News, Product Hunt, anything), and two, which channel was responsible for the final conversion event. First touch is important for figuring out where to get more users at a lower cost, but most products in the market only care about the last touch -- they tend to overwrite the first one. So: store the JSON at the time of generation in your own database, map it to the user ID, and use it for internal calculations.
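The write-once-first-touch, overwrite-last-touch rule is tiny in code. A sketch, with a plain dict standing in for the commenter's "your own database" and made-up UTM values:

```python
from datetime import datetime, timezone

# Hypothetical storage: user_id -> attribution record, kept in your own DB.
attribution = {}


def record_touch(user_id: str, utm: dict) -> None:
    """Store the first touch exactly once; keep overwriting the last touch."""
    touch = {"utm": utm, "at": datetime.now(timezone.utc).isoformat()}
    rec = attribution.setdefault(user_id, {"first_touch": touch})
    rec["last_touch"] = touch


# First visit via Hacker News, conversion later via a paid Google click:
record_touch("u1", {"utm_source": "hackernews"})
record_touch("u1", {"utm_source": "google", "utm_medium": "cpc"})
```

After both calls, `first_touch` still says Hacker News while `last_touch` says Google -- which is exactly the pair of answers the two questions above need.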
So with that said, I prefer restic over borg for the S3 support.
It takes a day or two to set up though, if, like me, you are not super technical.
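For reference, the S3 part is mostly environment variables; a sketch of the usual restic flow (bucket name and credentials are placeholders):

```shell
# Credentials and repository location via environment variables.
export AWS_ACCESS_KEY_ID=...
export AWS_SECRET_ACCESS_KEY=...
export RESTIC_REPOSITORY=s3:s3.amazonaws.com/my-backup-bucket
export RESTIC_PASSWORD=...

restic init                  # once, to create the repository
restic backup ~/documents    # encrypted, deduplicated snapshot
restic snapshots             # list what's there
```

The "day or two" is mostly deciding what to back up, scheduling it, and testing a restore, not the commands themselves.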
Not a plug. I'm unaffiliated and just impressed by it. Should've thought of it myself.
The only one I have used before was Django auth (great), but I've never gotten anything like that in other langs. I'm in Rust now and it's my biggest desire...
Then there is also the problem you are facing: tooling not being language-agnostic across the board, so you have to re-implement similar patterns whenever you change the backend language or framework.
There are many static site generators that spit out a directory of HTML files ready to be served. But getting that output onto S3, Netlify, or your self-hosted nginx is a very different process on each: configuring things like cache-control, redirections, compression, or error pages is done differently on all of these platforms.
Wouldn't it be nice if it was possible to basically generate a tar.gz with some metadata that would automatically configure the web server and deploy the site? Kind of like a docker image but for static files?
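No such format exists as far as I know, but here's a toy sketch of what producing one could look like -- the `deploy.json` name and its schema are entirely made up for illustration:

```python
import io
import json
import tarfile

# Hypothetical metadata format: the behaviours that differ per host today,
# bundled alongside the files so any server could apply them uniformly.
metadata = {
    "error_pages": {"404": "/404.html"},
    "redirects": [{"from": "/old-post", "to": "/blog/old-post", "status": 301}],
    "headers": [{"path": "/assets/*", "cache-control": "max-age=31536000, immutable"}],
}

with tarfile.open("site.tar.gz", "w:gz") as tar:
    blob = json.dumps(metadata).encode()
    info = tarfile.TarInfo("deploy.json")
    info.size = len(blob)
    tar.addfile(info, io.BytesIO(blob))
    # then add the generator's output directory:
    # tar.add("public/", arcname=".")
```

The hard part isn't the archive, of course -- it's getting S3, Netlify, and nginx to agree to honour the same metadata.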
It's surprisingly hard to find. I contributed to one open-source project a co-worker started, and I made my own in my own time. So I've almost made it twice.
But when I say teams I mean it needs to be designed in a way that it can easily reflect a CMDB architecture, with a tree of objects, API and LDAP auth. Technical details aside, the goal is of course to integrate it with other systems.
On top of that, not all UIs are simple enough to be used by the entire organization either, but as long as it separates backend and frontend that can always be fixed.
Honestly my current plan, and this would be the 3rd time I build something like this, is to make a simplified frontend for Hashicorp Vault.
It's custom to every user and every development environment.
It can probably be configured by someone in your company as a Docker image you can run, but usually that's only necessary for large teams where IT is provisioning development laptops and there's a lot of consistency in common dependencies. Even then I don't see it being worth the effort, as those preferences would still vary across team roles and projects.