I don't need most of the history, but there's zero chance in hell I'm auditing that many lines to decide what I need and what I don't.
History is useful enough to exist as a feature (I up-arrow routinely), but losing it doesn't actually matter.
I find the idea of going out of your way to preserve and migrate years of shell history and make it searchable in a db about like:
You have a problem: water is flooding your kitchen floor. Normally you'd deal with a spill with a mop or towels. Now there is too much water, so you decide your normal towels aren't good enough, and you get more and better towels, or even put a sump pump in the corner to keep pumping all this water away.
I've written a lot of complicated pipelines with awk and sed and so on, but they were either one-offs of hardly any value later, or I turned them into a script. The few things that are neither are so few that they automatically don't matter.
It's not illegal or immoral, just goofy.
It's absurdly naive to think the simplistic constraints of your own workflow are a general rule.
But version control systems do work, so we use them, we keep history and we tag releases and don't really need to bother with source archives any more.
Nobody is saying history is a substitution for documentation or an audit trail or anything else, but it is a useful tool if it works. Consider a case where you're exploring a new dataset you find online. You download it into a directory, run some commands to transform it, load it into a database etc. You don't know if this will ever be useful. But if it does turn out to be useful, you now have a log of everything you did to get there. Trying to document everything up front would be insane and you'd never get any exploratory work done.
While I agree it may not work with everyone's workflow, maybe it could be a powerful change to some folks' workflows. I'm going to try it out and see for myself!
hey! what issues were you having here? slow to open or slow to search?
we have a whole bunch of people with way more than 170k lines, so that shouldn't be happening :/
I love the project (I'll donate now; I remember when the project wasn't taking donations, and I suggested it should). Thanks!
Edit: I should add: I have the client on about 15 to 20 different VMs, all with various OSes and versions. The server part I'm running with Docker (I think, via the exact steps suggested in the docs). It all works great in my use case, and I do have some very long and complex commands that it's storing.
Note: I am a user, and not the author (Ellie Huxtable)
Wishing her the best of luck making it her job.
Where atuin really shines is in keeping a single unified history across multiple shell windows, which my incantations could never get to work correctly on all the platforms I use (zsh/bash on OSX/Linux/msys/cygwin/babun).
I’ve also enjoyed running SQL queries on my atuin-history to learn more about my own workflows to see where I can optimize.
Thanks!
> I’ve also enjoyed running SQL queries on my atuin-history to learn more about my own workflows to see where I can optimize.
if you could share more about what you found useful there, that would be amazing
With fish's shell history, for example, I can just type 's' and it completes to 'ssh user@host.tld', because that's the last command starting with 's' I used. If it's not right, I can type up to 'ssh' and press arrow up to pick the ssh command I want.
Then I might remember that I did this fancy jq thing once to parse a field in a specific way, I can easily use Atuin to look for it with a nice text-mode UI just by pressing C-r and typing 'jq' as the initial filter.
Another use case I feel pays off is complicated one-liners, where I need to do something similar but not quite the same again; it's a good time-saver as a starting point. This depends on you being a mostly-CLI kind of person obviously; if you instinctively reach for Excel over awk then YMMV.
<my> | <cool> | <one liner> # add script for this
The # comment makes it easy for me to search through my history to find one liners I want to use to build a shell script from
...but if you effectively have this in emacs, I can see why you wouldn't need it in the shell.
Oh, also, I use tmux to split my terminal and do separate things in each one, so I like that atuin consolidates all the histories from my separate panes.
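The `#`-tag search described above is just grep over the history file. A self-contained sketch, using a throwaway file and my own tag text as stand-ins for the real history and tag:

```shell
# Hypothetical mini-history to demonstrate the tag search; in practice you'd
# point grep at your real history file (e.g. ~/.bash_history).
hist=$(mktemp)
cat > "$hist" <<'EOF'
ls -la
sort data.txt | uniq -c | sort -rn # add script for this
cd /tmp
EOF

# Pull out every line tagged with the comment so it can seed a shell script:
tagged=$(grep '# add script' "$hist")
printf '%s\n' "$tagged"
```

Because the comment is part of the recorded command line, the same search works from atuin's Ctrl-R UI by typing the tag text.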
With fish's own autocomplete I get most of what I need. Add fzf to search the history (ctrl-r), and it's highly comparable to the post topic.
Multi-device sync isn't there without some effort, but I don't really care to mix my personal history with my work machines anyway.
Yep, obviously there are many benefits to per-device history, but I think I'd find it more annoying having it synced between devices, especially if there are commands that either won't work on a particular machine or might even be dangerous in a different environment.
I only map atuin to ctrl+r and use fish's native up arrow search for simpler stuff
Also, my biggest problem with bash: it sometimes loses part of the recent history if the bash process gets killed. Fish does not have this problem.
I usually keep useful commands in notes, and sync my notes instead.
Atuin is there when I need to find something more complex I remember doing 3 months ago and could actually repurpose today.
In 2017 I wrote my own bash script (later optimized for zsh) to just record everything in sqlite with hooks on the prompt. [1]
I mostly work on Mac right now and don't need to support Linux anymore, so I wrote a Mac app that syncs the history over iCloud and has a GUI. [2]
Anyway, storing years of shell history somewhere you can run complex searches, and actually finding some magic command you ran a few years ago, is priceless.
- [1] https://www.outcoldman.com/en/archive/2017/07/19/dbhist/
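A minimal sketch of the same idea (not the linked script): log each finished command into SQLite from a bash prompt hook. The db path, table name, and schema here are made up for illustration; assumes sqlite3 is on PATH.

```shell
# Made-up db location and schema, purely illustrative.
HISTDB="${HISTDB:-$HOME/.shell_history.db}"
sqlite3 "$HISTDB" 'CREATE TABLE IF NOT EXISTS hist (ts INTEGER, dir TEXT, cmd TEXT);'

log_last_command() {
  # `history 1` prints "  123  cmd"; strip the leading event number.
  local cmd dir
  cmd=$(HISTTIMEFORMAT= history 1 | sed 's/^ *[0-9]* *//')
  [ -n "$cmd" ] || return 0
  cmd=${cmd//"'"/"''"}   # escape single quotes for SQL
  dir=${PWD//"'"/"''"}
  sqlite3 "$HISTDB" \
    "INSERT INTO hist VALUES (strftime('%s','now'), '$dir', '$cmd');"
}

# Run the hook before each prompt so the just-finished command gets recorded.
PROMPT_COMMAND=log_last_command
```

From there, "what did I run in this directory last month" is one SELECT away.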
Global aliases (which you can use anywhere, not only at the beginning) are also nice to compose aliases with each other.
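For anyone who hasn't seen them, a small sketch (zsh only; `G` is a name I made up). It's run via stdin so the alias is already defined when the later line is parsed:

```shell
# Global aliases (-g) expand anywhere on the line, not just in command
# position, so they compose into pipelines.
zsh <<'EOF'
alias -g G='| grep -i'
print -l apple banana cherry G an
EOF
```

The last line expands to `print -l apple banana cherry | grep -i an`, printing just `banana`.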
Ellie's Show HN was in 2021: https://news.ycombinator.com/item?id=27079862
2022's must be somewhere else.
Atuin tends to get shared more readily on Mastodon/Twitter than on HN, which explains everything other than the 2023 HN spike. We've also been on a few podcasts and newsletters.
I maintain a couple of open source packages for emacs -- it's a labour of love. I'm happy to help folks with their issues, but it's easy to say "sorry, I don't have capacity to add this feature" or "no, I don't think this is a good fit for the project". If I depended on this for money... well, that would change the whole approach, wouldn't it?
Looking at OP, it looks like it does "recording additional command context".
Often (meaning once every few months) I have to SSH into some rarely used machines and recall a few incantations where the path and time are crucial.
In bash there is a hacky way to add the current path and a timestamp to history, but I've never gotten it to work exactly right. If you add timestamps, they seem to get duplicated when you repeat a command.
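For reference, the timestamp half of this is built into bash; a typical `~/.bashrc` setup looks like the below. (Recording the directory per entry is the part bash genuinely lacks, which is where the hacks come in.)

```shell
# bash records timestamps whenever HISTTIMEFORMAT is set, and `history`
# uses the format string to display them.
HISTTIMEFORMAT='%F %T '
shopt -s histappend           # append to the history file instead of overwriting it
PROMPT_COMMAND='history -a'   # flush each command to the file as soon as it runs
```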
You can even filter to commands for your current directory by just pressing Ctrl+R a few times
- command
- directory
- timestamp
- duration
- session ID
- exit code
- hostname
currently thinking of neat ways we can store extra things too (git remote, etc)
I just deployed this to my "everything" NixOS server with `services.atuin.enable` and synced a few of my machines up with it. Very cool! I hope this move goes well for Ellie!
3 Trackers with EFF Privacy badger.
Best of luck Ellie :o)
At this point I just auto-assume there's a domain for every word I'm interested in, though some are not open to registration or prohibitively expensive.
The only feedback I want to call out: sometimes when I close the terminal tab, the atuin server may keep running in the background and I get a warning message.
The product is great, and I discovered it just the other day!
Via the Console email newsletter, I think! :-)
- zsh
- bash
- fish
- nushell
I should probably make it a function now that I think about it.
> Atuin will continue to be open source and available for free in its current form as a self-hosted tool. By going full-time I hope I can focus on adding new premium hosted features for advanced users, and begin to support business usage.
I don't recall ever needing to dig up old commands I typed, like, ever.
I don't think this one is for me, but good luck!
1. If someone steals my laptop and breaks in, can they get access to all my history?
2. After breaking in, if they run `atuin key`, it will get them the key for my history, which they can use from any device (if they know the userid).
3. If you are running servers and passing passwords as command-line arguments on that device, they have all of that too.
Yes, but this is the case anyway with current shell history. I think if someone breaks into your laptop you have bigger problems than your shell history. It's best to get into the habit of not pasting secrets into your shell.
> 2. After breaking, if they run `atuin key` will get them the key for my history which they can use from any device (if they know the userid)
They would need your username, your password, _and_ your encryption key
> 3. If you are running servers passing passwords as command line arguments in that device, they have all that.
Yes. If you're doing this, then all of your passwords are currently stored as plaintext in your home directory - with or without Atuin. I'd consider them no longer secure if this is the case, as any program you run could read .bash_history
Atuin by default comes with a set of filters to ignore secrets and not record them to history - AWS creds, slack creds, GitHub tokens, etc etc. So it may well reduce the impact of this
I make a point out of never doing that. It’s way too easy to accidentally expose things. For instance, doing a live demo with an audience, and using Ctrl-R out of muscle memory? Suddenly you flashed your password in front of everyone.
Generally, I’d recommend using a tool like Unix `pass` or your default OS keyring to store your secrets, then you can run `command1 --password=$(command2)` to feed a password from one command to another. If I really have to type something sensitive, I prefix the whole shell command with a space, which in many shells can be configured to mean that it doesn’t enter history. If you do so by accident, the shell history file can be edited in vim.
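A sketch of both habits; the `my/db` entry name and connection string are made up for illustration:

```shell
# Feed a secret from `pass` at run time so the password itself never appears
# on the recorded command line ("my/db" is a hypothetical entry):
#   psql "postgresql://app:$(pass show my/db)@db.example.com/app"

# And the leading-space behaviour, for bash: with ignorespace set (ignoreboth
# adds ignoredups too), a command typed with a leading space is not saved.
HISTCONTROL=ignoreboth
```

In zsh the equivalent is `setopt HIST_IGNORE_SPACE`; Atuin also respects the leading-space convention.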
The issue isn't "I can't remember how to run a command I just ran." The issue is that the universe of CLI tooling you use is too large, too inconsistent, and too complex to remember how to use them all. Conventions may be wildly different between Windows and Unix, BSD and GNU, many tools have existed for over 50 years now and have accumulated enormous feature creep. Many newer tools try to improve upon perceived complexity of past tools, but by being different, they introduce even more complexity into the overall set of tools for anyone who can't abandon the past tools. There are huge debates about environment variables, config file formats, whether parameters should use one dash or two, what even is a parameter versus an argument versus a flag, how a tool should use STDOUT versus STDERR, how it should use exit codes, whether output should be structured or free text, and nobody agrees on the answers. There is very little standardization, and where standards exist, you can't count on anything to actually follow these.
Contrast this with the tools of a painter or woodworker. They're similar enough that learning to paint in a high school art class will transfer muscle memory near-perfectly to every brush and surface you ever use for the rest of your life. Creating a tool like this is throwing up your hands and saying no human can ever hope to remember how to use their tools, so they need an additional tool that remembers for them. But now we also need to remember how to query this memory augmenter, so you've introduced yet another thing to learn for anyone who isn't willing to just stop trying to learn other tools at all and rely 100% on yours.
That isn't to say it can't be useful, but you're trying to solve an ecosystem problem with a tool. You can't. At best, you can alleviate a tiny portion of the difficulty for a very small number of users sufficiently similar to you. Then you run into the culture of not having to pay for these things mentioned elsewhere. On systems like Windows and Mac, they may be paid systems, but once you pay, you automatically get the full suite of system utilities and CLI tooling. BSD and GNU were free creations made largely by university professors and industry professionals in their spare time for the purpose of sharing, not for making money. Fair or not, the expectation became and will likely remain that these tools either come as part of a larger package, or they're donated from the spare time of their own users.
Exceptions are few and far between. You've got things like curl and openssl that sustain themselves reasonably well as open source CLI packages, but even those don't charge for the tool itself. They only succeed because they're so ubiquitous that if a barely perceptible proportion of users ever donate or pay for support, that is still enough. That model doesn't work if your userbase isn't virtually the entire world of computing.
In general, people just want free stuff, companies rarely pay for support, and SaaS providers will steal your business if they can. I can think of several apps that macOS users are paying for, such as Bartender, Alfred, or MailMate. Clearly, there's a market for utilities, but only with scarcity.
However, we hardly ever marketed our project and never tried to "build a following" or beg for donations in any way, so maybe the author will do a lot better than us. The server component should also help remind people it's not free.
Edit: One thing I forgot: I think command line utilities are in a worse financial position than GUI utilities, because people are accustomed to paying for GUI apps, but aren’t accustomed to paying for things in the shell at all.
I think that's because command line tools are more appealing to technical people. For convenience sake and ease of use, most average computer users will use GUI tools or applications.
And it has an optional GUI...
I wish we wouldn't act like producing software and making gobs of money are inextricably linked. Yes, we absolutely need to find a way to fund people who are building critical infrastructure. But sometimes, "I quit my job to work on open source" can be more akin to "I quit my job to hike the Appalachian Trail." I wish tech had a lot fewer people who were here for the money.
Don't you see the contradiction? Critical infrastructure costs gobs of money. The software has to pay for itself, or it has to survive on crumbs; that's just reality. Software is really, really expensive to create and maintain, because it takes a lot of time, and time costs money.
I wish Richard Stallman hadn't duped a generation into thinking that they have to use licenses that make Amazon richer instead of just using proprietary licenses to protect yourself, as the licenses were designed to do, so that people with more lawyers can't just steal your work.
Why did our whole generation listen to a guy who was caught on camera eating something off of his foot?
https://opensource.guide/getting-paid/
https://www.state.gov/supporting-critical-open-source-techno...
https://new.nsf.gov/tip/updates/nsf-invests-over-26m-open-so...
This in an ecosystem that grew out of Windows developer culture, where paying for developer tools is quite common.
Let alone in other communities; quite hard indeed.
However, OSS is not incompatible with some form of monetization; coming up with a plan to sell courses, custom services, or cloud options is probably a safer road.
The apps themselves are closed source and making money, but the extensions and add-on functionalities are mostly open source.
Possibly of interest to anyone pondering doing open source:
The former is easy(ish); the latter is trickier since I didn't want to provide a hosted service but there aren't easily usable APIs like s3 that are "bring your own wallet" that could be used. So I punted and made it directory based and compatible with Dropbox and similar shared storage.
Being able to quickly search history, including tricks like 'show me the last 50 commands I ran in this directory that contained `git`' has been quite useful for my own workflows, and performance is quite fine on my ~400k history across multiple machines starting around 2011. (pxhist is able to import your history file so you can maintain that continuity)
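That "last 50 commands in this directory containing `git`" trick is plain SQL over the history db. A self-contained sketch with a made-up schema and sample rows (pxhist's real schema may differ; check it with `.schema` first):

```shell
# Throwaway db standing in for the real history database.
db=$(mktemp)
sqlite3 "$db" <<'SQL'
CREATE TABLE history (ts INTEGER, dir TEXT, cmd TEXT);
INSERT INTO history VALUES
  (100, '/work/proj', 'git status'),
  (200, '/work/proj', 'ls -la'),
  (300, '/work/proj', 'git rebase -i main');
SQL

# "Last 50 commands containing git, run in this directory":
sqlite3 "$db" "SELECT cmd FROM history
               WHERE dir = '/work/proj' AND cmd LIKE '%git%'
               ORDER BY ts DESC LIMIT 50;"
```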
I neither love nor hate it as a sync mechanism, but I ended up satisficing with storing the history in my dotfile repo, treating the sqlite db itself as an install-specific cache, and using sqlite exports with collision-resistant names for avoiding git conflicts.
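The export step described above looks roughly like this; the filename scheme is my own, shown against a throwaway db rather than a real history file:

```shell
# Stand-in for the install-specific sqlite cache.
db=$(mktemp)
sqlite3 "$db" "CREATE TABLE hist(cmd TEXT); INSERT INTO hist VALUES ('git log');"

# Hostname plus timestamp keeps dumps from different machines from colliding
# in the shared repo, and the text .sql dump diffs far better than the binary db.
dump="history-$(hostname)-$(date +%Y%m%dT%H%M%S).sql"
sqlite3 "$db" .dump > "$dump"
```

Each machine can then replay the other machines' dumps into its local cache to rebuild the merged history.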
I have a lot of feelings about this, but I don't have a blog so far. I feel that universities should allocate some of their funding to many of these open source projects, and that the open source community should be better managed rather than relying on donations. As for me, my plan is to start my own company and work on hardware.
While Sonic Pi is also beautiful and much easier to start with as a beginner, I later found the hard way that its architecture is incredibly messy - lots of unrelated parts glued together with duct tape. The simplicity and cleanness of Glicol's code is what made me immediately love it!
Thank you for such a beautiful project!
I'm looking forward to seeing what you do with hardware - I'm sure it'll be cool!
Can you expand on this? Why would universities fund these projects?
There's a lot of problems that I do think universities could be working on. Creating free software is one of them.
Happy to answer any questions :)
Like you said, you won't be paying your rent with sponsors any time soon, but you've already quit your job. Are you living on your savings, and trying to come up with valuable paid features meanwhile? Is it the plan?
Anyway, good luck with whatever you're up to! Building a good monetization strategy is hard
Correct! I do already have a bunch of ideas I'd like to explore, will share more soon.
I'm lucky to have had a pretty good career so far, so my savings are more than enough for the next year or so
Thank you!
You know what this sounds like, to me? “I quit my job to play with and walk my dog full time. I am hoping this will give me time to develop skills to eventually earn some money by maybe walking other people’s dogs, or maybe doing public performances with my dog.”
If you do open source work on something like Linux or servers, which would otherwise only be available at high cost, it can be considered a great hobby. But open-sourcing innovative inventions isn't good in the software industry; commercial firms keep those behind fees, and all your good intentions just go to waste. Most people will see this comment negatively. But the other side is, one day major job losses will come and directly affect you as well, even if you're well settled. You will understand!
If prices were tripled, you could afford to staff for both now and the future (R&D).
When you're small you need to price higher. When you're big you can use your size to scale.