This trend of "new modern shells" that run and start as slowly as some JavaScript code (PowerShell) needs to stop
People forgot what shells are for, and what scripting is for
Who cares, as long as it's not my shell. The old shells and terminals won't disappear, and if some new relevant standard emerges, people will port it.
> People forgot what shells are for, and what scripting is for
Shells are for people, and scripting might be for people, depending on whether it's a one-shot script. Maybe you forgot what they are for, too?
Does start-up time matter? Who cares? (I have three terminals that have been open for almost a year.) Are we concerned about only the first start, or primarily the faster subsequent starts once all the dynamic libraries have loaded? Does anything actually require an i5? What's wrong with requiring a GPU?
Doesn't it get tiring, being so grumbly about other people having fun & doing cool things? Do you really think we should do as you say & just freeze time, insist on doing nothing?
With that said, the counter-argument should have been that although PowerShell does start up slowly (among other things), it is better than bash in many cases and more performant.
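To make the "better than bash in many cases" claim concrete: bash pipelines pass untyped text, so filtering means parsing columns by position or separator. A minimal sketch of that fragility (the PowerShell equivalent is shown only as a comment, since this block is POSIX shell):

```shell
# Text pipeline: "filtering" means knowing that field 2, split on ':',
# happens to be the numeric ID. Any format change breaks the script.
printf 'alice:1001\nbob:1002\n' | awk -F: '$2 > 1001 { print $1 }'
# → bob

# PowerShell instead filters typed objects rather than text, e.g.:
#   Get-Process | Where-Object Id -gt 1001 | Select-Object Name
```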
I'm glad that you never encountered this before and I sincerely wish you never will: getting a call at 3 AM due to a server outage you couldn't diagnose remotely. You rushed to the server room, which was only 50°F by the way, connected to the machine, and brought up a rescue shell. Oh, did I mention that none of the machines had an integrated GPU?
See, it's not about the time you sit in front of your M1 MacBook with a nice cup of tea -- it's about the situation where everything goes south and your tools and infrastructure can still have your back
Yes, me
> being so grumbly about other people having fun & doing cool things
The downstream effect of people "doing cool things" (making insanely bloated crap) is that we often have to use it.
I don't understand this attitude that every claim and endeavor is immune from criticism as long as you can frame it as someone "having fun" (I'm sure these corporate software projects are super duper fun) or being experimental.
Yes.
> Who cares?
Me, and my colleagues.
> I have three terminals that have been open for almost a year.
I have dozens of them open at times, and I close and re-open at least a dozen every day, because what I do requires it.
> Does anything actually require an i5?
Yes, "modern" terminals really do require some heavy lifting, because "smooth scrolling!"
> Whats wrong with requiring a GPU?
A lot. Showing text shouldn't need gaming level hardware. Then people moan about their battery life.
> Doesnt it get tiring, being so grumbly about other people having fun & doing cool things?
Haha, no. Because they do the cool things, then backtrack and return to the roots as they move down the path. Watching this is delightful.
> Do you really think we should do as you say & just freeze time, insist on doing nothing?
Why not increase efficiency and try to do cool things without sucking the living light out of our systems, E17 style?
This is such a bad idea I don't know where to start. Shell commands are a dangerous but precise tool, somewhat like a scalpel or a surgical instrument. Dumbing it down so it can "guess what you want it to do" is going to result in more people (specifically, people who don't bother to read the docs) breaking things.
> Warren Teitelman originally wrote DWIM to fix his typos and spelling errors, so it was somewhat idiosyncratic to his style, and would often make hash of anyone else's typos if they were stylistically different. Some victims of DWIM thus claimed that the acronym stood for ‘Damn Warren’s Infernal Machine!'.
> In one notorious incident, Warren added a DWIM feature to the command interpreter used at Xerox PARC. One day another hacker there typed delete *$ to free up some disk space. (The editor there named backup files by appending $ to the original file name, so he was trying to delete any backup files left over from old editing sessions.) It happened that there weren't any editor backup files, so DWIM helpfully reported *$ not found, assuming you meant 'delete *'. It then started to delete all the files on the disk! The hacker managed to stop it with a Vulcan nerve pinch after only a half dozen or so files were lost.
The Jargon File http://www.catb.org/jargon/html/D/DWIM.html
(I'm talking about a TUI, this was solved 60? years ago)
Having written a non-trivial command-line parser in C, and having used a bunch of them in other languages, it seems to me that this task would benefit from some more standardization and maturation. What is the JSON of the command-line? What can we do to increase the level of interoperability between how information is encoded on different tools' command-lines? e.g. think of ImageMagick "convert" versus "find" versus "ffmpeg": totally different universes, but all of them in their own way turn command-line arguments into mini-DSLs.
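There is at least a lowest common denominator today: POSIX `getopts` covers the conventional subset (short flags, options with arguments), and everything it can't express is exactly where the mini-DSLs begin. A minimal sketch:

```shell
# Parse conventional option syntax with POSIX getopts: -v is a flag,
# -o takes an argument, remaining words are positional. Order-sensitive
# grammars (like ffmpeg's pipeline options) fall outside this model.
parse() {
  verbose=0
  output=""
  OPTIND=1                       # reset so parse() can be called repeatedly
  while getopts "vo:" opt; do
    case "$opt" in
      v) verbose=1 ;;
      o) output="$OPTARG" ;;
    esac
  done
  shift $((OPTIND - 1))          # drop the parsed options
  echo "verbose=$verbose output=$output rest=$*"
}

parse -v -o out.txt input.png
# → verbose=1 output=out.txt rest=input.png
```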
However, you also called out some very specific commands that are that way for a reason. For example the order of options for ffmpeg matters very much, as that’s used to construct the processing pipeline. It does make sense for certain things to be custom, but that should only be done when there’s a good reason.
A tool I write has a use-case for understanding the syntax of at least ~common CLI tools well enough to pick out args that will be other executables (sudo cat, find blah -exec...), so I have been idly pondering whether there's a humane, declarative, descriptive grammar that can express nearly all CLI interfaces.
It's probably not worth the work for my case, but it might get to be more tractable if it was also an input for better completion, help, linting, etc. tools.
Ideally something that drives enough all-around value that projects would start upstreaming the grammars (and maybe adopting an associated parser?)
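To make that concrete, here is one shape such a descriptive grammar could take, covering the two cases above. This notation is purely invented for illustration, not an existing standard:

```
command sudo:
  options: -u USER, -g GROUP
  rest: COMMAND ARGS...          # everything after the options is a nested command line

command find:
  positional: PATH...
  option -exec: COMMAND ARGS... ';'   # args up to ';' form a nested command line
```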
ls | cat </dev/stdjson | string_proc_the_json_for_some_reason
With the direct ability to process in line:
ls -a | json.files[0].last_modified
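Pending structured pipes like that hypothetical syntax, the closest you get today is asking the tool for the single field directly instead of parsing `ls` output (GNU coreutils `stat` assumed):

```shell
# Ask stat for the modification time itself rather than scraping ls -l
# columns. Still plain text, but at least a single unambiguous value.
touch demo.txt
stat -c '%y' demo.txt    # e.g. a timestamp like "2022-08-16 16:38:28 ..."
rm demo.txt
```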
I'd probably want multiple output formats (including s-expressions).
$ ls | get 0 | select modified | to json
{ "modified": "2022-08-16 16:38:28 -04:00" }
The internal data format looks pretty JSON-like, with the added ability to keep Nushell types intact.
While I'm not ready to replace Fish with Nushell, it's definitely taken the place of jq for me.
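For comparison, the jq version of roughly the same extraction as the Nushell pipeline above (jq assumed installed):

```shell
# Take the first array element and keep only its "modified" field,
# analogous to `get 0 | select modified` in the Nushell example.
echo '[{"name":"a.txt","modified":"2022-08-16 16:38:28 -04:00"}]' \
  | jq '.[0] | {modified}'
```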
https://unix.stackexchange.com/questions/197809/propose-addi...
I see much potential in adding stdjson as well, but I do caution against opening the floodgates to std* being implemented for every pet format and insignificant corner case.
int main(int argc, char *argv[], char *envp[], JSON *json)
for some JSON data type that is part of C, kind of analogous to a FILE stream? I'm not sure how the JSON info would get into that fourth argument (it has to be independent of argv), but it would keep std{in,out,err} as is.

Additionally, since JSON is text, you can use awk/sed/grep and so on.
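That last point deserves a caveat: line tools handle JSON only superficially. A quick sketch of where it works and where it quietly stops:

```shell
# grep -o can pluck a field out of flat, predictably formatted JSON:
echo '{"user":"alice","id":7}' | grep -o '"id":[0-9]*'
# → "id":7

# But the same pattern misses nesting, whitespace variants ("id": 7),
# and string values with escaped quotes -- fine for eyeballing output,
# risky for scripting against it.
```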
"Crescendo" has been marketed as a solution and looks cool, but it means relearning the tool or documentation being less useful. The sheer amount of existing time people spent learning arcane git syntax means they're not going to switch to a hypothetical "New-GitCommit" function, even if it accepts arrays or PSCustomObject as input.
It seems very well reasoned, has a stable API and excellent backwards compatibility, and does not require a GPU and an i5 as this one might. Its author also has a proven track record and actual experience, which I'm not sure the authors of TFA have, judging solely from their writing.
Admittedly I haven't tried making it my default yet, but it seemed fairly well adaptable to tiling WM/Openbox needs.
I've found the gradient text trend to be interesting for titles and single lines, but I don't think it works for multi-line text.
I mean, for starters:
-> unix introduced text as a universal interface
-> bash made reusing stuff a lot easier via file descriptors, etc. (think: <(input to regard as file))
-> powershell allowed for object oriented scripting
-> some older systems (name forgotten/unknown) even had interactivity in the CLI: click on parts of command output and stuff happens, even after other commands have already been run

But what they are doing goes way beyond colorful alone. It's smooth scrolling of text boxes within text boxes. That alone is bonkers.
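The `<( )` process substitution mentioned in the list above, concretely (a bash/zsh feature, not POSIX sh): each pipeline's output is presented to the consuming command as a readable file, so tools that insist on file arguments can join a pipeline without temp files.

```shell
# diff wants two files; process substitution turns two pipelines into
# file-like arguments on the fly.
diff <(printf 'b\na\n' | sort) <(printf 'a\nb\n' | sort) && echo identical
# → identical
```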
But they also have an easy API for smart/partial redrawing, sensible UI components, I mean, their progress bars are exquisite. It’s truly impressive.
The command line ought to bridge & integrate better. Making it more usable from these higher (more pre-baked/automated) levels is one side. And then reciprocally, how wonderful it would be to see execution flow expressed less in terms of stack traces & more in terms of networks of communicating processes. Create boundary layers, make the cli tools visible & known operations sequenced by (but still visible within) higher level systems.
Cli on and on!!
~$ls *{png,jpg}
oldmeme.png
~$imgcat oldmeme.png
/----------------\
| oldmeme.png |
| appears right |
| here in the |
| terminal |
\----------------/
~$rm oldmeme.png
The terminfo man page shows some evidence of support for "bit_image" commands, but none of the terminals in my terminfo files seem to have it. I have over 2000 terminfo files though; I like the idea that if I found some literal teletypewriter from 1973 and figured out some way to hook it up, I would probably be prepared with the proper escape sequences.

PS: I use nnn[1] with the preview plugin, which makes use cases like yours very easy to solve.
"This is an EFL terminal emulator with some extra bells and whistles such as the ability to display in-line images, video and even play music files, background images, videos, Z-Modem like sending (e.g. SSH into a server and use tysend to send a file back to the local terminal), GPU Accelerated rendering (optional - just set the EFL Elementary toolkit engine to use OpenGL) and much more."
Kitty provides a similar but incompatible protocol: https://sw.kovidgoyal.net/kitty/graphics-protocol/
This list is not complete, KDE konsole supports this too (and other formats) for example.
Junegunn Choi is a really talented designer. More, please.
I think the future of the command line lies in the direction of flow-based programming and spatial representation of complex commands.
I would like to see a terminal that, as I type, _generates_ a flow-based view of my command. Every command would be visualised as a component: ls, awk, sed... Every |, < or > that I type would append a link and a new component to my flow, and ultimately I would be able to manipulate my flow instead of typing: click the ls component, have it output Creation Date instead of Modified Date, then click the awk component and add another output to a new sed component and so on.
Missing file, device, and information manipulation applications that shell programmers string together would be replaced by JavaScript functions from a library. If you really want JSON, use JavaScript Object Notation to serialize JavaScript objects in JavaScript.
Why do people write "what I want is..." articles and comments when they could be writing solutions that scratch their itch and meet their needs?
Stop making me have to engage in bizarre escaping rituals and let me just toggle between "string mode" and "array mode".
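For what it's worth, bash arrays already are that "array mode", just with clumsy syntax. A sketch of the difference (bash-specific, not POSIX sh):

```shell
# Run in a scratch directory so the demonstration is self-contained.
cd "$(mktemp -d)" && touch a.txt b.txt

# "Array mode": each element stays one argument; '*.txt' reaches find
# unexpanded, so this correctly lists both files.
args=(-type f -name '*.txt')
find . "${args[@]}"

# "String mode": $args is word-split AND glob-expanded before find runs,
# so *.txt becomes "a.txt b.txt" and find sees a malformed expression.
args='-type f -name *.txt'
find . $args 2>/dev/null || echo 'string mode broke'
```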