It's surprisingly handy to have something like this hanging around; I just use mine to fix up screen caps.
Commenting mostly because when I did this I thought I was doing something very silly, and I'm glad I'm not completely crazy.
ffmpegCmd := exec.Command("ffmpeg",
    "-ss", fmt.Sprintf("%.3f", position.Seconds()),
    "-i", p.path,
    "-vf", strings.Join(filters, ","),
    "-vframes", "1",
    "-f", "image2pipe",
    "-vcodec", "bmp",
    "-loglevel", "error",
    "-",
)

Define "valid"? If you mean "doesn't give an exit error", `tar --help`[0] and `tar --usage`[1] are valid.
[0] For both bsdtar (3.8.1) and GNU tar (1.35)
[1] Only for GNU tar (1.35)
(There's the Kitty graphics protocol too, but I couldn't figure out how to make terminal UI layout work with it.)
It looks like this app is shelling out to ffmpeg to get the bitmap of a frame and then shelling out to something called chafa to convert it to nice terminal-friendly video.
- I haven't implemented audio support yet, but it would be nice
- I like --dry-run
- I didn't use a TUI widget library, but now it's at the point where it's tedious to refactor the UI / make it prettier
- I like OP's timeline widget
- Wanted to focus on static binaries. I got chafa static linking working for Linux, but haven't bundled ffmpeg yet
- which reminds me of licenses -- chafa and ffmpeg are LGPL iirc
- a couple other notes from early on: https://wonger.dev/posts/chafa-ffmpeg-progress
I'm in the process of switching to Neovim as my main editor just so I can have the same setup everywhere. IDEs like VS Code are 'cross platform' but only work on desktop, and there are IDE-like editors for Android, but none of them work on desktop. Oddly enough, Neovim on Android/Termux is actually easier to use than any of the IDE editor apps, mainly because everything is keyboard-based.

When it comes to writing my own mini programs/scripts, it's basically the promise of things like Flutter, where you write something once and run it everywhere, only it takes hours instead of days to throw something together, and it's not as overkill because I'm just using Python or Bash, with fzf or Textual for any interactive parts.
I think I understand the switches, and then I'm demonstrably shown I have no clue.
These days, I'm basically relegated to following pre-LLM blogs and Stack Overflow, hoping I find the right combination.
1. We have a command line program.
2. Command line args are traditionally parsed by getopt (or a close relative), so we use that (it's expected).
3. Our command line program has grown tremendously in complexity, and our args are now effectively a domain-specific language.
4. Congratulations: we are now shipping a language using a woefully inadequate parsing engine with some of the worst syntax in existence.
see also: iptables, find

I think it would behoove many of these programs to take a good look at what they are doing when they reach step 3 and invest in a real syntax and parser. It is fine to keep a command line interface, but you don't have to use getopt.
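ffmpeg's filtergraph strings are a good example of step 3: a whole expression language smuggled through one option value, which getopt can't validate at all. A toy Go sketch of what even a minimal real parser for just the comma-separated `-vf` chain might start to look like (it deliberately ignores escaping, `[labels]`, and semicolon-separated graphs):

```go
package main

import (
	"fmt"
	"strings"
)

// Filter is one element of an ffmpeg -vf chain, e.g. "scale=320:-1".
type Filter struct {
	Name string
	Args []string
}

// parseChain splits a comma-separated filter chain into structured
// filters. Toy parser only: real filtergraph syntax also has quoting,
// escaping, and stream labels.
func parseChain(chain string) []Filter {
	var out []Filter
	for _, part := range strings.Split(chain, ",") {
		name, rest, _ := strings.Cut(part, "=")
		var args []string
		if rest != "" {
			args = strings.Split(rest, ":")
		}
		out = append(out, Filter{Name: name, Args: args})
	}
	return out
}

func main() {
	for _, f := range parseChain("scale=320:-1,format=rgb24") {
		fmt.Println(f.Name, f.Args)
	}
}
```

Once the options live in a structure like this, you can validate filter names and argument counts before ever invoking the tool, instead of waiting for a cryptic runtime error.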
Surprised to see the "just" ffmpeg package name. Who's maintaining it? Afaik there's loads of ffmpeg packages in winget[0]
0: https://github.com/search?q=repo%3Amicrosoft/winget-pkgs%20p...
What I've found to be trickier is dividing a video into multiple clips, where one clip may start exactly where another ends, but not always.
I just think there are other closely related use cases where a separate program can add more value, especially in the terminal. I wouldn't suggest most people should use ffmpeg instead of a gui, those are too dissimilar. Another example is cutting out a part of a video, with ffmpeg you need to make two temporary videos and then concatenate them, that process would greatly benefit from a better ux.
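The "cut out a part" workflow described above has a fixed shape: export the part before the cut, export the part after it, then concatenate. A sketch of the three invocations it implies, with made-up file names and a hypothetical helper; note that with `-c copy` the cut points still snap to keyframes.

```go
package main

import "fmt"

// removeSegment sketches the ffmpeg invocations needed to cut the span
// [cutStart, cutEnd) out of a video with stream copy: the part before
// the cut, the part after it, then a concat of the two.
// Times are in seconds; output names are illustrative.
func removeSegment(in string, cutStart, cutEnd float64) [][]string {
	return [][]string{
		{"ffmpeg", "-i", in, "-t", fmt.Sprint(cutStart), "-c", "copy", "part1.mp4"},
		{"ffmpeg", "-ss", fmt.Sprint(cutEnd), "-i", in, "-c", "copy", "part2.mp4"},
		// files.txt must list part1.mp4 and part2.mp4 for the concat demuxer.
		{"ffmpeg", "-f", "concat", "-i", "files.txt", "-c", "copy", "out.mp4"},
	}
}

func main() {
	for _, cmd := range removeSegment("in.mp4", 5, 9) {
		fmt.Println(cmd)
	}
}
```

A tool wrapping this would also need to create the temp files somewhere safe and clean them up, which is exactly the UX gap the comment is pointing at.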
# make a 6 second long video that alternates from green to red every second.
ffmpeg -f lavfi -i "color=red[a];color=green[b];[a][b]overlay='mod(floor(t)\,2)*w'" -t 6 master.mp4; # creates 150 frames @ 25fps.
# try make a 1 second clip starting at 0sec. it should be all green.
ffmpeg -ss 0 -i "master.mp4" -t 1 -c copy "clip1.mp4"; # exports 27 frames. you see some red.
ffmpeg -ss 0 -t 1 -i "master.mp4" -c copy "clip2.mp4"; # exports 27 frames. you see some red.
ffmpeg -ss 0 -to 1 -i "master.mp4" -c copy "clip3.mp4"; # exports 27 frames. you see some red.
# -t and -to stop after the limit, so subtract a frame. but that leaves 26...
# so perhaps offset the start time so that frame#0 is at 0.04 (ie, list starts at 1)?
ffmpeg -itsoffset 0.04 -ss 0 -i "master.mp4" -t 0.96 -c copy "clip4.mp4"; # exports 25 frames, all green, time = 1.00. success.
# try make another 1 second clip starting at 2sec. it should be all green.
ffmpeg -itsoffset 0.04 -ss 2 -i "master.mp4" -t 0.96 -c copy "clip5.mp4"; # exports 75 frames, time = 1.08, and you see red-green-red.
# maybe don't offset the start, and drop 2 at the end?
ffmpeg -ss 2 -i "master.mp4" -t 0.92 -c copy "clip6.mp4"; # exports 75 frames, time = 1.08, and you see green-red.
ffmpeg -ss 2 -t 0.92 -i "master.mp4" -c copy "clip7.mp4"; # exports 75 frames, time = 0.92, and you see green-red.
# try something different...
ffmpeg -ss 2 -i "master.mp4" -c copy -frames 25 "clip8.mp4"; # video is broken.
ffmpeg -ss 2 -i "master.mp4" -c copy -frames 25 -avoid_negative_ts make_zero "clip9.mp4"; # exports 25 frames, all green, time = 1.00. success?
# try export a red video the same way.
ffmpeg -ss 3 -i "master.mp4" -c copy -frames 25 -avoid_negative_ts make_zero "clip10.mp4"; # oh no, it's all green!

func BuildFFmpegCommand(opts ExportOptions) string {
	output := opts.Output
	if output == "" {
		output = generateOutputName(opts.Input)
	}
	duration := opts.OutPoint - opts.InPoint
	args := []string{"ffmpeg", "-y",
		"-ss", fmt.Sprintf("%.3f", opts.InPoint.Seconds()),
		"-i", filepath.Base(opts.Input),
		"-t", fmt.Sprintf("%.3f", duration.Seconds()),
	}
I think the best way to get frame-accurate clips like that is to put the start time after the input (or rather, before the output), which decodes the video up to that time, and to re-encode instead of copying. Both of these commands give the expected output:

ffmpeg -i master.mp4 -ss 0 -t 1 -c:v libx264 green.mp4
ffmpeg -i master.mp4 -ss 1 -t 1 -c:v libx264 red.mp4

There are incantations that can dump metadata about the individual packets a given video stream is made up of, ordered by timecode. That way you can sanity-check things.
This is terribly frustrating. The paths of least resistance lead either to improper cuts or to wasteful re-encoding. Re-encoding only up to the nearest keyframe is surely also possible, but yeah, this does suck, and the tool above doesn't seem to make it any more accessible either, according to the sibling comment.
I wonder if there is a solution which would just copy the pieces in between the starting and ending points while only re-encoding the first and last piece as required.
https://github.com/stax76/awesome-mpv?tab=readme-ov-file#vid...
Appreciate you mentioning the MPV route for making clips, I might actually go through and process all the game recordings I saved for clips over the years.
I'm sure other video players like VLC support this, but I found VLC's APIs very lacking.
Just bundle it
How do I know? I built one (https://github.com/rclone-ui/rclone-ui)
file 'file1.mp4'
file 'file2.mp4'
file 'file3.mp4'
Then call ffmpeg like this: ffmpeg -f concat -i files.txt -c copy output.mp4
And I guess you could make an LLM write a {G,T}UI for this if you really want.