Asciinema is awesome. I use it to capture all of the demos for TerminalTextEffects. Their Asciinema GIF Generator tool (AGG) produces high-quality terminal recording GIFs by processing the asciicast file. This is the best solution I've found for producing short terminal recordings that can be shared in a README/docs, as seen in the TerminalTextEffects repo/docs.<p>Using the raw file output and/or termsvg-type solutions does not work for TTE due to the huge amount of data being written to stdout.<p><a href="https://docs.asciinema.org/manual/agg/" rel="nofollow">https://docs.asciinema.org/manual/agg/</a><p><a href="https://github.com/ChrisBuilds/terminaltexteffects" rel="nofollow">https://github.com/ChrisBuilds/terminaltexteffects</a>
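For anyone curious what that workflow looks like, here is a minimal sketch, assuming the asciinema and agg binaries are on PATH (the file names and the recorded command are hypothetical):

```shell
# Record a short non-interactive session, then render it to a GIF with agg.
render_gif() {
  asciinema rec "$1" -c 'echo hello' &&   # record the cast file
  agg "$1" "${1%.cast}.gif"               # render demo.cast -> demo.gif
}

if command -v asciinema >/dev/null 2>&1 && command -v agg >/dev/null 2>&1; then
  render_gif demo.cast
else
  echo 'asciinema/agg not installed; see https://docs.asciinema.org/manual/agg/'
fi
```

The resulting GIF embeds directly in a README, which is the whole appeal.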
I pass my server motd / startup through a random tte[1] and it makes me happy every time. Thank you!<p>1: <a href="https://keeb.dev/static/login.mp4" rel="nofollow">https://keeb.dev/static/login.mp4</a>
Awesome, glad you like it. The next release, 0.13.0, will have a 'random_effect' option so you're not forced to handle it via a shell function. It also significantly reduces the data transferred and the CPU usage by adjusting the framerate to a more appropriate 60fps. It should perform much better on low-end devices which are typically accessed via SSH, such as Raspberry Pis.
I ask Manus to write a script to demo my CLI tool and use asciinema to record it. It's perfect<p>Whole process: <a href="https://www.pgschema.com/blog/demo-with-manus-and-asciinema" rel="nofollow">https://www.pgschema.com/blog/demo-with-manus-and-asciinema</a>
Replay: <a href="https://manus.im/share/8fEln1OzxpnsRSU1PnHweG?replay=1" rel="nofollow">https://manus.im/share/8fEln1OzxpnsRSU1PnHweG?replay=1</a>
Very minor and tangential request for future recordings - show the shell prompt and/or put the commentary in shell comments (prefix with "# "). A step closer to IRL.<p>Speaking of which, an interesting thing to contemplate is whether it's worth automating what you did, or if making the videos happens rarely enough that you'd start from scratch with a new Manus or other AI session.
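To illustrate the suggestion, a hypothetical demo driver that fakes a prompt and renders commentary as "# " shell comments (function names and the example command are made up):

```shell
# say: print commentary so it looks like a typed shell comment
say() { printf '$ # %s\n' "$1"; }

# run: echo the command behind a fake prompt, then actually execute it
run() { printf '$ %s\n' "$1"; eval "$1"; }

say 'create the schema'
run 'echo CREATE TABLE users'
```

Recording a script like this with asciinema yields something that reads like a real session while staying fully reproducible.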
I just wanted to say this is absolutely gorgeous. I don't know if I have a use for this, but I love it and please keep doing this. It's mesmerizing and beautiful.<p>TTE reminds me of Compiz window manager from eons ago, the thing that got me to ditch Windows for Linux, except for the terminal.<p>Is there a way you can add something like TTE to tmux or vim as a screen saver or something that would trigger occasionally but not all the time? Do you pipe it? Make aliases?<p>How do you typically use it? What was your intent when you first wrote it, and what do you use it for now?<p>Keep it up!
> Is there a way you can add something like TTE to tmux or vim as a screen saver<p>I'm not a big vim user, but anywhere you can run a shell command, you should be able to run TTE. As long as ANSI control sequences are respected, the animation should play. TTE will accept piped text or a file input.<p>> How do you typically use it?<p>The most common invocation method is either piped text from some other command or passing a text file via command line arg. It also works as a library and can be imported into existing Python applications to produce animations and animated prompts.<p>> What was your intent when you first wrote it<p>A post here on HN featured an animation of the Sneakers terminal decryption effect and I thought, I can do that in Python. I wrote a handful of simple effects and really enjoyed the process of writing effects and upgrading the engine to support new features. The effect requirement -> engine feature loop is very satisfying. So, I keep working on it when I have free time.<p>> what do you use it for now?<p>It's really more of a toy, and the delay it causes requires some thought when using it, but I've had people reach out with some interesting use cases. Shell startup / SSH motd is pretty common. It can be imported and used as a library, so you may find it as a splash screen or animated prompts in scripts. A few people have shared examples of using it as an animation tool to create advertisements or background animation for electronic music displays. Framework computers recently tweeted a video of TTE running on boot for one of their laptops.<p>Eventually, I'd like to fully document the API from an effect writing perspective and produce some tutorials. I won't consider the project 'finished' until that happens. I don't want to abandon it before people are able to write their own effects.
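The piped-invocation pattern described above can be sketched like this; the wrapper function is hypothetical, but "decrypt" is one of TTE's documented effect names, and the fallback keeps the pipeline working when tte isn't installed:

```shell
# play_effect: pipe stdin through a TTE effect if available,
# otherwise pass the text through unchanged.
play_effect() {
  if command -v tte >/dev/null 2>&1; then
    tte "$1"
  else
    cat   # graceful fallback when tte is not on PATH
  fi
}

printf 'Access granted\n' | play_effect decrypt
```

The same shape works for an SSH motd: pipe the motd file through the function from a shell startup script.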
Compiz. Beryl. Compiz Fusion. Unlocks some fun memories.<p>Also this video has some 6 million views now: <a href="https://youtu.be/xC5uEe5OzNQ?si=GOvwOTHV-RVQnxWv" rel="nofollow">https://youtu.be/xC5uEe5OzNQ?si=GOvwOTHV-RVQnxWv</a> . I remember watching it and being awed by the cube effect.
That's certainly a trip down memory lane...<p>Coming from Windows XP, Compiz Fusion was nothing short of revolutionary. If Linux had a game ecosystem back then, it would have beaten the pants off of Windows. It was so much cooler than Windows.
* ... Compiz [...] the thing that got me to ditch Windows for Linux [...] *<p>I thought that was the role of Enlightenment [1] but I guess I'm just old. :)<p>[1]: <a href="https://www.enlightenment.org/" rel="nofollow">https://www.enlightenment.org/</a>
t-rec is another great tool, which can record any window and produce a video or a gif.<p>Not quite the same thing, but sometimes I find it easier if I just want to share a gif of a terminal.
15MB GIF file. I can't man, I just can't. I literally can <i>not</i>.<p>Can't seek. Can't select text. And I get to use 15x the bandwidth I should have to load it.<p>Please. Stop. <i>Please</i>. It's gross to take beautifully compressible, seekable, accessible text, shove it into the absolute worst format possible for video, and then call that good.<p>If you must insist, please at least link to the raw asciinema, or their web viewer. So that I can load it in a fraction of the time, pause it, seek forward and backward, and copy-paste text. I know, I know, who would want any of that. Much better a GIF that loops so fast it's basically meaningless to anyone but the creator.
asciinema.org is currently being hugged to death! The btop stream is from one of the servers, currently reporting 95% CPU usage: <a href="https://asciinema.org/s/olesiD03BIFH6Yz1" rel="nofollow">https://asciinema.org/s/olesiD03BIFH6Yz1</a><p>But it's chugging along!<p>asciinema server is Elixir/Phoenix, runs on BEAM, and even with lots of connections and high CPU usage it serves requests fine. Power of the BEAM. Phew.<p>Currently it runs on just 2 VMs, 2 GB of RAM each, so I will probably need to scale it up very soon anyway :)
In an age where everything is about clouds, it’s pretty amazing to see a high-profile service like this running smoothly on just a couple of 2 GB VMs. That’s less RAM than a mid-range laptop, and those machines struggle to run a modern browser.
Webservers require an insignificant amount of CPU time in comparison to any graphical interface.
Modern hardware in conjunction with smart software choices can be very capable, but it's pretty negligent to run a service people depend on with such limited resources.<p>Sure, not every site needs to run on a highly-available elastic cluster of nodes, but the alternative mindset of actively rejecting this architecture is arguably even more harmful.<p>A healthy deployment should be able to scale up and down easily, if not automatically. I wouldn't want to be woken up at 3am because prod1 needs a CPU/RAM/disk increase and a reboot, and the users wouldn't like that either.
There are a few reasons for this. First, I don’t have infinite budget for this; in fact the budget is close to 0 (sponsorships and donations help). Second, the servers are sponsored by Brightbox, and I’ve been running asciinema.org there for over a decade because BB is reliable and they’re great chaps, but I don’t have infinite resources available there. Finally, I don’t want devops complexity in my side projects.
> service people depend on<p>why wouldn't you cap a free service?
imo it's more irresponsible to demand other people keep a thing you don't pay for running just because you happen to depend on it.
Just because a service doesn't require direct payment doesn't mean that its users shouldn't have any expectations of availability. There should be fair usage limits, of course, even if the service is commercial, but being caught by surprise by a traffic surge and boasting that it runs on underpowered hardware when the project is 10+ years old is irresponsible IMO.<p>Also, any service can have a financial purpose, even if it's "free" for users. This very forum is "free", yet the cost of hosting it is indirectly paid for by being an ad for the company that runs it. HN also famously runs on underpowered hardware that routinely fails to meet demand, but that is beside the point.
Sounds preferable compared to being woken up at 5 am by your cloud provider's billing alert.
Just scaled it up, seems pretty stable and responsive again.
Tip: you might want to decrease the btop refresh rate. It's likely that at 100ms a considerable amount of CPU time is taken up by btop itself.
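If I remember btop's bundled sample config correctly, the relevant key is `update_ms` in btop.conf (typically under ~/.config/btop/); raising it is a one-line change:

```ini
#* Update time in milliseconds; raising this from 100 reduces btop's own CPU cost
update_ms = 2000
```

Worth double-checking against your btop version's sample config before relying on it.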
That btop stream is lagging hard now, and jumping between full cpu and idle 0%.<p>Is the service crashing and restarting?
Thanks for the great service! Is there a way to mirror the streams? Like a Wikipedia dump?
Long live the BEAM!
The Asciinema clients have now gone from Python to Go, back to Python, and now to Rust.<p><a href="https://docs.asciinema.org/history/" rel="nofollow">https://docs.asciinema.org/history/</a><p><a href="https://blog.asciinema.org/post/and-now-for-something-completely-different/" rel="nofollow">https://blog.asciinema.org/post/and-now-for-something-comple...</a>
It's one of those projects where it really doesn't matter what language it's in, all of them will accomplish the job roughly the same. I'm happy for the dev to rewrite in whatever language motivates them to work on it, because with such a small codebase the chance it will affect functionality is low enough for it to not matter.
That's... interesting.<p>Most of the gripes about Go could've been apparent before a single line was written, with just some preliminary research. The packaging issues are valid for 2016, even though they are now resolved with Go modules.<p>Then the rewrites in ClojureScript, Elixir, and now Rust... Sheesh. All this tells me is that the authors are trend chasers rather than solid engineers, which erodes any trust I had in this project.
Asciinema is one of the best products/tools I’ve ever used. I mean the auth/upload CLI flow is so straight to the point and easy to use, despite having to cross a few layers (yep I know other CLIs have something similar, it’s just that Asciinema has never “gotten in my way” when doing something with it)
I love Asciinema and have made a small contribution, support the project with a donation: <a href="https://docs.asciinema.org/donations/#individuals" rel="nofollow">https://docs.asciinema.org/donations/#individuals</a><p>If you want to see how Asciinema looks when people are troubleshooting Linux systems: <a href="https://replay.sadservers.com/" rel="nofollow">https://replay.sadservers.com/</a>
This is great, congrats for such a huge accomplishment!<p>My only wish is that asciinema natively supported saving to SVG or GIF. That would let you easily add recordings to markdown files without needing to install side apps to convert the output to those formats.
I'm a huge fan of Asciinema, and this is awesome work.<p>Regarding the live streaming feature, I hacked a similar thing on top of s2.dev streams[1] (disclaimer: I am a co-founder), which could alleviate the need for a relay in this architecture[2]. Naturally showing off `btop` was the highlight for me as well :-D.<p>[1]: <a href="https://s2.dev/blog/s2-term" rel="nofollow">https://s2.dev/blog/s2-term</a>
[2]: <a href="https://docs.asciinema.org/manual/server/streaming/#architecture" rel="nofollow">https://docs.asciinema.org/manual/server/streaming/#architec...</a>
> introducing terminal live streaming<p>Now we just need someone to develop ASCII-art vtuber avatars which overlay on top of the terminal.
I’ve sometimes heard Twitch streamers mention that they use two computers for streaming. One is the computer you are seeing on stream, and the other computer is running OBS and using a HDMI capture device.<p>With this new live streaming feature in Asciinema, I could imagine that a small subset of programming streamers could skip buying a HDMI capture device. Specially the kind of developer who works exclusively in the terminal. This small group of people could now stream their terminal from the dev machine to the OBS machine using Asciinema 3, instead of capturing HDMI output.
That shouldn’t really matter for asciinema streaming, which has essentially zero effect on performance.<p>The 2-device setup for game streaming was born because of the heavy CPU usage of x264 encoding. So you could have a PC for the game only while the streaming PC takes the encoding load.<p>But even then, nowadays a lot of people have moved on from that, since you can use your GPU for encoding (Nvidia NVENC), which has almost 0% overhead while providing the same or slightly worse quality. Really, x264 in OBS should only be used for offline video recording, say for a YouTube video.
Another feature of this is that if your primary PC crashes, which happens sometimes especially with some games, your stream doesn’t go down too. This is more important for streams with more viewers and longer runtimes, as restarting the stream drops all the AFK viewers who may still be contributing ad revenue (or just boosting active viewer count, which has other flow-on benefits too).<p>You can see this effect in long streamathons/subathons: as Twitch automatically kills long streams, you’ll see multiday streams get cut up either manually by streamers or automatically by Twitch every 24h or so, and the viewer count drops significantly and takes quite some time (many hours) to recover.
YouTube recompresses everything anyway so I found the higher quality you upload the better the result, especially for complex stuff like high-paced gameplay.<p>And at very high bitrates nvenc quality is just fine. It's mostly at the lower bitrates x264 really shines.<p>Of course for livestreaming there are different constraints.
Twitch streamers do that because they are streaming GPU intensive workload (video game) that would compete with streaming software for GPU resources.<p>It also prevents driver crash caused by a video game to crash a stream.
Additionally, running on a different computer allows for, e.g., adjusting scenes in OBS or moderating Twitch chat without losing focus from the running app/game.
Yeah that makes sense. I was thinking that the two computer setup was also something they did to keep their streaming stuff completely separate from the rest of their life, to not accidentally doxx themselves on-stream and to not accidentally execute malware on the same host they do their email and banking etc on.
I think the setup you describe is used primarily to offload video encoding to a second PC, which is not something programming streamers (who aren't running graphically intensive games) need concern themselves with.
Asciinema was previously written in Python.
The live btop demo is really cool !<p>Congrats on shipping.
Oh I think I will be having some fun with the streaming capability. Sweet!
Wow nice release. Particularly impressive is the demo live stream of btop!
The power of rust and proper architecture allows an underpowered 2 vCPU VM to stream contents to hundreds of people.
whoa the live streaming is a game changing game changer no Twitch content pun intended (tho possibly maybe it was ...)
My favorite elixir project of all time now written in Rust.<p>Love to see it.<p>Have you guys added the ability to cleanse / watch command strings for sensitive items like secrets, keys etc? Seems easier than ever with advances in lightweight LLMs.
The CLI has been rewritten in Rust. The server is still Elixir/Phoenix - perfect for this feature.
> Have you guys added the ability to cleanse / watch command strings for sensitive items like secrets, keys etc? Seems easier than ever with advances in lightweight LLMs.<p>Please this has to be satire, right?
Some games have this feature called "streamer mode", where some text that would normally be on screen is removed, like username or party codes. Makes sense for a terminal to do the same for other sensitive data.
It would be nice to censor my username from the prompt without changing my shell prompt.<p>No idea if it does this already though.
Why would it be satire? Seems like a real use case
> My favorite elixir project of all time now written in Rust.<p>It was written in Python before, no?
I, for one, look forward to our new RUST overlords...