PowerShell uses a structured data pipeline, as do Elvish, Nushell, and many others. I don't think fish does; it aims to be mostly POSIX-compliant except where its designers disagree with POSIX. I might be wrong about that, though, it's been a long time since I used fish.
Feel free to keep reading my ramblings, but the landing pages for both Elvish and Nushell that I linked above illustrate the power of structured data pipelines, Nushell's in particular:
Accomplishing something like this with a flat text pipeline like bash's would take much more code and be far less readable.
POSIX-compliant shells use plain text in their pipelines, which means that any output you pipe from one command to the next has to be either plain text or a format the consuming command can parse. In practice, unless the consuming command was designed specifically to consume the output of the preceding command in the pipe, users have to do a lot of string manipulation with tools like awk, sed, or grep to massage the output of one command into a format the next can consume.
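A minimal sketch of that flat-text situation (the data and columns here are made up for illustration): the consumer has to know which whitespace-separated position means what, and a tool like awk does the parsing.

```shell
# Flat text pipeline: the "output" is three made-up columns
# (name, size, owner). awk must know by position that $2 is the
# size and $3 the owner before it can filter and sum.
printf 'alpha 100 root\nbeta 250 deno\ngamma 50 root\n' \
  | awk '$3 == "root" { total += $2 } END { print total }'
```

In a structured-pipeline shell the same data would arrive as records with named fields, so the filter would reference the column by name instead of counting whitespace-separated positions.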
Where it really shines is when interacting with APIs that expect structured data in their interfaces, as you can just pipe data from one to the next without worrying about restructuring it. It's why PowerShell was created: unlike POSIX-compliant OSes, Windows is built on the Component Object Model, and every API built into the OS already used structured data.
The web is built on structured data (JSON, XML) as well, and that lucky coincidence made PowerShell a much more useful shell for over-the-wire API interaction than bash et al.
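For illustration, a hypothetical PowerShell one-liner against a made-up JSON endpoint (the URL and field names are invented): `Invoke-RestMethod` deserializes the JSON response into objects, so downstream commands filter on properties directly instead of grepping text.

```powershell
# Hypothetical endpoint; Invoke-RestMethod parses the JSON body into
# objects, so .active and .name are real properties here, not
# substrings to be extracted with string tools.
Invoke-RestMethod 'https://api.example.com/users' |
    Where-Object { $_.active } |
    Select-Object -ExpandProperty name
```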
oh whaaaat! this is so cool, i have not seen any of this before. reminds me of plan9 videos i was watching, dreaming of what could have been...
thinking about it more, a correction to my post:
It's not that you don't have to restructure data that you pass through the pipeline with modern structured-data-pipeline shells; you definitely still do. It's just that restructuring that data is trivial, because you don't have to use a ~~text~~string manipulation tool to re-construct the output structure from the flat text pipeline; you can access that output structure directly.
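A sketch of what "trivial restructuring" looks like in PowerShell: the sorting and column selection below is still restructuring, but it works on named properties of live objects rather than parsed column positions.

```powershell
# Still restructuring the data (sorting, picking columns), but via
# named properties on objects rather than awk-style field splitting.
Get-Process |
    Sort-Object CPU -Descending |
    Select-Object -First 3 Name, Id, CPU
```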
Of the examples I gave above, PowerShell's probably the most accessible (in terms of tutorials and whatnot); I ran it as my shell on Linux for a couple of years before switching to Elvish.
yo i just installed elvish on my proxmox pve host, kinda confusing but it seems pretty cool, ty for the recommendation!
hell yeah comrade. it's a little weird to get used to but I like it. I don't use the object pipe that often in interactive use but it's killer for scripts
fr fr no capp!