[HN Gopher] No script is too simple ___________________________________________________________________ No script is too simple Author : nicbou Score : 216 points Date : 2020-09-22 15:55 UTC (7 hours ago) (HTM) web link (nicolasbouliane.com) (TXT) w3m dump (nicolasbouliane.com) | maest wrote: | Sounds like the main function here is to create concepts at the | correct level of abstraction, a bit like an API. | Justsignedup wrote: | I spent a fairly sizeable time on this in one of my previous | jobs. Most things were now bin/do_this.sh or bin/setup.sh or | whatever. | | Setting up a pc went from 1-3 days to a 3 hour "run this | setup.sh" script and call me if it crashes. | | It will become obvious when you hire someone that this needs to | be done. As a project lead, schedule time for it! | danpalmer wrote: | I'm a big fan of this but I often find it in tension with | necessary flexibility. The more flexibility I add to my scripts, | the less useful they are for this purpose. | | I usually come down on: things where the output is shared should | probably be scripts, things where it's just for me probably | shouldn't be (so that I can be more flexible, and there will be | less bit-rot of those scripts). | | Any other thoughts on this? | chubot wrote: | Yeah this is a good question. I lean toward making shell | scripts work for a specific project. All the paths should be | the same for all developers, so I usually use paths relative to | the git repo root of the project. | | A key point is that when you want to "abstract", then the right | way is usually to write a command line tool invoked from shell | scripts in multiple projects. That is the natural way to get | flexibility and reuse. | | But otherwise it's a bunch of commands dumped in one place. | | Examples: | | http://www.oilshell.org/blog/2020/02/good-parts-sketch.html#... 
| | Here's a random example -- some scripts to generate source code
 | as part of the build process:
 |
 | https://github.com/oilshell/oil/blob/master/build/codegen.sh
 |
 | The paths are all hard-coded, which is a good thing.
 |
 | In my mind, the goal is to save time and reduce mistakes. And
 | having a consistent dev environment between everybody on a
 | project is almost a prerequisite for that, and shell can
 | actually enforce that consistency! (i.e. the shell scripts
 | don't work if people have quirks on their machine. They can
 | check the environment too.)
 | theptip wrote:
 | A follow-up recommendation I give (which I suspect might be
 | unpopular with many around here) is to use Python for all but the
 | most trivial one-liner scripts, instead of shell.
 |
 | Since 3.5 added `subprocess.run`
 | (https://docs.python.org/3/library/subprocess.html#subprocess...)
 | it's really easy to write CLI-style scripts in Python.
 |
 | In my experience most engineers don't have deep fluency with Unix
 | tools, so as soon as you start doing things like `if` branches in
 | shell, it gets hard for many to follow.
 |
 | The equivalent Python for a script is seldom harder to
 | understand, and as soon as you start doing any nontrivial logic
 | it is (in my experience) always easier to understand.
 |
 | For example:
 |
 |     subprocess.run("exit 1", shell=True, check=True)
 |     Traceback (most recent call last):
 |       ...
 |     subprocess.CalledProcessError: Command 'exit 1' returned non-zero exit status
 |
 | Combine this with `docopt` and you can very quickly and easily
 | write helptext/arg parsing wrappers for your scripts; e.g.
 |
 |     """Create a backup from a Google Cloud SQL instance, storing the
 |     backup in a Storage bucket.
 |
 |     Usage:
 |         db_backup.py INSTANCE
 |     """
 |
 |     if __name__ == '__main__':
 |         args = docopt.docopt(__doc__)
 |         make_backup(instance=args['INSTANCE'])
 |
 | Which to my eyes is much easier to grok than the equivalent bash
 | for providing help text and requiring args.
| | There's an argument to be made that "shell is more universal", | but my claim here is that this is actually false, and simple | Python is going to be more widely-understood and less error-prone | these days. | freedomben wrote: | As someone who hasn't used Python in ages, and finds it quite | unreadable, I would be saddened and dismayed if instead of | shell (which is the lingua franca) a project used Python | everywhere. | | The exception being if it's a Python app in the first place, | then it's fine since it can be assumed you need to be familiar | with Python to hack on it anyway. For example I write a lot of | Ruby scripts to include with Sinatra and Rails apps. | | For a general purpose approach however, everybody should learn | basic shell. It's not that hard and it is universal. | ajuc wrote: | Ever tried to parse XML or json in linux shell? | | It's not a matter of familiarity with linux shell. It's a | matter of wasting time debugging/implementing stuff in shell | that is trivial to implement in any modern scripting | language. | [deleted] | freedomben wrote: | Haven't parsed XML in quite a number of years but I parse | JSON to extract values from a curl command all the time | with jq and it's really not that bad. If there's anything | more than simple extraction required then I agree it's not | good for the shell. For that I turn to the primary language | of the project preferably (which these days is Elixir and | sometimes Ruby for me). | | But even JSON and (X|HT)ML you'd be amazed how often a | simple grep can snag what you want, and regular expressions | are also mostly universal so nearly everybody can read | them. | ajuc wrote: | We (2-person team doing embedded QT app on linux) did it | the exact way you're arguing for - starting with shell, | struggling with it for several months every time | something went wrong or we had to change it, and finally | giving up and rewriting it in python. 
| | Python wasn't our main language (the app was in C++ and
 | QML, which is basically JavaScript). But we both knew
 | Python, and it's the easiest language for this sort of
 | thing that we both knew and that comes with the system.
 |
 | My main conclusion is: I'm starting with Python next time.
 |
 | Life's too short to check for the number of arguments in
 | each function.
 | verall wrote:
 | I think there is a big difference between this and the
 | parent. If you are working on projects where the "primary
 | language" is Ruby or Python or Elixir then sure, use that,
 | but if your project's primary language is C++, like most
 | embedded applications, you do NOT want to use that.
 |
 | Any complex or cross-platform C++ project will need a
 | scripting language in addition to shell and a build
 | system.
 | reificator wrote:
 | I do JSON pretty frequently with jq and idiomatic python[0]
 | can't even approach its ease of use.
 |
 | [0]: I'm sure there's a jq-style library for python
 | somewhere but it's definitely not the norm.
 | fra wrote:
 | Every project I work on uses the `invoke`[1] Python package to
 | create a project CLI. It's super powerful, comes with batteries
 | included, and allows for some better code sharing and
 | compartmentalization than a bunch of bash scripts.
 |
 | [1] http://www.pyinvoke.org/
 | jiofih wrote:
 | Ruby is a much better candidate for this. Nicer syntax and a
 | vastly better standard library.
 | nfrmatk wrote:
 | Nicer syntax? Vastly better standard library? Those seem like
 | subjective statements to me. Can you provide some examples to
 | support those claims?
 | deeg wrote:
 | I dunno about a better standard library but I do think Ruby
 | has nicer syntax. You can use back-ticks to run system
 | commands, like:
 |
 |     listing = `ls -oh`
 |     puts listing
 |     puts $?.to_i # Prints status of last system command.
 | nfrmatk wrote:
 | That might just come down to the usual differences
 | between Python and Ruby.
A Python programmer would likely
 | appreciate an explicit call to `subprocess.run`.
 |
 | Some equivalent Python 3.7+ would be
 |
 |     import subprocess
 |     listing = subprocess.run(("ls", "-oh"), capture_output=True, encoding="utf-8")
 |     print(listing.stdout)
 |     print(listing.returncode)
 |
 | As an experienced Ruby programmer, I imagine the code you
 | provided looks like a no-brainer. To a Ruby neophyte, the
 | backticks, $? sigil, and .to_i method don't strike me as
 | intuitive (but maybe they would to someone else). We
 | may just have to disagree about the nicety of syntax.
 |
 | As cle mentioned[1], the strongest criterion is
 |
 | > what would the majority of people on the team prefer to
 | use?
 |
 | and I'd agree that ultimately that's what makes the most
 | sense.
 |
 | [1]: https://news.ycombinator.com/item?id=24558989
 | deeg wrote:
 | The back-ticks and $? are shell standards, so I would
 | expect those to be familiar to shell scripters. I agree
 | that if one knows nothing about any of the languages, then
 | neither is much better.
 | cle wrote:
 | This kind of conversation is never productive in a team
 | setting. The strongest criterion is "what would the majority
 | of people on the team prefer to use?". That'll yield the
 | highest productivity, regardless of whatever syntactic sugar
 | it has.
 | teknopaul wrote:
 | for the record...
 |
 |     if __name__ == '__main__': args = docopt.docopt(__doc__)
 |
 | means nowt to me.
 |
 | In fact I don't really understand your post, and I've written a
 | lot of python.
 |
 | No language has a simpler api to executing a cli command than
 | bash itself. By definition.
 | divbzero wrote:
 | The conditional expression
 |
 |     __name__ == '__main__'
 |
 | simply tests whether this Python file is being executed from
 | the command line.
 |
 | It's one of the ubiquitous Python idioms that doesn't
 | actually follow Python's zen of "readability".
 | theptip wrote:
 | Agreed on this point, it's the `public static void main` of
 | Python.
| | Perhaps I shouldn't have tried to make two points in one
 | post; I wouldn't advocate using docopt or a main function
 | for a simple script. I was more making the case for how
 | easy it is to add proper parameter parsing when you need
 | it; that is easier to remember than what you'd end up
 | writing in bash, which is something like:
 |
 |     PARAMS=""
 |     while (( "$#" )); do
 |         case "$1" in
 |             -a|--my-boolean-flag)
 |                 MY_FLAG=0
 |                 shift
 |                 ;;
 |             -b|--my-flag-with-argument)
 |                 if [ -n "$2" ] && [ ${2:0:1} != "-" ]; then
 |                     MY_FLAG_ARG=$2
 |                     shift 2
 |                 else
 |                     echo "Error: Argument for $1 is missing" >&2
 |                     exit 1
 |                 fi
 |                 ;;
 |             -*|--*=) # unsupported flags
 |                 echo "Error: Unsupported flag $1" >&2
 |                 exit 1
 |                 ;;
 |             *) # preserve positional arguments
 |                 PARAMS="$PARAMS $1"
 |                 shift
 |                 ;;
 |         esac
 |     done
 |     # set positional arguments in their proper place
 |     eval set -- "$PARAMS"
 |
 | I think you've got to write a lot of bash before you can
 | remember how to write `while (( "$#" )); do` off the top of
 | your head; the double-brackets and [ vs ( are particularly
 | error-prone pieces of syntax.
 | wahern wrote:
 | Idiomatic shell code would be:
 |
 |     while getopts "ab:" OPTC; do
 |         case "$OPTC" in
 |             a)
 |                 MY_FLAG=0
 |                 ;;
 |             b)
 |                 MY_FLAG_ARG="$OPTARG"
 |                 ;;
 |             *)
 |                 # shell already printed diagnostic to stderr
 |                 exit 1
 |                 ;;
 |         esac
 |     done
 |     shift $((OPTIND - 1))
 |
 | If you don't know shell, just say you don't know shell.
 |
 | Yes, I realize it's more difficult to support long
 | options (though not that difficult), but the best tool
 | for the job will rarely check all the boxes. Anyhow, the
 | arguments for long options are weakest in the case of
 | simple shell scripts. (The above code doesn't handle mixed
 | arguments either, but GNU-style permuted arguments are
 | _evil_.)
 |
 | Also, I realize there's a _ton_ of Bash code that looks
 | exactly like what you wrote.
But that's on Google--in
 | promoting Bash to the exclusion of learning good shell
 | programming style, they've created an army of Bash zombies
 | who try to write Bash code like they would Python or
 | JavaScript code, with predictable results.
 | rmetzler wrote:
 | But the point of the article is not to make complicated
 | scripts readable again, but to put even simpler commands
 | (maybe even one long mysqldump command) into very short
 | scripts.
 |
 | I agree that a cli framework is often easier to use than
 | bash, but it also is a dependency. I think everyone
 | should use what they're familiar with.
 | dllthomas wrote:
 | > No language has a simpler api to executing a cli command
 | than bash itself. By definition.
 |
 | Things that are true "by definition" are not usually useful.
 |
 | "No language has a simpler api [...] [b]y definition" only if
 | by "executing a cli command" we mean literally interpreting
 | bash (or passing the string through to a bash process). But
 | that's never the terminal[0] goal.
 |
 | The useful question is whether, for the _task_ you might want
 | to achieve, for which you might usually reach for the shell,
 | there is in fact a simpler way.
 |
 | It may very well be that the answer is "no", but support for
 | that is not "by definition".
 |
 | [0]: Edited to add: ugh. Believe it or not, this was not
 | intended.
 | cjohnson318 wrote:
 | Shell is definitely not universal on Windows machines. If you
 | work with mechanical engineers, at least half of them will be
 | using Windows. That being said, Python is gross on Windows,
 | too.
 | nicbou wrote:
 | At my last company, I used Python for most of our scripts. Bash
 | scripts were too tedious to write, harden and maintain.
 | yjftsjthsd-h wrote:
 | How does this handle pipes and such? What's the nicest python
 | equivalent to `grep ^$(whoami) /etc/passwd | awk -F : '{print
 | $NF}'`?
 | jniedrauer wrote:
 | Nothing is quite as succinct as shell. The python equivalent
 | to this would probably be at least 10 lines of code.
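As an aside, the one-liner in the question is itself a candidate for the article's advice: give it a name. A minimal sketch (the function name is invented; the only change to the pipeline is a trailing `:` in the grep pattern, so that a login name which is a prefix of another, e.g. "dan" vs "daniel", cannot match the wrong line):

```shell
#!/bin/sh
# The pipeline from the question above, wrapped as a named function in
# the spirit of "no script is too simple". `id -un` is the portable
# spelling of whoami.
my_shell() {
    grep "^$(id -un):" /etc/passwd | awk -F: '{print $NF}'
}

my_shell
```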
| tyingq wrote:
 | Everyone tends to rag on Perl, but it's excellent in this
 | space.
 |
 | Edit: The equivalent of the above would be:
 |
 |     #!/usr/bin/perl
 |     my $shell = (getpwuid($>))[8];
 |     print "$shell\n";
 |
 | Though it does have simple syntax for pipes as well.
 | yjftsjthsd-h wrote:
 | https://news.ycombinator.com/item?id=24558719 does it in 5, which
 | is, I think, proof that you're both right. (Of course, perhaps
 | shell is cheating, since I chained 3 commands into that
 | one-liner, and 2 of those are really their own languages with
 | further subcommands.)
 | d0mine wrote:
 | The answer to "how do I pipe shell commands in Python" is that
 | you can do it (e.g., using plumbum, fabric, pexpect, subprocess),
 | but you shouldn't in most cases.
 |
 | In bash, you have to use commands for most things. In Python,
 | you don't.
 |
 | For example, a `curl ... | jq ...` shell pipeline would be
 | converted into requests/json API calls in Python.
 | johndough wrote:
 | Here are three options:
 |
 |     # Option 1 (no pipes)
 |     import os
 |     for line in open("/etc/passwd").read().splitlines():
 |         fields = line.split(":")
 |         if fields[0] == os.getlogin():
 |             print(fields[-1])
 |
 |     # Option 2 (Cheating, but not really. For most problems worth
 |     # solving, there exists a library to do the work with little code.)
 |     import os
 |     print(os.environ['SHELL'])
 |
 |     # Option 3 (pipes, not sure if the utf-8 thing can be done nicer somehow)
 |     import subprocess
 |     username = subprocess.check_output("whoami", encoding="utf-8").rstrip()
 |     p1 = subprocess.Popen(["grep", "^" + username, "/etc/passwd"],
 |                           stdout=subprocess.PIPE)
 |     p2 = subprocess.Popen(["awk", "-F", ":", "{print $NF}"],
 |                           stdin=p1.stdout, stdout=subprocess.PIPE)
 |     print(p2.communicate()[0].decode("utf-8"))
 | yjftsjthsd-h wrote:
 | Thanks, that does answer my question quite thoroughly :) I
 | intended the question not as "how would I get my shell" but as
 | "how would I pipe commands together", which now that I think
 | about it is perhaps the real issue: I think about solving many
 | problems from the perspective of "how would I do this in
 | /bin/sh?", but perhaps the real answer is that if you're doing it
 | in Python (or whatever) then you should be writing a solution
 | that's idiomatic in that language. Or if you like, perhaps one
 | doesn't need the "standard library" of coreutils if one has the
 | Python standard library, which means that many of the things I'd
 | miss in Python are hard because they aren't the right solution
 | there.
 | sixstringtheory wrote:
 | Quickest and dirtiest way I've found, but yes, not idiomatic
 | Python:
 |
 |     subprocess.check_output(
 |         "grep ^$(whoami) /etc/passwd | awk -F : '{print $NF}'",
 |         shell=True, encoding="utf-8").strip()
 |
 | I didn't test that particular line, but in general this is how
 | I execute shell pipelines in Python.
 | [deleted]
 | pseudalopex wrote:
 |     pwd.getpwnam(getpass.getuser()).pw_shell
 | snake_case wrote:
 | If you're not opposed to installing another tool for scripting,
 | then check out my project mask [1].
It's a CLI task runner (made
 | with Rust, single binary) that supports multiple scripting
 | languages, and it's defined with a simple, human-readable markdown
 | file :)
 |
 | I used to have a scripts directory for each project, but I really
 | wanted basic argument/options parsing and subcommands. That's
 | mostly why I made mask. Now I use it daily as a project-based
 | task runner, as well as a global utility command.
 |
 | [1]: https://github.com/jakedeichert/mask
 | umaar wrote:
 | I've been moving most of my projects over to Makefiles, instead
 | of npm scripts [1]. Unless I am missing something obvious, it's
 | nicer to run:
 |
 |     make        vs. npm start
 |     make thing  vs. npm run thing
 |
 | But also, more importantly, the Makefile itself feels much
 | cleaner, rather than stuffing scripts into package.json files.
 | The thing I'm confused about is why npm scripts seem to dominate.
 | I don't see many people using Makefiles.
 |
 | [1] https://github.com/umaar/video-everyday/blob/master/Makefile
 | mstade wrote:
 | I had your point of view until I started working in corporate
 | environments where Windows reigns supreme, and then I understood
 | that the reason why no one wrote Makefiles or shell scripts is
 | simply because no one had access to those runtime environments.
 | JavaScript, however, truly is everywhere these days.
 |
 | Sure, you _can_ run Make and even shells on Windows now, but
 | that doesn't mean you have the power or even the capability to
 | do so in a corporate environment where IT dictates what you can
 | run and only reluctantly provides you with a machine in the
 | first place.
 | mark_and_sweep wrote:
 | I actually find using Make on Windows to be really simple. I
 | usually drop a prebuilt make.exe into the project directory
 | since it's just 244kb
 | (http://www.equation.com/servlet/equation.cmd?fa=make,
 | https://github.com/MarkTiedemann/make-for-windows).
| | If the Makefile needs to be cross-platform, that's easy to | do, too: | | ``` | | ifeq ($(OS),Windows_NT) | | # Windows | | else | | # *nix | | endif | | ``` | stouset wrote: | Life is too short to work in these kinds of environments, if | you can avoid it. | cjauvin wrote: | One related thing that I began to do a while ago is to always | have a `<project>.org` file going along any of my projects, which | is a free-form Org-Mode journal of all the things and ideas | related to that project, including of course esoteric commands | for certain things (which sometimes can be hard to find in my | bash history). With Org-Mode it's easy to fluidly change the | structure of this journal, which makes it a very powerful tool. | ijustlovemath wrote: | When I was working on a NASA project in university, one of my | colleagues would always preach the virtues of org mode; as a | person whose brain now thinks in vim, is it worth learning | emacs just for this tool? | dhagz wrote: | I was where you are about a month ago. I ended up using | Doom[0] as a bootstrapped config and haven't looked back. | | I love org-mode. It's the killer feature of Emacs in my mind. | I also feel like (note, this is entirely anecdotal and not | based on hard facts) Emacs has better LSP integration than | Vim. I mainly use Go, so it could also be that gopls has | become more stable than it was a year ago when I was first | trying to get Vim working with it. | | [0]: https://github.com/hlissner/doom-emacs | asciimov wrote: | Not op, but yes org-mode is worth it. | | As a vim user, my recommendation is to skip spacemacs, and go | for straight emacs (with evil mode if you like modal editing, | i do) | | I liked Uncle Dave's Tutorial series on youtube: https://www. | youtube.com/channel/UCDEtZ7AKmwS0_GNJog01D2g/vid... | | Start with his emacs tutorial, it is mostly learn emacs and | org mode as you learn to setup emacs. | gnulinux wrote: | Why skip spacemacs? 
I've been in this mentality since early | 2010s and I've been maintaining my own N k line of elisp | script for ~15 years. Last year I installed spacemacs and | TBH although not everything is exactly how I want, it's | refreshing not to maintain my own OS to be able to code. | Spacemacs still makes some things harder but overall I | prefer it to building everything yourself from ground. | Anyway, just my opinion. You can always customize Spacemacs | too, of course it's gonna be more complex than vanilla | emacs. | cyrialize wrote: | Not OP, but I recommend skipping Spacemacs so that people | gain experience with configuration and elisp in Emacs. | | Having some base level understanding makes understanding | other Emacs configurations (like Spacemacs, Doom, | Prelude, etc) much easier in my opinion. | TeMPOraL wrote: | I'm of two minds about it. I get the value it provides to | people new to Emacs. But once you reach the point in | which you'll want to dig in and adapt Emacs to yourself, | you'll be facing not just learning elisp and Emacs, but | also the complex framework Spacemacs built on top of | that. | | I've been using Emacs since early 2010s as well, so I'm | biased - I had my own convoluted elisp modules before | Spacemacs came around :). | asciimov wrote: | Some of the first advice I was given when starting to | learn emacs, was learn vanilla first then try doom or | spacemacs. | | Hearing the love so many have for spacemacs, I started | there first instead. | | Quite early on, I ran into problems. Every time I reached | out on various forums I was told either: you're doing it | wrong, that's a non-issue, RTFM (which isn't helpful when | you don't know what you're looking for), or my favorite | you have an XY problem (I didn't). So I'd go back to vim | and put emacs on the back burner for a while longer, | waiting for spacemacs to mature. | | After the third attempt at spacemacs, I gave up and | started looking for a good emacs tutorial. 
| | Again I ran into some issues, but I found the regular
 | emacs people very welcoming and helpful. Pretty soon I
 | was able to diagnose my own issues and figure out what
 | settings I needed to change to meet my needs.
 |
 | In the end, that early advice was true. You need to have
 | some understanding of emacs to help diagnose spacemacs
 | issues.
 |
 | Will I give spacemacs another shot? Maybe one day,
 | probably around the time they update their main release.
 | It's been what, 2.5 years since they updated the main
 | branch?
 | gnulinux wrote:
 | The vim vs emacs thing is extremely outdated imho. There is
 | evil-mode (Emacs VIm Layer) in emacs, which is an emulator of
 | vim. You can have vim, or emacs, or both, or none, all being
 | equally viable. In fact, there is Spacemacs, which is an emacs
 | distro that is built around evil-mode and comes with a whole
 | bunch of packages out-of-the-box.
 |
 | https://www.spacemacs.org/
 |
 | This is not to preach emacs or vim, really. I'm just
 | saying vim and emacs are _by no means_ mutually exclusive. I
 | personally never got used to vim stuff, so I use Spacemacs
 | with emacs keybindings, and my custom elisp scripts. Emacs
 | really is more of a programming environment/mini operating
 | system than an editor. Enjoy!
 | nextos wrote:
 | I couldn't agree more. Emacs is a text-mode Lisp VM,
 | whereas Vi is a modal editing UI. They are in different
 | categories.
 |
 | Emacs has a great Vi implementation, Spacemacs. Neovim is
 | also a good Vi implementation. Vim, I think, is a bit
 | outdated. For example, VimL scripting is full of quirks.
 | masklinn wrote:
 | Seems like it would make sense to make the file runnable: the
 | "esoteric commands" would work as subcommands / makefile
 | targets of sorts, and the rest of the orgfile would be what
 | they already are.
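The "runnable notes file" suggestion above can be sketched in a few lines of shell: each hard-to-remember command becomes a subcommand. Everything here is an invented placeholder (the commands are echoed rather than executed), just to show the shape:

```shell
#!/bin/sh
# A sketch of the idea: the notes file itself dispatches subcommands,
# so `./notes.sh db-dump` prints/runs the esoteric command for you.
notes() {
    case "${1:-}" in
        db-dump) echo "mysqldump --single-transaction mydb" ;;
        logs)    echo "journalctl -u myapp --since today" ;;
        *)       echo "usage: notes {db-dump|logs}" >&2; return 1 ;;
    esac
}

notes db-dump
```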
| cyberbanjo wrote:
 | In org-mode there's a concept of `tangle`, and you can
 | 'compile' (not sure if that's what org calls it) .org files
 | into a number of individually specified scripts or documents.
 | So you can have your top-level NOTES.org and also your
 | scripts/* entangled.
 |
 | https://orgmode.org/manual/Extracting-Source-Code.html
 | pbiggar wrote:
 | This is exactly what we do for Dark:
 | https://github.com/darklang/dark/tree/main/scripts
 |
 | Not only that, but each script automatically runs itself in our
 | docker container, so it's fully repeatable. They all start with
 |
 |     . ./scripts/support/assert-in-container "$0" "$@"
 |
 | which is just this:
 |
 |     if [[ ! -v IN_DEV_CONTAINER ]]; then
 |         scripts/run-in-docker "${@}"
 |         exit $?
 |     fi
 |
 | And this is run-in-docker:
 | https://github.com/darklang/dark/blob/main/scripts/run-in-do...
 | jason_zig wrote:
 | Did anyone else hold down shift + command and find the
 | achievements part of the site? Fun idea...
 | nicbou wrote:
 | https://nicolasbouliane.com/achievements - for the curious
 | wodenokoto wrote:
 | I have so many common one-liners I use in my current project
 | (that I access using fuzzy search via ctrl-R) that I'm thinking
 | about having a file à la "my-commands" and having it appended to
 | my history, somehow.
 |
 | That would truly be the opposite of this advice.
 |
 | Maybe I need to think about this a bit more :)
 | lytedev wrote:
 | Makefile!
 | blandflakes wrote:
 | You can also use direnv to add aliases when you enter a certain
 | directory.
 | sbt567 wrote:
 | I'm using navi (https://github.com/denisidoro/navi) for
 | commands that are long enough and used less frequently. And it
 | works great.
 | dylan604 wrote:
 | > have it appended to my history, somehow
 |
 | isn't that what .aliases is for?
 | wodenokoto wrote:
 | I honestly don't know.
 | marcosdumay wrote:
 | Shell aliases are your friend.
 |
 | Just give them some memorable names, and add them to your
 | .bashrc.
Or, if they are very context-sensitive (that's not
 | great), there is a way to source a file every time you enter a
 | directory; I just don't remember it.
 | chubot wrote:
 | Shell functions do the same thing as aliases, have fewer
 | parsing problems, and have more flexible arguments.
 | Instead of:
 |
 |     alias ls='ls --color'
 |
 | Better:
 |
 |     ls() {
 |         command ls --color "$@"
 |     }
 |
 | (The "command" prefix avoids recursion)
 | mdiesel wrote:
 | There is a history command, with a way to reload, so this would
 | be possible by writing to bash_history from bashrc, then
 | reloading the history, I think. Not tested.
 |
 | This is a short guide that is an eye-opener as to just how much
 | can be done with bash history:
 | https://www.digitalocean.com/community/tutorials/how-to-use-...
 | camnora wrote:
 | Greenclip [1] works well for this if you're a rofi [2] user.
 | You can set a staticHistoryPath that points to a file. When
 | activating Greenclip, you can search for the desired command.
 | I've been using this on my Linux box for the last year or so
 | and haven't looked back.
 |
 | [1] https://github.com/erebe/greenclip
 |
 | [2] https://github.com/davatorium/rofi
 | gcmeplz wrote:
 | You can set up a per-directory bash history:
 | https://unix.stackexchange.com/questions/305524/create-histo...
 |
 | And a Makefile sounds like it'd be pretty helpful too!
 | jkubicek wrote:
 | I use fishshell, and I've gotten into the habit of creating a
 | function called `t` to run tests. This function captures whatever
 | test command I'm currently using: just test one file, run one
 | test case, use a debugger, capture coverage, etc. I don't save
 | the function, so it doesn't persist across Terminal sessions. If
 | I need to change how I'm running tests, I update the function.
 |
 | It's a small but noticeable improvement over the way I was
 | working before: either up-arrowing until I found the last time I
 | ran tests, or typing `pytest...` and letting autocomplete figure
 | out what I was doing previously.
| | edit: So yes, I am also a big fan of helping to enforce
 | consistency by scripting even the small things.
 | wheybags wrote:
 | I wrote a small python script that I alias in my shell to "b"
 | (for build). When I run it in a given directory, it prompts me
 | for a command if it doesn't have one already saved. Subsequent
 | runs just run the saved command, but it can save a different
 | command for each folder. I use it to clear and remake my build
 | directory using cmake on my c++ projects, with the various
 | compile options saved in as well. It's basically a persistent
 | version of what you describe.
 |
 | Here's the script, if you're interested. It's not super
 | complex.
 | https://github.com/wheybags/stuff/blob/master/build-dir-comm...
 | bauerd wrote:
 | Have a function similar to this that reruns the last test
 | command:
 |
 |     function retest
 |         set --local cmd (history | grep -v history | grep -Em1 "((bundle exec)?rspec|go test|jest)")
 |         echo "Rerunning `$cmd`"
 |         eval $cmd
 |     end
 | Felk wrote:
 | > either up-arrowing until I found the last time I ran tests
 |
 | Just as a substitute for up-arrowing, have you tried
 | ctrl-r/reverse-i-search? (if that's a thing in fishshell)
 |
 | edit: I see wodenokoto already mentioned this workflow in
 | another comment
 | jonfw wrote:
 | Check out FZF for fuzzy-finding through your command history
 | in this manner -- it is absolutely the superior way to search
 | command history.
 | simongr3dal wrote:
 | In fish you can type some of the command and arrow-up. It
 | will show only commands containing that substring.
 | frutiger wrote:
 | So does libreadline (and therefore bash). The following in
 | ~/.inputrc does the trick for vi mode users:
 |
 |     k: history-search-backward
 |     j: history-search-forward
 |
 | There is an equivalent for emacs mode.
 | Macha wrote:
 | If you type before up-arrowing, this is what the up arrow does
 | in fish.
 | ch4s3 wrote:
 | Would you mind sharing a gist of that function?
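The per-directory "saved command" script wheybags describes can be sketched as a plain shell function. The store path and its one-"dir<TAB>command"-per-line format are assumptions for illustration, not taken from his actual script:

```shell
#!/bin/sh
# Sketch of a per-directory remembered command, "b" style: the first run
# in a directory asks for a command and saves it; later runs replay it.
B_STORE="${B_STORE:-$HOME/.b_commands}"

b() {
    dir=$PWD
    # Look up a previously saved command for this directory.
    cmd=$(grep -F "$(printf '%s\t' "$dir")" "$B_STORE" 2>/dev/null | head -n1 | cut -f2-)
    if [ -z "$cmd" ]; then
        # First run here: ask once, remember from then on.
        printf 'command for %s: ' "$dir" >&2
        read -r cmd
        printf '%s\t%s\n' "$dir" "$cmd" >> "$B_STORE"
    fi
    eval "$cmd"
}
```

Sourcing this from a shell rc file gives a `b` command that behaves differently per project directory.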
| dfinninger wrote:
 | I'm assuming they mean something like:
 |
 |     t() {
 |         bazel test //some/specific/thing/...  # or pytest, maven, sbt, etc etc
 |     }
 |
 | Add in whatever options you want to the test command there.
 | Then you just press "t" and it runs your tests.
 | dreamer7 wrote:
 | I have a similar practice too. For most projects I have *.cmd
 | files such as start.cmd, build.cmd and run.cmd.
 | gglitch wrote:
 | Same, although I leave off the extension.
 | https://www.talisman.org/~erlkonig/documents/commandname-ext...
 | ryanianian wrote:
 | Similar notion: Never, ever have more than a simple shell script
 | invocation in your CI configuration.
 |
 | So many projects I've worked on have had nontrivial build logic
 | or implicit assumptions in their CI configuration, which leads to
 | distrust of local reproducibility. Eliminate as many variables
 | and steps to reproducibility as possible. At the very least,
 | create a `ci.sh` script in your project and make all your CI
 | invocations go through that.
 |
 | If you avoid putting "raw" npm or whatever invocations in your CI
 | config in the first place, you won't be tempted to add "just one
 | more option" to your CI config and forget to add that option when
 | running locally.
 |
 | Such a script also becomes a de-facto API for how to use your
 | project. This can help immensely if all of a sudden, for example,
 | you switch programming languages (or major versions), environment
 | assumptions, etc.
 | jolux wrote:
 | Often when you use the CI config instead of shell scripts, it
 | gives you deeper integration with the CI provider, like better
 | error messages and such. I wish there was a way to bridge that
 | gap.
 | rmetzler wrote:
 | I used to run gitlab-runner locally to debug the CI pipeline,
 | but my team now uses a lot of gitlab features to make the
 | pipelines more DRY, and this means CI code is scattered over
 | projects.
 |
 | The easiest way I found to debug it is to put set -x into
 | the code to print the commands with substituted variables.
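A minimal version of the `ci.sh` idea from this subthread might look like the following. The step bodies are placeholders (echoed, not executed); the point is only that the CI config invokes this one script, so a developer can reproduce a CI run with the exact same command. `set -x` doubles as the trace output rmetzler mentions:

```shell
#!/bin/sh
# ci.sh -- the only thing the CI config should run. Developers run it
# locally with the same command: ./ci.sh
set -eu     # abort on the first failing step or unset variable
set -x      # echo each command, which doubles as CI log output

lint()       { echo "would run: shellcheck ./*.sh"; }
unit_tests() { echo "would run: pytest"; }
build()      { echo "would run: docker build ."; }

lint
unit_tests
build
```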
| candiddevmike wrote:
 | I try to have CI only run make targets. Helps to reproduce
 | things really easily.
 | chubot wrote:
 | Yes definitely, many lines of shell in YAML (or Docker files)
 | are one of my pet peeves. It's always better to invoke a shell
 | script. Then you can use ShellCheck on it, your editor will
 | syntax highlight it, you can parse it with Oil (osh -n) [1],
 | etc.
 |
 | And yes, there shouldn't be too much of a difference between
 | what developers do locally and what the CI does.
 |
 | It's sort of an anti-pattern if you have a bunch of automation
 | that only the CI can run. The developer should be able to
 | run it on their machine too, without CI, and without YAML ...
 |
 | [1] http://www.oilshell.org/why.html#what-you-can-use-right-now
 | paulryanrogers wrote:
 | This is the ideal. In practice the CI environment may need
 | optimizations that don't make sense locally without a lot more
 | work.
 | a1369209993 wrote:
 | > have had nontrivial build-logic or implicit assumptions in
 | their CI configuration
 |
 | Wait, what? If your CI isn't testing the builds your bugs are
 | (possibly) in, then what's the point of CI?
 | kmarc wrote:
 | This.
 |
 | OTOH, the monster I'm currently implementing as a set of
 | ci.sh-like scripts is already way too big. It all started as a
 | couple of `kubectl apply` calls, but now I wish it was an
 | ansible playbook.
 |
 | Some other people on the same project abused groovy to the
 | fullest to create some fancypants Jenkins pipeline. Local
 | reproducibility? Zero.
 | athenot wrote:
 | These scripts become documentation. Not about what the commands
 | are doing but _WHY_ they are being run. This in itself brings
 | huge value.
 | dheera wrote:
 | I have tons of such scripts for personal use, not just projects.
 | Here's one I wrote a couple days ago called
 | "jabra-stop-changing-volume-goddamnit":
 |       #!/bin/bash
 |       while sleep 0.1; do
 |           pacmd set-source-volume bluez_source.70_BF_92_CD_77_32.headset_head_unit 60000
 |       done
 |
 | Also, lots of scripts related to ffmpeg and other tools where the
 | command-line arguments are too hard to remember. For example,
 | "ffmpeg-extract-sound-from-all-files-in-dir":
 |       #!/bin/bash
 |       find *.mov | sed -e s/.mov// | xargs --replace=qq --verbose ffmpeg -i qq.mov -acodec pcm_s16le qq.wav
 | can16358p wrote:
 | I think that's the equivalent of abstraction/encapsulation in
 | programming languages. Sometimes even one-liners can be
 | encapsulated in something else, as the implementation might
 | change but the purpose/role doesn't.
 | chubot wrote:
 | Shell functions can also accomplish the same purpose as tiny
 | little scripts. For example, a 2-line shell script can be
 | wrapped in a function instead:
 |       mymove() {
 |         cp --verbose "$1" "$2"
 |         rm --verbose "$1"
 |       }
 |
 | Shell has good abstraction capabilities! It even has some that
 | other languages don't have:
 |
 | _Shell Has a Forth-like Quality_
 | http://www.oilshell.org/blog/2017/01/13.html
 |
 | _Pipelines Support Vectorized, Point-Free, and Imperative Style_
 | http://www.oilshell.org/blog/2017/01/15.html
 |
 | http://www.oilshell.org/blog/tags.html?tag=shell-the-good-pa...
 | jcynix wrote:
 | Yes, shell functions are cool. I use them to augment standard
 | commands too, e.g. to make head(1) or tail(1) output more lines
 | than usual, depending on the terminal's number of lines:
 |       head () {
 |         if [[ $# -eq 0 ]]; then
 |           /usr/bin/head -$[(LINES-1)/2]
 |         elif [[ -f "$1" ]]; then
 |           case $# in
 |             (1) /usr/bin/head -$[(LINES-1)/2] $* ;;
 |             (2) /usr/bin/head -$[LINES*5/12] $* ;;
 |             (3) /usr/bin/head -$[(LINES-1)/3] $* ;;
 |             (*) /usr/bin/head $* ;;
 |           esac
 |         else
 |           /usr/bin/head $*
 |         fi
 |       }
 | deeg wrote:
 | He mentions this in the original article: scripts can be
 | functional documentation.
 | They are an easy way to learn the common commands in a project,
 | can call out the expected workflow, and document all the
 | commands to accomplish it.
 | chubot wrote:
 | Yes, I use shell as executable documentation, and I think it's
 | the best language for it. I outlined these ideas here:
 |
 | http://www.oilshell.org/blog/2020/02/good-parts-sketch.html#...
 |
 | http://www.oilshell.org/blog/2020/07/blog-roadmap.html#more-...
 |
 | As shown there, a lot of people are doing this under different
 | names... I hope Oil can provide something consistent.
 | cgarvis wrote:
 | I've been doing this with Makefiles for a while. It has always
 | bugged me that I'm not "making" an artifact. Think I'm going to
 | try this `run.sh` approach.
 | chubot wrote:
 | Yeah, as mentioned in the posts, make targets that are verbs
 | rather than nouns should really be .PHONY, but most people
 | forget that.
 |
 | While the idea of having Make's dependency engine is nice in
 | theory, it falls down for one-off automation in my experience,
 | for a couple of reasons:
 |
 | (1) Make's dependency model has some well known deficiencies.
 | It doesn't play well with tools that produce two files. It
 | doesn't play well with tools that produce a directory tree of
 | "dynamic" filenames (not known when you write the Makefile)
 |
 | (2) Most makefiles have bugs, especially when you do make -j
 | (parallel builds). It's basically like writing a C program with
 | a bunch of threads racing on global variables -- your Make
 | targets will often be racing on the same file, leading to
 | non-deterministic bugs.
 |
 | -----
 |
 | So IMO it's better to mostly stick with the sequential model of
 | shell for this kind of project automation.
 |
 | But shell can invoke make! When you know your dependencies,
 | invoke them from run.sh!
 | And when you're REALLY sure it's correct, invoke make -j :)
 |
 | In other words, shell is my default, and make is only for when
 | I want to spend the effort to write dependencies -- which is
 | quite difficult, because Make provides you virtually no help
 | with that. Bugs in dependencies are common and hard to find. If
 | you're trained to run "make clean", then that's a symptom of a
 | bug in the build specification.
 |
 | Shell "gets shit done" without these types of bugs. Debugging a
 | shell script is very easy compared with debugging a makefile.
 | The remaining problems with shell will hopefully be fixed by
 | https://oilshell.org/ :)
 |
 | I do want to add some dependency support, but I didn't get to
 | it:
 |
 | _Shell, Awk, and Make Should Be Combined_
 | http://www.oilshell.org/blog/2016/11/13.html
 |
 | Make is just a small elaboration on the shell model (concurrent
 | processes and files), but unfortunately it's implemented as a
 | completely separate tool that shells out to shell! _facepalm_
 | fao_ wrote:
 | > (1) Make's dependency model has some well known deficiencies.
 | It doesn't play well with tools that produce two files. It
 | doesn't play well with tools that produce a directory tree of
 | "dynamic" filenames (not known when you write the Makefile)
 |
 | > (2) Most makefiles have bugs, especially when you do make -j
 | (parallel builds). It's basically like writing a C program with
 | a bunch of threads racing on global variables -- your Make
 | targets will often be racing on the same file, leading to
 | non-deterministic bugs.
 |
 | mk(1) makes it easier to avoid bugs and easier to comprehend
 | what the makefile is doing; it also allows you to invoke
 | programming languages for specific targets as a feature of the
 | Makefile, and other goodies :)
 | 3pt14159 wrote:
 | What I do is make the same scripts from project to project, then
 | have helpful aliases to run them quickly. For example, since I
 | do TDD and often work on a single test file for a while, I have
 | a script in every project called:
 |       bin/run
 |
 | Then when I want to run it I just type the letter r, since I've
 | aliased r to run ./bin/run. It's super fast. To keep source
 | control clean I add it to:
 |       .git/info/exclude
 |
 | which allows my .gitignore to be the same as everyone else's. I
 | used to use the global gitignore file, but I had issues at
 | times.
 | thenonameguy wrote:
 | In non-trivial projects, I found the cost/benefit ratio of
 | adopting https://github.com/casey/just as an alternative to
 | bash scripts in script folders to be worth it.
 | Galanwe wrote:
 | Seems like a great little tool.
 |
 | Though it looks a bit too young for my taste: it's not
 | available in most distros' base repositories yet, so it's going
 | to be a tiny bit painful to deploy on every developer laptop,
 | CI, etc. I tend to prefer readily available tools like make,
 | with 90% of the same features, but 100% distro coverage and
 | previous developer knowledge.
 | Galanwe wrote:
 | It's a great habit.
 |
 | I use that same concept extensively for all my projects,
 | whatever the language/stack, though I prefer to use Makefiles
 | since I feel like it creates a more cohesive result. Very easy
 | to chain, depend on, and read in the CI too.
 | dahfizz wrote:
 | +1 for make. It's an amazing and underrated tool.
 | jscheel wrote:
 | It took me forever to come around on make, but I absolutely
 | love it for things like this now.
 | narwally wrote:
 | Any recommendations on learning to use make more effectively
 | outside of just reading the docs?
 | I feel like this is a glaring gap in my knowledge, as I really
 | only know the basics.
 | jscheel wrote:
 | Hmm, unfortunately I don't. I didn't really bookmark anything
 | when I was learning. IIRC, I spent most of my time in the docs
 | though.
 | makapuf wrote:
 | I'd say GNU make (don't use any other make) can be learned from
 | the documentation; it's readable. Also, it's super simple to
 | try things yourself (unlike, say, k8s by example): just use it
 | in your project and use the doc as needed. I am very fond of
 | make, to the point that I built a video transcoding solution
 | based on it and ffmpeg.
 | TeMPOraL wrote:
 | Related trick: GNU tools tend to come with Info pages - it's to
 | man pages what a book is to a listicle. These Info pages can
 | get you up to speed quickly, while giving you a deeper
 | understanding of the tool at the same time.
 | sneak wrote:
 | Read a lot of other people's Makefiles. I tend to shy away from
 | huge/overly complex ones, but there are a lot of good examples
 | out there where I got most of my ideas/habits/patterns from.
 | JamesSwift wrote:
 | Yes, anything I do public facing is likely to use make/docker.
 | It really cuts down on the mental energy I need to expend
 | thinking about differing environments/bootstrapping.
 | pletnes wrote:
 | I love this! Used to do it, too, but currently we're stuck on
 | Windows. What do you do then? (IT blocks WSL, so not that)
 | JamesSwift wrote:
 | GNU make is available on Windows. If you run it via git-bash
 | then it should mostly "just work" (though there will be some
 | idiosyncrasies to work out). I tested this project and it all
 | runs on linux/osx/windows: https://github.com/J-Swift/cod-stats
 | necrotic_comp wrote:
 | Can you give an example of how you use make for simple things
 | like this? I want to get into it for non-C++ projects, and
 | it's a little daunting to make the leap.
 | cgarvis wrote:
 | Here is a starting point for an AWS SAM Golang project. It even
 | comes with a `make help` command.
 |
 | https://gist.github.com/cgarvis/61d70eeb1288bfee540c59cad095...
 | necrotic_comp wrote:
 | Thanks!
 | sneak wrote:
 | Here's one of mine for a project I'm working on today:
 |
 | https://git.eeqj.de/sneak/mothership/src/branch/master/Makef...
 | choeger wrote:
 | Consider the task of doing _anything_ with your software as a
 | program in a very specific DSL. Then the scripts are your
 | verbs. What you also need are your nouns.
 |
 | For instance, deploying to a specific target could be
 | "./scripts/deploy.sh stage", backporting a patch could be
 | "./scripts/patch_release.sh 1.1.0 dbcde45", and creating a
 | database migration script could be "./scripts/db_changed.sh
 | 'add new field for model'".
 |
 | IMO thinking about the verbs is the first important step, but
 | one should also always specify the nouns explicitly.
 | ed25519FUUU wrote:
 | > _backup-db.sh_
 |
 | Just a small nitpick. I'd like it if we collectively moved away
 | from including file extensions for scripts. You never know when
 | you want to rewrite it in Python or do something else. Nothing
 | more confusing than opening up "backup.sh" only to find it's
 | actually a ruby script and must be executed.
 | thefifthsetpin wrote:
 | I handle that by linking src/backup.sh as bin/backup. Then if I
 | rewrite it as src/backup.py, I just change the bin/backup link
 | to point there instead.
 | dvdgsng wrote:
 | We usually end up implementing the same scripts in shell, cmd,
 | and PowerShell, since some Windows folks prefer not to install
 | Cygwin or use WSL. It's a PITA to maintain, but doable if the
 | scripts are simple and only check for requirements, and the
 | actual work is done by Python, Groovy, Go, whatever.
 | ultra_nick wrote:
 | Why break tasks up into 10-line files when you could break them
 | into 10-line Python/similar functions?
 |
 | Same advantages as above, plus a more powerful cross-platform
 | language, and cross-task references can be linted:
 |       def run(params):
 |           """Run a built binary."""
 |           subprocess.run("./" + params.path)
 |
 |       ...
 |
 |       if __name__ == '__main__':
 |           # Take CLI options and run the selected function.
 |           # Parse CLI arguments
 |           # Call functions with options
 | masukomi wrote:
 | Because it's not the unix way, and the unix way has a LOT of
 | value.
 |
 | Because not all the tools you use in your system can import
 | your python/similar file. Your deploy pipeline could involve 10
 | languages running under multiple OSes/versions.
 |
 | Or to put it another way... because the whole world doesn't run
 | in your favorite programming language.
 ___________________________________________________________________
 (page generated 2020-09-22 23:01 UTC)