[HN Gopher] Julia 1.6 addresses latency issues
       ___________________________________________________________________
        
       Julia 1.6 addresses latency issues
        
       Author : leephillips
       Score  : 120 points
       Date   : 2021-05-25 18:07 UTC (4 hours ago)
        
 (HTM) web link (lwn.net)
 (TXT) w3m dump (lwn.net)
        
       | smabie wrote:
       | So I used to be a big proponent of Julia, and in some ways, I
       | still am. But I very recently tried to write a high performance
       | production system in it, and was sorely disappointed. The tooling
       | is just so buggy and it's clear that the community isn't really
       | interested in using it for anything besides modeling/research in
       | a Jupyter notebook.
       | 
       | Things that kind of suck about using Julia for production:
       | 
       | 1. Never could get Revise to work, had to restart my REPL
        | every time I changed any code. Even though Julia 1.6 was a lot
       | faster than 1.5, it still took too long.
       | 
       | 2. Couldn't find a static type checker that actually worked (I
       | tried JET and StaticLint). I feel like static typing is just so
       | important for a production system, but of course the community
       | isn't really interested because of the research focus.
       | 
       | 3. Editor tooling. The LSP server absolutely sucks. I first tried
       | using it with emacs (both lsp-mode and eglot mode), but it would
        | crash constantly. I then switched to VSCode (much to my
       | chagrin), and that worked marginally better though still very
       | poorly. It was clear that the LSP server had no idea what was
       | going on in my macro heavy code. It couldn't jump to definitions
       | or usages much of the time. It could never correctly determine
       | whether a variable was unused or misspelled either. Coupled with
       | the lack of static type checking, this was _extremely_
       | frustrating.
       | 
       | 4. Never felt like the community could answer any of my
       | questions. If you have some research or stats question, they were
       | great, but anything else, forget about it.
       | 
        | With all of that said, I do still use Julia for research
       | and I find it works really well. The language is very nicely
       | designed.
       | 
        | All in all, I decided to ditch Julia and go with Rust
       | (after some consideration of OCaml, but unfortunately the multi-
       | core story still isn't there yet) and am a _lot_ happier.
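A minimal sketch of the Revise workflow referred to in point 1, assuming a hypothetical package `MyPackage` under development (Revise must be loaded before the code it should track):

```julia
using Revise          # load first, so later `using`s are tracked
using MyPackage       # hypothetical package under development

MyPackage.run()       # call it once

# ...edit MyPackage's source files in your editor...

MyPackage.run()       # when Revise works, the edited methods are
                      # picked up here with no REPL restart
```

When this works as intended, only struct redefinitions force a restart; the complaint above is that in practice every change did.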
        
         | Tainnor wrote:
         | I would also add:
         | 
         | 5. The module system is very primitive.
         | 
         | 6. The testing framework is extremely barebones.
         | 
         | I agree with your assessment, Julia is great for crunching
         | numbers etc., but I wouldn't write a whole application in it.
        
       | Buttons840 wrote:
       | I'm a big fan of Julia. It does live up to its speed claims. I've
       | implemented the board game Go in Python, Rust, and Julia and
       | Julia is definitely closer to Rust in speed. Same algorithms were
       | used for all implementations.
       | 
       | Julia's time to first plot still has some problems. The Plots
       | library can build animations, but the time to first animation on
       | my computer is like 10 minutes, and the time to second animation
       | is another 10 minutes. Probably just a bug, I haven't found any
       | other case that takes so long.
       | 
        | I've also mentioned before that a reinforcement learning
        | algorithm I ported from Python/PyTorch to Flux was faster, not
        | because of training times, but because all the other stuff
        | (and RL has more "other stuff" than supervised learning) that
        | goes on outside the core training loop is so much faster.
        
         | throwaway894345 wrote:
         | > I've implemented the board game Go in Python, Rust, and Julia
         | and Julia
         | 
         | Oof. I reread this several times consecutively as "I've
         | implemented the board game in Go, Python, Rust ...".
        
           | TchoBeer wrote:
           | Same, and I literally have a correspondence game going on in
           | another tab
        
         | clarkevans wrote:
         | Multiple dispatch and generic programming make Julia a
         | productive language to work with. However, a given program may
         | have unnecessary function specializations (which affect startup
         | compile time) or unexpected dynamic dispatches (which affect
         | runtime performance). These can be addressed with some
         | important development patterns: checking for type stability via
         | @code_warntype, using opaque structs or @nospecialize when
         | appropriate, etc. I've found the Julia community to be very
         | helpful with regard to performance on the forums, slack, and
         | zulip.
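The type-stability check mentioned above can be sketched with a hypothetical function; non-`const` globals are a classic cause of the instability that `@code_warntype` flags:

```julia
x = 1.0                  # non-const global: its type may change at any time
addone(v) = v + x        # return type can't be inferred -> dynamic dispatch

# julia> @code_warntype addone(2)
# ...
# Body::Any              # abstract `Any` (shown in red) flags instability

const y = 1.0            # const global: the compiler knows its type
addone_stable(v) = v + y

# julia> @code_warntype addone_stable(2)
# ...
# Body::Float64          # concrete inferred type: stable
```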
        
           | ku-man wrote:
            | This is exactly the problem with Julia: to achieve those (so
            | much-vaunted) C-like speeds you need quite a few contortions.
        
         | dunefox wrote:
         | > but the time to first animation on my computer is like 10
         | minutes
         | 
         | Have you tried 1.6 already? I find it's substantially faster.
        
           | Buttons840 wrote:
           | Yes, I was doing it just this week with Julia 1.6.
        
         | vavooom wrote:
         | Do you have any documentation / Github repos where you build
         | those Go implementations? I am a huge fan of the game and would
         | be curious to see how you built it, specifically in Python /
         | Julia.
        
           | Buttons840 wrote:
           | https://github.com/DevJac/gobot
           | https://github.com/DevJac/julia_gobot
        
       | krastanov wrote:
       | I am very confused by the claim in the first sentence of the
       | article: "On March 24, version 1.6.0 of the Julia programming
       | language was released. This is the first feature release since
       | 1.0 came out in 2018". How is 1.6 a "feature release", but
       | 1.1-1.5 are not!? Especially given the enormous new set of multi-
       | threading features in 1.3.
       | 
        | Edit: ah, thanks for the response, it seems I just did not
        | know the difference between "feature" and "timed" release.
        
         | leephillips wrote:
         | That's the terminology used in the development process. The
         | releases > 1.0 and < 1.6 are called "timed releases". It
         | doesn't mean they don't contain any new features.
        
       | gugagore wrote:
       | Speeding up compilation itself is one approach to the latency
       | issue. And there's also the idea of blending compilation and
       | interpretation using e.g.
       | https://github.com/JuliaDebug/JuliaInterpreter.jl .
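For reference, JuliaInterpreter.jl exposes this blend through an `@interpret` macro that evaluates a call in the interpreter instead of compiling it; a minimal sketch:

```julia
using JuliaInterpreter

sum(1:10)              # normal call: pays JIT compile latency on first use

@interpret sum(1:10)   # interpreted call: same result, no native codegen
```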
        
         | Zababa wrote:
          | V8 reduced the start-up times of WebAssembly this way, but with
         | a single pass compiler instead of an interpreter. Here's the
         | article: https://v8.dev/blog/liftoff
        
           | ufo wrote:
            | One difference is that WebAssembly is typed and is designed
           | to make these compilers possible.
           | 
           | In the context of Javascript, V8 did the opposite. Originally
           | they had a baseline compiler for Javascript but now they use
           | an interpreter, which reduces the startup latency.
        
       | up6w6 wrote:
        | I use and love Julia, but I really want to see the general-
        | purpose language that is claimed. On one hand, you see amazing
        | scientific libs like DifferentialEquations.jl; on the other,
        | things like the PackageCompiler.jl mentioned in the article
        | just suck at generating binaries for daily use.
        
         | krastanov wrote:
         | Isn't "generating binaries" just as bad for other interpreted
          | (interpreted-ish) languages? If you generate a "python binary",
         | you need to package python with your binary. Same for
         | perl/ruby. It just seems weird that people expect julia to be
         | able to do that. It is cute that PackageCompiler.jl exists and
         | it is cute that more AOT compilation work is being currently
         | done, but it seems crazy to expect Julia to be good at making
         | binaries (and I would say that about python and perl too).
         | 
         | And by extension, it seems weird to me to complain that Julia
         | is not a general purpose language because it can not generate
         | binaries. What stops me from making the same statement about
         | python, which is definitely general purpose?
        
           | orbots wrote:
            | Julia claims to "solve the two-language problem", i.e.
            | prototype in Python, rewrite in C++. The two-language problem
           | is not solved with Julia if you can't effectively generate
           | binaries.
        
             | krastanov wrote:
             | I have never really heard the name "two language problem"
             | to refer to what you are describing. Whenever I have heard
             | these words it has referred to "I want a high-productivity
             | newbie-friendly introspective language like python, but I
             | do not want to write C modules when I need fast inner
             | loops". Julia seems to solve this already, without
             | providing compact binaries.
             | 
             | A sibling comment made a point about "compiling down to
             | shared libraries" which seems similar to what you are
             | describing, but that seems like it has little to do with
             | "the two language problem".
        
               | orbots wrote:
                | Right. It used to be referenced on the front page of
                | julialang.org. Seems they don't really use that in the
                | sales pitch anymore. Maybe that proves my point. It's easy
                | to find references to Julia claiming to solve the two-
                | language problem, though. I am someone this two-language
                | problem they speak of applies to.
               | 
               | I love Julia. Which is why it's so painful that I have to
               | rewrite all my elegant Julia prototype code in C++, so I
               | can compile into a shared lib for the users. Every.
               | Single. Time. Two languages.
               | 
               | Now that it isn't the main front and centre claim, I feel
               | a bit less bitter about using it as a prototyping
               | language.
               | 
               | Waiting another 5 years and maybe it really will solve
               | the two-language problem.
        
           | up6w6 wrote:
           | > And by extension, it seems weird to me to complain that
           | Julia is not a general purpose language because it can not
           | generate binaries. What stops me from making the same
           | statement about python, which is definitely general purpose?
           | 
            | I agree that generating binaries doesn't make a language
            | general purpose; I just tried to give an example of an ad
            | hoc, non-scientific thing that is considered "important" to
            | the community (it's an official project) but that is stuck.
            | The obvious move would be to just list the web frameworks,
            | but I don't think that's fair, simply because there is no
            | interest in them (yet).
        
             | hpcjoe wrote:
             | A non-scientific thing I've been doing for the last few
             | months at the day job, with Julia.
             | 
              | 1) Querying a time-series database of system metrics at
             | scale for (large) fleets. This is being done via a JSON
             | API. Directly in Julia.
             | 
             | 2) Creating data frames from these queries, and performing
             | fleet wide analytics. Quickly. Millions to hundreds of
             | millions of rows in the data frames, typically 4-20
             | columns. Directly in Julia, no appeal to a 2nd language.
             | 
              | 3) Leveraging the power of the language to post-process
             | these data sets before analysis, to remove an
             | "optimization" that reduced data quality.
             | 
              | 4) Operating quickly on gigabytes of queried data, threading
              | and sharding my requests, as the server can't handle large
              | requests, but it can handle parallel ones. Poor design, but
              | I can work around it ... trivially ... with Julia.
             | 
              | 5) Creating JupyterLab notebooks for simple consumption of
             | these more complex data sets by wider audiences, complete
             | with plots, and other things.
             | 
             | No science done here ... well ... data science maybe ...
             | and this is specifically in support of business analytics,
             | process optimization, etc.
             | 
             | Julia is an excellent language for this, 10 out of 10,
             | would recommend.
        
               | neolog wrote:
               | What's a fleet in this case?
        
           | reikonomusha wrote:
           | It's not at all bad for Common Lisp, which is a superlatively
           | interactive language.
        
             | krastanov wrote:
             | Could you elaborate or give examples? I guess all the
             | complaints about binaries come from people used to
             | something like Common Lisp, but while I have a general
             | understanding of what a lisp is, this type of "provide a
             | binary for your interpreted-ish language" is an incredibly
             | foreign idea to me.
        
               | reikonomusha wrote:
               | Common Lisp was designed to be interactive and have a
               | REPL. You can redefine functions, classes, etc. on the
               | fly with strictly-defined semantics. (You don't have to
               | guess what happens if you, say, re-name a field of your
               | class.) This is _insanely_ useful during development,
               | where you absolutely want to avoid doing full recompiles
               | every time you make a little change you want to test.
               | Some people call this "interactive and incremental
               | development". (Of course, you always have the option to
               | just re-compile everything from scratch if you so
               | please.)
               | 
               | Common Lisp was also designed to be compiled. Most
               | implementations these days compile to machine code.
               | Compilation is incremental, but ahead-of-time. That means
               | you can start running your program without having yet
               | compiled all the features or libraries you want. You can
               | --while you're in the REPL or even while your program is
               | running--compile and load extra libraries later.
               | Compilation is cached across sessions, so you won't ever
               | have to recompile something that doesn't change.
               | 
               | Despite Lisp being mega-interactive, incremental, and
               | dynamic, just about every implementation of Lisp allows
               | you to just write out a compiled binary. In the
               | implementation called "Steel Bank Common Lisp" (SBCL),
                | from the REPL, you just write:
                | 
                |     (sb-ext:save-lisp-and-die "mycoolprog" :executable t :entry-point 'main)
               | 
               | which will produce a statically linked executable binary
               | called "mycoolprog" using the function called "main" as
               | the entry point.
               | 
               | Unless you've specifically programmed it in, there will
               | be no JIT lag, no runtime compilation, etc. It will all
               | just be compiled machine code running at the speed of
               | your processor. (It's possible and even easy to invoke
               | the compiler at run-time, even in your static binary,
               | which people _rarely_ do, and when they do, they know
               | exactly what they're doing and why.)
               | 
               | All of this is a complete non-issue in Lisp, and hasn't
               | been for about 35 years (or more).
        
               | superdimwit wrote:
               | I am hopeful that Julia should be able to get this cross-
               | session caching of compiled code. Would make restarting
               | the REPL (to e.g. add a field to a struct) much less
               | frustrating.
        
               | reikonomusha wrote:
               | Yeah, restarting a Lisp REPL after you've compiled your
               | code is transparently essentially instantaneous, because
               | everything is cached and checked for changes, and 99% of
               | the time most of your code and nearly all of your
               | dependencies aren't changing hour to hour.
        
               | adgjlsfhk1 wrote:
               | Caching is easy. The hard part here is correctly
               | invalidating the cache. Specifically, if a user is using
               | different libraries between sessions, figuring out which
               | methods have been overridden (or inline code that was
               | overridden) becomes complicated.
        
               | nightfly wrote:
               | The way that common lisp does this though is pretty much
               | creating a core dump that you can execute, which isn't
               | what most people are expecting from an executable. It's
               | not a _bad_ way, it's just pretty unique to common lisp.
        
               | reikonomusha wrote:
               | What exactly are people "expecting" from an executable?
               | It is a piece of binary code on disk, full of machine
               | code, that runs without a VM or external runtime
               | libraries.                   ./mycoolprog
               | 
               | just works. From a user perspective, there's no
               | difference.
               | 
               | The binary itself isn't structured like a typical one
               | with C debugging symbols, etc. But it's also not some
               | "faux binary" like a bunch of .pyc bundled as data and
               | unzipped when the program runs. It truly is machine code,
               | just arranged differently than a bunch of C ABI
               | functions.
               | 
               | I claim most people running binaries don't care about the
               | memory layout of the binary. I certainly am never
               | thinking about that every time I run `grep`. You don't
               | debug Lisp programs with C's tooling. You use Lisp's
               | tooling.
               | 
               | (Unless, of course, you use an implementation like
               | Embeddable Common Lisp, which _does_ compile your Lisp
               | program as a C program, and _does_ produce a non-image-
               | based executable. That's the beauty of Lisp being a
               | standardized language with multiple conforming
               | implementations.)
        
           | adgjlsfhk1 wrote:
           | The reason Julia should be able to do this is that it uses
           | LLVM to generate machine code "just ahead of time". As such,
           | (at least for type stable code), it should be possible to
           | save the code we generate. The main place where static AOT
           | matters for Julia isn't full applications, but libraries.
           | Being able to generate static libraries would allow Julia to
           | replace C++ and Fortran much more fully in places like python
           | libraries. Furthermore, this capability is likely crucial in
           | getting major further improvement in time to first plot.
              | Currently `@time using DifferentialEquations` takes about 11
           | seconds on my computer, but if more of the code could be
           | statically compiled at precompile time, that could be reduced
           | dramatically.
        
             | krastanov wrote:
             | This is the first time I see my confusion so clearly
             | addressed! Thanks, this makes total sense now!
        
             | neolog wrote:
             | > The main place where static AOT matters for Julia isn't
             | full applications, but libraries.
             | 
             | That depends on the use case. With improvements in static
             | compilation, julia could probably be a good application
             | language. Game development would be an interesting market.
        
             | galenlynch wrote:
              | This is true for many functions, but AFAIK the LLVM code is
             | only generated for a function paired with the types of the
             | arguments that it was called with. Since Julia functions
             | are for the most part 'generic' and work with a wide range
             | of argument types, you would have to restrict the compiled
             | binary or library to a specific set of argument types. Some
             | functions also have type instability and can't be made into
              | pure LLVM.
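A small sketch of that per-argument-type specialization (hypothetical function):

```julia
f(x) = x + one(x)     # one generic method, usable with many types

f(1)                  # compiles a native specialization for f(::Int)
f(1.0)                # a second specialization for f(::Float64)
f(big"1.5")           # a third for f(::BigFloat)

length(methods(f))    # still 1 method; the compiled instances are per-type
```

An AOT-compiled library would have to pick, up front, which of these concrete-type instances to bake in.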
        
           | Zababa wrote:
           | > Isn't "generating binaries" just as bad for other
            | interpreted (interpreted-ish) languages?
           | 
           | I think this is the case for at least the most popular JIT'd
           | languages: Java, C#, JS, and PHP. Also for the most popular
           | interpreted languages: Python, Ruby and also PHP. I don't
           | know about Visual Basic and R though.
           | 
           | I know that an exception is Dart, that combines a JIT and an
            | AOT. I think Emacs Lisp can now also be compiled, but I don't
           | know if it works with all the code and is just free
           | performance, or something more limited.
           | 
            | Edit: as pointed out by pjmlp, Java and C# already combine an
           | AOT and a JIT. What I meant by the comment on Dart is that it
           | can either be run with a VM or compiled to produce binaries.
        
             | pjmlp wrote:
              | Java and C# have also combined JIT and AOT since their
              | inception, .NET more so.
             | 
             | Other examples are Lisp and Scheme variants, Eiffel, OCaml,
             | Haskell, Prolog.
        
               | oblio wrote:
               | The main SDKs and programming paradigms for Java and C#
               | both don't mesh well with AOT, though. Reflection, heavy
               | reflection based frameworks.
               | 
               | Not that many places use Java/C# AOT compilation, except
               | for games/iOS apps.
               | 
               | Almost every place I've seen using Java/C# was using JIT.
        
               | pjmlp wrote:
               | Android uses a mix of JIT/AOT, just as most Java embedded
               | development.
               | 
               | As for not everything being supported, well that is no
               | different from having C++ code with RTTI and exceptions
               | disabled, or being forced into a specific linking model
               | due to possible problems with a third party dependency.
        
           | neolog wrote:
           | > Isn't "generating binaries" just as bad for other
            | interpreted (interpreted-ish) languages?
           | 
           | Python is famously bad at this. I hope Julia's proponents
           | don't stop at "look we're only as bad as Python".
        
       | adsharma wrote:
       | For me the issue manifested as a 10 sec latency to format a Julia
       | file using Format.jl
       | 
        | I solved it via flags that dial back the JIT, which brought it
        | down to a couple of seconds. A native binary would be much
        | nicer.
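The flags in question are presumably along these lines (a sketch: `--compile=min` avoids most native code generation and `-O0` disables optimization, trading steady-state speed for startup latency; `format_script.jl` is a hypothetical name):

```shell
# Normal invocation: full JIT compilation on first use.
julia format_script.jl

# Low-latency invocation: minimal compilation, no optimization.
julia --compile=min -O0 format_script.jl
```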
        
         | enriquto wrote:
         | Two seconds to process a tiny text file enters well into the
         | realm of "completely unusable" in my eyes.
         | 
         | It wouldn't be so bad if the Julia developers acknowledged that
         | this is a valid concern (that they are not dealing with it
         | right now for whatever reasons) and that the ecosystem will not
         | be considered complete until this fundamental problem is
         | solved. But this is infuriatingly _not_ the case. Instead, they
         | tell you that you are  "holding it wrong" and that this is not
         | really a problem, that your usage is "niche", that the
         | interpreter is alright as it is, and that the time to first
         | plot is unlikely to ever go below a hundred milliseconds. I
          | find it really depressing, because the language itself is incredibly
         | beautiful. My only hope is that an independent, fast and unix-
         | friendly implementation of the language arises, thus freeing
         | the reference implementation of the efficiency burden and
         | allowing it to be simpler. Something like the lua/luajit split.
        
           | sgt101 wrote:
           | I'm not saying you are wrong - but why 100ms and not 50ms or
           | 200ms or 2000ms?
        
           | KenoFischer wrote:
            | I promise the ecosystem will not be considered complete until
           | this fundamental problem is solved. That said, you can have
           | time-to-first plot below a hundred milliseconds right now if
           | you put Plots into your system image - that's always been an
           | option. System images have workflow issues which is why
           | they're not used more.
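The system-image route looks roughly like this with PackageCompiler.jl (a sketch; assumes Plots and PackageCompiler are installed, and the image build itself takes several minutes):

```julia
using PackageCompiler

# Bake Plots and its compiled code into a custom system image.
create_sysimage([:Plots]; sysimage_path="sys_plots.so")
```

Starting Julia with `julia --sysimage sys_plots.so` then makes `using Plots` and the first plot nearly instant; the workflow issue is that the image goes stale whenever Plots or its dependencies are updated.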
        
             | enriquto wrote:
             | Sounds great, thanks! That is certainly reassuring to hear.
             | I'm very happy to see Julia evolving.
             | 
             | EDIT: also, sorry for the mis-characterization of Julia
             | developers! I may have dealt until now with users and
             | "fanboys" not real devs.
        
           | DNF2 wrote:
           | I wonder where you got the impression that latency and
           | precompilation performance are not valid concerns. This has
           | been the _main_ focus area for the devs for a long time. It's
           | pretty much all anyone has been talking about for over a
           | year, and serious improvements have been made.
           | 
           | Here's a blog post that goes into some detail about the
           | ongoing efforts to improve compiler latency:
           | https://julialang.org/blog/2020/08/invalidations/
        
             | enriquto wrote:
             | > I wonder where you got the impression that latency and
             | precompilation performance are not valid concerns.
             | 
             | Now that you ask it, I realise it's been mostly through a
             | few HN interactions! Every time I raised the issue in Julia
             | posts over the last few years, I have been consistently
             | ridiculed by purported Julia defenders. For example, in
             | this very thread you can find a case of that.
        
           | krastanov wrote:
           | > the time to first plot is unlikely to ever go below a
           | hundred milliseconds
           | 
           | How is that controversial or disappointing!? Why would anyone
           | bother optimizing this? Nor is matlab/octave/python any
           | faster.
           | 
           | `python3 -c "import matplotlib.pyplot as plt;
           | plt.plot([1,2,3]);"` takes 600ms on my (powerful) workstation
           | and that does not even include creating the plot window.
           | 
           | To be clear, I do believe there is much more work to be done
           | to decrease latency in Julia, but your targets are
           | ridiculous. And as a regular on their bugtracker and forum,
           | the devs definitely acknowledge these issues and have many
           | times said it is one of their main priorities.
           | 
           | By the way, if you want streaming live updated plots, this
           | latency to first frame is not a problem. It is already
           | straightforward to make such fast live plots in Julia
           | (although it does not fit my personal taste for how to do
           | it).
        
             | enriquto wrote:
             | > your targets are ridiculous
             | 
             | Only because you are not used to somewhat fast programs:
                | 
                |     $ /usr/bin/time -v gnuplot -e 'set term png; plot sin(x)' > sin.png
                |     ...
                |     User time (seconds): 0.02
                |     System time (seconds): 0.00
        
               | krastanov wrote:
               | Come on, it is silly to compare a full general purpose
               | language against a special-purpose tool. Yes, grep is
               | also better than julia at searching for a string in a
               | file.
               | 
               | Julia is a terrible replacement for gnuplot and gnuplot
               | is a terrible replacement for julia.
        
               | ku-man wrote:
               | "... it is silly to compare a full general purpose
               | language against a special-purpose tool..."
               | 
                | Wait, wait... isn't Julia a general purpose language? At
                | least that is what the fanboys keep repeating ad nauseam.
        
               | enriquto wrote:
               | Sure! That's why I set a somewhat reasonable target at
                | being just 5x or 10x slower than a specific-purpose
               | plotting tool. But even _that_ is considered to be
               | chimerical! (or, in your words,  "ridiculous")
        
               | krastanov wrote:
               | Now you are just putting words in my mouth. I would
               | completely agree that 10x latency to first plot is a
               | reasonable target (i.e. first plot in about a second,
               | like you get in python and much faster than what you get
               | in matlab/mathematica). And plenty of devs closer to core
               | of Julia and the plotting libraries in it would agree.
               | 
               | And to be clear, I do expect my second plot to be ready
               | in tens of milliseconds.
        
               | reikonomusha wrote:
               | Why is it such a ridiculous comparison? Gnuplot is still
               | interpreting a language, loading plotting code, etc. If
               | Julia folks wanted to, they could bundle pre-compiled
               | plotting code that loads as fast as memory moves bytes.
               | They don't want to, of course, likely because it's
               | inelegant, but they could, and a general-purpose language
               | doesn't stop them from doing that.
        
               | krastanov wrote:
               | You can already bundle pre-compiled plotting code in your
               | Julia sys-image if you want. But Julia is not a plotting
               | tool so it would be ridiculous to optimize it just for
               | plotting. I want ODE solvers to have less latency, should
               | I start expecting gnuplot to have built-in ODE solvers or
               | the official installer of Julia to have the ODE libraries
               | pre-compiled?
               | 
                | Maybe this example would make it clearer: why does your
                | argument not apply to Python? Should we expect Python
                | libraries to come pre-cached, so that the first time I
                | load `sympy` I do not need to wait tens of seconds for
                | the .pyc files to be created? Or what about MATLAB?
               | 
                | Again, I am all on board with the idea that Julia needs
                | lower latency, and if you look at what its devs say,
                | they agree. But expecting Julia to be super low-latency
                | (lower latency than Python/C/MATLAB) for your pet task
                | is silly.
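The `.pyc` point above can be made concrete: Python's standard `compileall` module does exactly this kind of ahead-of-time bytecode caching, and a distributor can run it at install time so a first `import` skips compilation. A minimal sketch (the `mymod.py` module here is a made-up stand-in for a real library, not anything from the thread):

```python
import compileall
import os
import tempfile

# Create a throwaway "library" directory with one module, then
# pre-compile it to cached .pyc bytecode, the way `python -m compileall`
# does for installed packages.
with tempfile.TemporaryDirectory() as d:
    with open(os.path.join(d, "mymod.py"), "w") as f:
        f.write("X = 42\n")
    ok = bool(compileall.compile_dir(d, quiet=1))
    has_pyc = any(
        name.endswith(".pyc")
        for name in os.listdir(os.path.join(d, "__pycache__"))
    )

print(ok, has_pyc)  # both True: the bytecode cache was written ahead of time
```

This is the moral equivalent of shipping a pre-built Julia sysimage: the work is done once at install time instead of on first use.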
        
               | reikonomusha wrote:
                | I gave a proof-of-concept argument as to why something
                | doesn't need to take as long straight out of the box,
                | with no customization. Python is doing it sub-1s. You
                | can also include a non-optimizing interpreter. My point
                | is that being a general-purpose language doesn't
                | inherently limit you in any way; it comes down to one's
                | choice of implementation strategy.
               | 
               | Another strategy: when a user installs Julia, they select
                | "fast-loading" libraries. You'd be surprised how small
               | changes in UI/UX make huge perceived differences in
               | quality and performance. I bet "Julia can already do
               | this" too, but nobody does it because it's not idiomatic
               | and it's not recommended up front.
               | 
                | At the end of the day, people don't complain about
                | Python or MATLAB as much because they feel nicer. If
                | they feel nicer for some reason other than absolute
                | time, then they're doing something about UX that Julia
                | is not, because everybody really does feel that Julia
                | is extremely sluggish to use.
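The "Python is doing it sub-1s" claim is easy to check: time how long a cold interpreter process takes to start and import a library. A rough sketch, using the stdlib `json` module as a stand-in for a plotting library (the exact package is beside the point, and whatever is installed locally will vary):

```python
import subprocess
import sys
import time

# Time a fresh interpreter starting up and importing a module --
# a rough proxy for "time to first plot" startup latency.
start = time.perf_counter()
subprocess.run([sys.executable, "-c", "import json"], check=True)
elapsed = time.perf_counter() - start

# On a warm filesystem cache this is typically well under a second.
print(f"{elapsed:.3f}s")
```

Running the child as `python -X importtime -c "import json"` instead would break the total down per-import, which is useful when hunting for the slow dependency.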
        
               | [deleted]
        
       | gugagore wrote:
       | It is amazing and frustrating to me how much latency affects my
       | productivity. I wish I could more effortlessly switch between
       | tasks, or just meditate and relax while I wait for something I
       | just did on the REPL to finish. But I don't. More often than not,
       | a 30-second delay to e.g. plot something destroys my ability to
       | stay in a productive zone.
       | 
       | I have been using Julia 1.6 since the release, and I'm so
       | grateful not only that some computations run a bit faster, but
       | that the interactivity is so improved.
       | 
       | Even seeing a progress bar can help me stay focused, because it
       | can be fun to watch (parallel precompilation is especially fun).
       | When a command just hangs, I feel left in the dark about how much
       | boredom I'll have to endure.
        
         | ssivark wrote:
         | Very interesting UX observations on interactive programming.
         | 
          | Kinda like mirrors in waiting areas, I wonder whether
          | judicious logging about the compilation process (not a wall
          | of text, but just enough) would keep users engaged while
          | also educating them about the compilation happening in the
          | background. That would give users more of a sense of agency,
          | and also improve their mental models of how to structure
          | their code/activity for faster compilation.
        
           | thecupisblue wrote:
            | That actually helps. Speaking from years of experience
            | doing Android development, where in the early days (and
            | still on some projects) you would have 5+ minute rebuild
            | times, it's super annoying to check minor things. Having
            | more logs actually helped builds seem faster, even though
            | they weren't, but it also let you know where you were
            | stuck and why, which helps identify bottlenecks in build
            | processes.
           | 
            | Sometimes it would be enough to just google what the long
            | task does and realize "oh wait, I don't actually need that
            | step for my daily development" (e.g. resource crunching,
            | crash reporting initialization, etc.), or it would point
            | you in the direction of "why isn't this caching itself?"
           | 
            | Quite a useful thing, and it should at least be available
            | as an option. In a REPL it might be context noise, but it
            | should still be there behind a --verbose flag.
        
           | bobthepanda wrote:
           | This is the theory behind spinners over separate page loads.
        
       ___________________________________________________________________
       (page generated 2021-05-25 23:00 UTC)