[HN Gopher] Symbolics.jl: A Modern Computer Algebra System for a...
       ___________________________________________________________________
        
       Symbolics.jl: A Modern Computer Algebra System for a Modern
       Language
        
       Author : natemcintosh
       Score  : 293 points
       Date   : 2021-03-05 13:53 UTC (9 hours ago)
        
 (HTM) web link (discourse.julialang.org)
 (TXT) w3m dump (discourse.julialang.org)
        
       | rowanG077 wrote:
        | Computer algebra systems are incredibly useful. It's a shame
        | that yet another project is being started that isn't in C. The
        | world is really waiting for a library that can be used from
        | effectively all programming languages.
        
         | 3JPLW wrote:
          | Julia will be able to generate C-compatible .so/.dll
          | libraries in the not-so-distant future.
        
       | davnn wrote:
        | Really interesting work! I'm happy that people at MIT are
        | pushing Julia forward. In my opinion, Julia is incredibly
       | powerful once you understand the language. It often feels like
       | writing pseudocode that performs like C++. The ecosystem is still
       | pretty barebones compared to Python, but, while I'm using both
       | tools extensively, I begin to prefer Julia over Python more and
       | more.
        
         | dunefox wrote:
         | > The ecosystem is still pretty barebones compared to Python
         | 
         | There's help: https://github.com/JuliaPy/PyCall.jl
         | https://github.com/JuliaInterop/RCall.jl
         | 
         | Works like a charm.
        
           | swagonomixxx wrote:
            | While this is a decent interim solution, I wouldn't want to
            | call out to Python from Julia for anything running in
            | production. Good for prototyping at home, perhaps.
           | 
           | A lot of the "scientific" Python packages (NumPy, SciPy,
           | etc.) actually just call out to C libraries for the majority
           | of their computation. I imagine Julia can do that already, or
           | it can call out to something similar in the stdlib, so it
           | doesn't really need integration with these Python packages
           | other than just API familiarity. But is that worth the cost
           | in performance?
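            | (For reference, Julia's C interop needs no wrapper layer at
            | all; a minimal sketch using only the standard @ccall macro,
            | here calling libm's cos:)

```julia
# Julia's C FFI is built into the language: @ccall invokes a
# shared-library function directly, with no separate binding layer.
# This calls the C library's cos and compares it to Julia's own.
c_cos = @ccall cos(1.0::Cdouble)::Cdouble
println(c_cos ≈ cos(1.0))
```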
        
             | amkkma wrote:
             | People definitely have used PyCall in production.
             | 
              | Julia at this point covers everything in NumPy and SciPy,
              | and much more. For optimization, Bayesian stuff,
              | scientific computing, and the convergence of the above
              | with ML, it's far ahead: https://sciml.ai/
             | 
             | Even has relatively mature web frameworks
             | (https://github.com/GenieFramework/Genie.jl)
        
             | dunefox wrote:
              | I don't care much about production or performance; I care
              | about data analysis and machine/deep learning for NLP.
              | Whatever lets me use the best language and packages wins,
              | and so far that's Julia with the ecosystems and tools from
              | Python and R.
              | 
              | I'm quite certain that you could just import the
              | underlying packages directly, but this way is easiest,
              | especially since I'm not familiar with R.
        
           | davnn wrote:
            | That seems to work really well. I haven't had use cases
            | where I would prefer calling Python from Julia over using
            | Python directly.
            | 
            | Additionally, I think you also have to consider things like
            | documentation and tooling.
        
         | adenozine wrote:
          | How do you deal with multiple dispatch? It really doesn't
          | match up to any mental metaphor for me; I've always preferred
          | my bag-of-functions way of doing things. I've tried and tried
          | with Julia. Is there any good resource for how dispatch-based
          | programs are supposed to be built and thought about?
        
           | ddragon wrote:
            | Objects aren't bags-of-functions though (they have state,
            | inheritance, initializers/destructors, interfaces/abstract
            | classes, classes vs objects, and tons of other concepts and
            | patterns), and any complex program can become a large
            | hierarchical tree of classes and graph of objects that goes
            | way beyond a simple bag-of-functions. Even modules that are
            | almost literally bags-of-functions will quickly scale to
            | something more complex.
           | 
            | The point is that simple concepts are nice to explain to a
            | beginner, but what actually built your intuition for using
            | objects is the years and years of learning and experiencing
            | their benefits and pitfalls. With multiple dispatch it's
            | the same, but since few languages use it (and even fewer,
            | if any, push it everywhere like Julia does) most people
            | haven't gone through this process.
           | 
            | For me, when I'm using a function, I just consider it a
            | self-contained abstraction over its arguments. For example,
            | there are hundreds of implementations of sum (+), which in
            | practice I ignore: I only think about the concept of
            | addition, no matter what arguments I give, and I trust the
            | compiler/library to find the optimal implementation of the
            | concept or fail (meaning I have to write one myself). If
            | I'm writing a method (or function), I consider the
            | arguments to be whatever acts the way I need so that I can
            | implement the concept on them. For example, if I'm writing
            | a tensor sum, I just treat the arguments as n-dimensional
            | iterable arrays, implement assuming that, and declare for
            | the compiler when my method is applicable, without having
            | to care about all other implementations of sum. If anyone
            | needs a scalar sum, that person can implement it, and
            | through collaboration we all expand the concept of sum.
           | 
            | And the fact that whoever uses a function can abstract away
            | the implementation, and whoever writes a function can
            | abstract away the whole extension of the arguments (through
            | both duck typing and the compiler choosing the correct
            | implementation of the concept), means everything plays
            | along fine without either side having to deal with the
            | other's details.
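            | (A minimal sketch of that "expand the concept by adding a
            | method" workflow; the Shape/area names are illustrative,
            | not from any package:)

```julia
# One generic function (`area`, the concept) with a method per
# argument type; the compiler picks the applicable implementation.
abstract type Shape end
struct Circle <: Shape; r::Float64; end
struct Rect   <: Shape; w::Float64; h::Float64; end

area(c::Circle) = pi * c.r^2
area(r::Rect)   = r.w * r.h
# Composes generically: each element dispatches to its own method.
area(shapes::AbstractVector{<:Shape}) = sum(area, shapes)

area([Circle(1.0), Rect(2.0, 3.0)])  # pi + 6.0
```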
        
           | davnn wrote:
           | First and foremost you should have a general understanding of
           | type inference. You also have to understand the difference
           | between a function and a method (in Julia's terms), see:
           | Functions: https://docs.julialang.org/en/v1/manual/functions/
           | Methods: https://docs.julialang.org/en/v1/manual/methods/
           | 
            | Once that's understood, multiple dispatch is simply using
            | the inferred types of all of a function's arguments to
            | choose which method should be invoked.
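            | (A tiny sketch of the function/method distinction;
            | `describe` is a made-up example, not a real API:)

```julia
# One *function* `describe` with three *methods*; the runtime types
# of all arguments together select which method runs.
describe(x::Integer)          = "integer"
describe(x::AbstractString)   = "string"
describe(x::Integer, y::Real) = "integer and real"

describe(1)                # "integer"
describe("a")              # "string"
describe(1, 2.5)           # "integer and real"
length(methods(describe))  # 3: the function's method table
```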
        
             | StefanKarpinski wrote:
             | Maybe you're using the term loosely, but one definitely
             | shouldn't have to understand type inference to write
             | working Julia programs. Unlike static type systems like
              | Haskell or ML where inference is part of the spec, inference
             | in Julia is just an optimization and doesn't affect
             | behavior at all.
        
               | davnn wrote:
               | Relating to other programming languages, it was the first
               | term that came to mind. You have to know the type before
               | you can specialize a function, don't you?
               | 
                | I think it's a good mental model to constantly keep the
                | types in mind, but my experience has been that people
                | working exclusively in dynamically typed languages,
                | i.e. the majority of data scientists, don't share that
                | mental model.
        
           | SatvikBeri wrote:
           | I found this article by Chris Rackaukas to be pretty helpful:
           | https://www.stochasticlifestyle.com/type-dispatch-design-
           | pos...
        
           | DougBTX wrote:
            | Multiple dispatch is not too different from method overloading,
           | so you could start there for comparable examples, maybe in
           | programming languages you are more familiar with: https://en.
           | wikipedia.org/wiki/Function_overloading#Rules_in_...
           | 
           | Bag of functions isn't a bad way to think about it, even for
           | Julia. In some languages, only the name of the function
           | determines which function is called. In others, the number of
           | arguments is used too, so eg foo/1 and foo/2 may be different
           | functions. In Julia, the types matter too, so foo(::Int) and
           | foo(::String) are different functions ("methods" in Julia
           | terminology), and which is used is based on the type of the
           | argument, rather than the number.
           | 
            | That's where the magic in Julia happens: if you define a
            | function foo(x) without specifying any types, then the
            | specific functions that foo calls will only be determined
            | once the types of the arguments to foo are known. But once
            | they are known, that type information can ripple all the
            | way down, picking up the specific implementation for each
            | function depending on the actual types used.
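            | (A sketch of that "ripple down" effect; `double` is an
            | illustrative name, not from any library:)

```julia
# `double` declares no types; the compiler specializes it per call
# site once the argument types are known, picking the matching `+`.
double(x) = x + x

double(2)       # Int `+`
double(2.5)     # Float64 `+`
double([1, 2])  # Vector `+`, all from one untyped definition
```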
        
           | 3JPLW wrote:
           | I actually find it far more linguistic, that is, more akin to
           | natural languages.
           | 
           | In my view, it's not multiple dispatch per se that is the
           | bigger departure from traditional OOP, it's the fact that
           | methods are no longer contained in the classes. Julia draws a
           | separation between data (structs: the nouns) and behaviors
           | (functions: the verbs). Traditional OOP never really made
           | sense to me; why should each class define and own its own
           | methods? It feels far more sensible to me to just have global
           | behaviors that are well defined. Those verbs can sometimes
           | apply to _anything_ (leaning on more rudimentary operations
           | that you need to define; duck-typing), and sometimes they
           | require you to explicitly define your behavior, and sometimes
           | you just want to add an additional optimization that's
           | available in your particular situation.
           | 
           | Once you have that mental model down, multiple dispatch is
           | just how Julia chooses which method to call... and it's
           | really not much different from single-dispatch.
        
             | snicker7 wrote:
             | > why should each class define and own its own methods?
             | 
              | State mutations. That's it. By ensuring that your data
              | can only be mutated through your API, it can never get
              | "corrupted".
        
               | 3JPLW wrote:
               | Sure, there is a subset of behaviors for which this style
               | makes sense, but it's just as well supported by simply
               | defining your own functions alongside the struct.
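                | (A sketch of that style, with a made-up Counter module:
                | the module's functions are the only code touching the
                | struct's field, giving the same encapsulation:)

```julia
# Mutation is confined to the module's own functions, so the state
# can only change through this small API.
module Counter
    mutable struct State
        n::Int
    end
    State() = State(0)
    increment!(s::State) = (s.n += 1; s)
    value(s::State) = s.n
end

s = Counter.State()
Counter.increment!(s)
Counter.value(s)  # 1
```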
        
       | reikonomusha wrote:
       | CASes are one of the most difficult things to write and they're
       | never complete. There are always bugs, performance concerns,
       | anemic mathematical domains, etc. Every major free or commercial
       | CAS that's still in use is under active development to battle
       | these inadequacies. Dozens of CASes have come and gone in the
       | past 30 years, the majority of which have bitrotted or stopped
       | being maintained. And not a single CAS has reigned supreme as the
       | CAS that beats all CASes.
       | 
       | It's very exciting to see more work in CASes being done, but I
       | worry that "starting a CAS from scratch" isn't the right
       | approach. The Axiom [0, 1] project rightly identified that
       | building a general-purpose CAS for working practitioners of
       | computational mathematics is an effort requiring nearly
       | generational timespans [2], and that you _must_ have the right
       | language to describe mathematical objects and their
       | relationships. They had a literate programming policy, where
       | _all_ math code must be accompanied by publication-quality
       | documentation, precisely because it's so hard to build and
       | maintain these systems. Some of the greatest computational
       | discoveries and expositions came out of the development of Axiom,
       | like the richest and most complete implementation of the renowned
       | Risch algorithm for doing symbolic integrals.
       | 
       | Axiom fell into disuse for a variety of reasons, but from my
       | perspective, found new life in a fork called FriCAS [3, 4], which
       | is actively developed and allows a more "software engineer
       | friendly" approach to the development of the system. The code
       | they have is enormously complex and has mountains of knowledge
       | from foremost experts in computer algebra.
       | 
       | I really wish new computer algebra initiatives attempted in
       | earnest to make use of and extend Axiom/FriCAS so that we could
       | continue to build up our knowledge of this exceedingly delicate
       | and tricky subject without constantly starting from zero. Axiom
       | has a manual that is over 1,000 pages of dense mathematics and
       | that's _really_ hard to rebuild correctly.
       | 
       | (The only project I know who honestly tried to build upon and
       | subsequently extend CAS functionality is Sage [5], which builds
       | upon a plethora of existing open source general-purpose and
       | specialized computational math systems.)
       | 
       | [0]
       | https://en.m.wikipedia.org/wiki/Axiom_(computer_algebra_syst...
       | 
       | [1] http://www.axiom-developer.org/
       | 
       | [2] Quote from Axiom manual
       | (http://fricas.sourceforge.net/doc/book.pdf):
       | 
       | > With that in mind I've introduced the theme of the "30 year
       | horizon". We must invent the tools that support the Computational
       | Mathematician working 30 years from now. How will research be
       | done when every bit of mathematical knowledge is online and
       | instantly available? What happens when we scale Axiom by a factor
       | of 100, giving us 1.1 million domains? How can we integrate
       | theory with code? How will we integrate theorems and proofs of
       | the mathematics with space-time complexity proofs and running
       | code? What visualization tools are needed? How do we support the
       | conceptual structures and semantics of mathematics in effective
       | ways? How do we support results from the sciences? How do we
       | teach the next generation to be effective Computational
       | Mathematicians? The "30 year horizon" is much nearer than it
       | appears.
       | 
       | [3] https://en.m.wikipedia.org/wiki/FriCAS
       | 
       | [4] http://fricas.sourceforge.net/
       | 
       | [5] https://www.sagemath.org/
        
         | UncleOxidant wrote:
         | From looking at the FriCAS github:
         | 
         | "Additionally, FriCAS algebra library is written in a high
         | level strongly typed language (Spad), which allows natural
         | expression of mathematical algorithms."
         | 
         | One could argue that Julia also allows natural expression of
         | mathematical algorithms. Coupled with Julia features like
         | multiple dispatch, high performance (due to Julia's LLVM
         | backend) and growing ecosystem of AD and ML libraries, it seems
         | like Julia is probably the more "software engineer friendly"
         | approach at this point. It doesn't seem odd that the Julia folk
          | would want to implement their CAS in Julia. That's not to say
          | that bridges from Julia to FriCAS couldn't be built, as has
          | been done with both R and Python.
        
           | reikonomusha wrote:
           | Axiom, in its original commercial incarnation, took 20 years
           | to build. My comment meant to suggest not spending another 20
           | for the dubious promise of a bit of additional "software
           | engineer friendliness", which is why FriCAS forked Axiom
           | instead of building one from scratch.
        
           | ChrisRackauckas wrote:
           | We definitely can, should, and will interface with it.
            | There's no reason to throw away great work. Just like how
            | DifferentialEquations.jl always had a focus on pure Julia
            | solvers, it made sure to wrap every solver it could.
            | Building such wrappers is required for good research
            | anyway; it's needed both for timing and for measuring
            | efficiency in other ways (like amount of simplification or
            | percentage of integrals solved).
        
       | cb321 wrote:
       | There is also FriCAS: https://fricas.github.io/
        
       | [deleted]
        
       | metreo wrote:
        | I've always found computer algebra systems a particularly
        | fascinating niche in computing; implementations in accessible
        | languages like Julia are doing a real service to enthusiasts
        | and learners everywhere!
        
       | UncleOxidant wrote:
        | I won't pretend to understand all of this, but from what I can
        | tell the Julia ecosystem is about to be light years ahead of
        | anything else out there. It's an amazing community at the
        | intersection of programming language theory and numerics/math.
        | Exciting to see what's going to emerge.
        
         | wiz21c wrote:
          | I've used Julia a bit and I fail to see that "light years
          | ahead" point. Julia is certainly a cool language to work
          | with, with a very nice type system and a quite smart
          | compiler, but the stuff I've used still feels like the kind
          | of meta-programming Python does, except here it's
          | JIT-compiled properly. But maybe my experience is so limited
          | that I don't see the interesting bits...
        
           | amkkma wrote:
           | Were you doing anything with:
           | 
            | 1. custom units
            | 2. custom GPU kernels
            | 3. custom array types
            | 4. custom Bayesian priors
            | 5. AD through custom types
            | 6. task-based parallelism
            | 7. symbolic gradients with ModelingToolkit
            | 8. agent-based modeling
            | 9. physics-informed neural networks
            | 10. abstract table types
            | ...
           | 
           | or various combinations of the above?
        
             | glial wrote:
              | Is Julia better for these use-cases? I'd be interested
              | and would love some links/examples.
        
               | UncleOxidant wrote:
               | > 2. custom GPU kernels
               | 
               | You can write them in Julia, whereas in Python you have
               | to write them in C/C++ and then use the FFI to call them.
        
               | helgie wrote:
               | You can write them in python using numba
        
             | WanderPanda wrote:
              | I think the biggest thing that is "light years ahead" is
              | that numpy-style arrays are built right in and insanely
              | extensible. Libraries can be very lightweight and
              | maintainable because they don't need to roll their own
              | numpy the way tensorflow, pytorch, etc. do (which makes
              | them depend on bigcorps to improve and maintain). In
              | Julia, somehow everything is compatible and composable.
              | On the other hand, counting from 1 is quite a big issue
              | for me and leads to constant errors when I switch between
              | Python and Julia. Also, the startup time is a huge
              | bummer. They should have an interpreted mode for
              | debugging (as long as the debugger is unusable for e.g.
              | breakpoints).
        
               | UncleOxidant wrote:
               | I haven't tried it yet as it's still a release candidate,
               | but startup time is said to be much improved in v1.6.0.
        
               | adgjlsfhk1 wrote:
                | I think the key is less that it has good
                | multi-dimensional arrays built in, and more that
                | multiple dispatch makes Julia more composable than
                | Python. For an example of this, consider that
                | BandedMatrices is a package that can be used in
                | conjunction with any library that expects a Matrix-like
                | object, despite the fact that most of them weren't
                | designed to do so.
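                | (Not BandedMatrices itself, but the same composability
                | can be sketched with stdlib LinearAlgebra types:
                | generic code written against AbstractMatrix works with
                | structured matrices it was never designed for:)

```julia
using LinearAlgebra

# A generic function written only against the AbstractMatrix
# interface, with no knowledge of any concrete matrix type.
rowsums(A::AbstractMatrix) = [sum(A[i, :]) for i in 1:size(A, 1)]

rowsums(Diagonal([1, 2, 3]))    # [1, 2, 3]
rowsums(Symmetric([1 2; 2 5]))  # [3, 7]
```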
        
       | hcarvalhoalves wrote:
        | Computer algebra is criminally underused - it has the potential
        | to make math-heavy critical-path code a lot clearer, more
        | testable, more observable, and bug-free by design.
       | 
        | I feel I need to mention "Structure and Interpretation of
        | Classical Mechanics" by Wisdom and Sussman [1], and the
        | accompanying "scmutils" library, which first implemented many
        | of the same features in Scheme, although this Julia library
        | seems to be more complete.
       | 
        | There's also a great one-to-one port to Clojure by Colin Smith
        | [2], in case you want to use it in a more production-friendly
        | environment. The Strange Loop talk [3] is a good showcase of
        | the power and simplicity of these kinds of systems.
       | 
       | [1] https://groups.csail.mit.edu/mac/users/gjs/6946/
       | 
       | [2] https://github.com/sicmutils/sicmutils
       | 
       | [3]
       | https://www.youtube.com/watch?v=7PoajCqNKpg&ab_channel=Cloju...
        
         | linspace wrote:
          | I was amazed the first time I used Mathematica. I have since
          | used Maxima professionally to compute some Taylor series, and
          | also SymPy for some hobby projects. I found SymPy less
          | powerful than Maxima (not to mention Mathematica), but the
          | ability to integrate it with the rest of the program is
          | wonderful.
        
         | benrbray wrote:
         | > "Structure and Interpretation of Classical Mechanics" by
         | Wisdom and Sussman
         | 
         | I come across this book every so often and find it really tough
         | to read due to the complete lack of "types" in any of the code.
         | Math, especially physics, relies heavily on units and function
         | signatures to be understood.
        
         | sritchie wrote:
         | SICMUtils co-author here, if anyone has any questions on the
         | Clojure port.
         | 
         | One beautiful thing about a Clojure computer algebra system is
         | that it can run completely in the browser. This includes
         | automatic differentiation, numerical integration, all of the
         | hardcore Lagrangian and Hamiltonian mechanics work,
         | differential geometry... it is startling stuff.
         | 
         | For example, here's a(n interactive!) derivation of Kepler's
         | Third Law in the browser (thanks to Nextjournal's lovely
         | integration), if anyone wants to play:
         | https://nextjournal.com/try/sicm/ex-1-11
         | 
         | Many more exercises live here: https://nextjournal.com/sicm/
        
           | ChrisRackauckas wrote:
            | Yes, we found these, and this (along with Mathematica) was
            | the impetus for building automated LaTeXification into
            | Symbolics.jl. Here, for example, is a teaching notebook
            | used in
           | Alan Edelman's MIT Computational Thinking course where
           | Symbolics.jl is used to visualize the numerical methods as
           | they iterate, and all of the outputs convert to LaTeX:
           | 
           | https://computationalthinking.mit.edu/Spring21/newton_method.
           | ..
           | 
           | Thanks for the ideas!
        
             | philzook wrote:
             | Those are gorgeous. How were they done? Using Julia
             | packages?
        
               | ChrisRackauckas wrote:
               | This is a Pluto notebook over Symbolics.jl and
               | ForwardDiff. All of the packages used are at the top of
               | the page. You can click the edit button on the top right
               | to open it up.
        
         | tonyarkles wrote:
         | Using Maxima in my day-to-day work has been a complete game
         | changer. I use it via org-mode and use the `tex()` command to
         | have it output TeXified results. These automatically get
         | formatted into beautiful readable equations.
        
           | hcarvalhoalves wrote:
           | Cool! I managed to do something similar with org-mode +
           | sicmutils, works quite well.
           | 
           | Render (pt_BR): https://github.com/hcarvalhoalves/math-fin-
           | training/blob/mas... Source:
           | https://raw.githubusercontent.com/hcarvalhoalves/math-fin-
           | tr...
           | 
           | I would love to see how you're using org-mode for that if
           | possible :)
        
       | max_streese wrote:
       | Could someone explain to me what the difference between a
       | computer algebra system like Symbolics.jl and a theorem prover
       | like Coq is?
       | 
       | Is that more in the nuances or is there a fundamental difference
       | between these two (referring to the terms and not their specific
       | implementations in Symbolics.jl and Coq respectively)?
       | 
       | Or is this question unreasonable to ask in the first place?
        
         | reikonomusha wrote:
         | Computer algebra systems are usually large, heuristic systems
         | for doing algebraic manipulation of symbolic expressions by
         | computer. Roughly, they're there to aid a human in doing
         | mechanical algebra by working with symbols and not just
         | numbers. Generally, the results coming out of a CAS are not
         | considered to be "proved correct", and ought to be verified by
         | the programmer/user.
         | 
         | Proof assistants aim to allow one to write down a mathematics
         | assertion in a precise manner, and to help the user write a
         | formally verifiable proof for that theorem.
         | 
         | Extremely crudely, a CAS is like a super-powered calculator,
         | while a proof assistant is like a super-powered unit test
         | framework.
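          | (The "super-powered calculator" side can be sketched with
          | Symbolics.jl's documented basics; note the results are
          | heuristic rewrites, not verified proofs:)

```julia
using Symbolics

# Symbolic manipulation: the system rewrites expressions, but
# nothing here is formally proved correct.
@variables x y
ex = expand((x + y)^2)                # x^2 + 2x*y + y^2
D  = Differential(x)
dx = expand_derivatives(D(x^2 + 3x))  # 2x + 3
```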
        
           | davnn wrote:
           | The intersection is also interesting when you look at
           | equational proofs in Mathematica.
           | 
           | https://reference.wolfram.com/language/ref/FindEquationalPro.
           | ..
        
       | meta2meta wrote:
       | Rackauckas has done a lot of work to create the whole scientific
       | machine learning (SciML) ecosystem (with collaborators).
       | 
        | Symbolics.jl is the next step toward a universal CAS tool in
        | Julia, bridging the gap between symbolic manipulation and
        | numerical computation.
       | 
        | It is especially useful as a foundation for the equation-based
        | simulation package ModelingToolkit.jl. In the foreseeable
        | future, I expect ModelingToolkit.jl to be comparable with the
        | Modelica ecosystem, providing the Julia ecosystem with fast,
        | accurate modeling and simulation capability and easy
        | integration with machine learning methods, which is crucial for
        | engineering applications and scientific research.
        
         | ChrisRackauckas wrote:
         | Thanks! The tutorials on the Modelica-like features of
         | ModelingToolkit.jl are just starting to roll out. For example:
         | https://mtk.sciml.ai/dev/tutorials/acausal_components/ .
         | Indeed, there's a lot to do in this space, but we already have
         | some pretty big improvements that we'll start writing down and
         | share (hopefully) at JuliaCon 2021!
        
           | meta2meta wrote:
            | Thank you for your hard work on the SciML ecosystem and
            | community, Rackauckas!
            | 
            | The new Modelica-like usage of ModelingToolkit.jl is really
            | a game changer for acausal DAE system simulation in Julia.
            | It makes composable, hierarchical, component-based
            | simulation of large, complex systems possible with pure
            | Julia.
           | 
            | Combined with full-featured dynamic simulation capability
            | and the machine learning ecosystem (Flux.jl etc.), Julia
            | will be a perfect choice for reinforcement learning
            | research, where the simulation and training process can all
            | be implemented in pure Julia with promising performance. It
            | delivers on Julia's promise of solving the two-language
            | problem.
        
           | vsskanth wrote:
            | Hi Chris, I am a very heavy Modelica user and having an
            | equivalent system in Julia is very welcome. Using SciML's
            | solvers in our models would be a killer feature. Obviously,
            | I am tempted to contrast ModelingToolkit.jl with Modelica,
            | and I have some questions, if you don't mind:
           | 
            | Could you elaborate on the design choice to model
            | components as functions as opposed to objects? Functions
            | seem to compose well from your example, but what about
            | initial conditions and type safety? In Modelica, each
            | component is an object where you can specify initial
            | conditions with constraints, and you can eventually build a
            | big system; if it compiles, you can be reasonably confident
            | you aren't mismatching units, missing outputs, etc.
           | 
            | Do you have any plans for graphical representation? Some of
            | the systems I work on in motorsports are absolutely
            | massive, with 150k equations, and having a diagram to see
            | the connections is really helpful. An auto-generated one
            | from code would be more than good enough.
           | 
            | How do you handle FFI and interaction with other Julia
            | objects inside ModelingToolkit.jl, since it requires
            | symbolic reduction?
           | 
           | The FMI standard is a very popular export standard for these
           | models. Any plans to support it here ?
           | 
            | I understand these are early days and I am very excited to
            | know there's more in the pipeline. Thanks for your
            | contribution.
        
             | ChrisRackauckas wrote:
             | Most of the answers to this will be in a big JuliaCon talk
             | this summer. But I'll give a few hints.
             | 
              | >Could you elaborate on the design choice to model
              | components as functions as opposed to objects? Functions
              | seem to compose well from your example, but what about
              | initial conditions and type safety? In Modelica, each
              | component is an object where you can specify initial
              | conditions with constraints, and you can eventually build
              | a big system; if it compiles, you can be reasonably
              | confident you aren't mismatching units, missing outputs,
              | etc.
             | 
             | There are tools for type verification and all of that. The
             | decision comes from how it interacts with the compiler. You
             | can easily redefine functions and create them on the fly,
             | less so for structs. That turns out to make features like
             | inheritance easier to express in a functional workflow.
             | Symbolic computing seems to work very well in this setup.
             | 
             | >Do you have any plans for graphical representation? Some
             | of the systems I work on in motorsport are absolutely
             | massive, with 150k equations, and having a diagram to see
             | the connections is really helpful. An auto-generated one
             | from code would be more than good enough.
             | 
             | Somewhat. Auto-generated ones from code already exist in
             | the Catalyst.jl extension library
             | (https://catalyst.sciml.ai/dev/). That kind of graphing
             | will get added to MTK: we just added the dependency graph
             | tooling to allow it to happen, so it's just waiting for
             | someone to care.
             | 
             | A full GUI? There's stuff we're thinking about. More at
             | JuliaCon.
             | 
             | >How do you handle FFI and interaction with other Julia
             | objects inside ModelingToolkit.jl, since it requires
             | symbolic reduction?
             | 
             | All of the functions are Julia functions, and you can
             | easily extend the system by registering new Julia functions
             | to be "nodes" that are not traced. See
             | https://symbolics.juliasymbolics.org/dev/manual/functions/#R...
             | So if you do
             | `f(x,y) = 2x^2 + y`, then it will eagerly expand f in your
             | equations. If you do `@register f` (and optionally add
             | derivative rules), then it'll keep it as a node in the
             | graph and put its function calls into the generated code.
             | This is how we're doing FFI for media libraries in a new
             | HVAC model we're building.
             | 
             | >The FMI standard is a very popular export standard for
             | these models. Any plans to support it here?
             | 
             | There is a way to read in FMI that will be explained much
             | more at JuliaCon, with some details probably shared
             | earlier. It's somewhat complex so I'll just wait till the
             | demos are together, but yes we have examples with FMU
             | inputs already working.
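The eager-tracing versus registered-node distinction described in this answer can be sketched roughly as follows. This is a sketch against the `@register` macro as used in the comment (later Symbolics.jl releases renamed it `@register_symbolic`), so the exact syntax may vary by version:

```julia
using Symbolics

@variables x y

# An ordinary Julia function is traced eagerly: calling it on
# symbolic variables expands its body into the expression tree.
f(a, b) = 2a^2 + b
ex_traced = f(x, y)     # becomes the expression 2x^2 + y

# A registered function stays opaque: it is kept as a single node
# in the graph, and generated code emits a literal call to it.
# This is what makes FFI wrappers usable inside symbolic models.
g(a, b) = 2a^2 + b      # imagine this wrapped a C library call
@register g(a, b)
ex_opaque = g(x, y)     # kept as the call g(x, y)
```

Derivative rules for the registered function can then be supplied separately, so symbolic differentiation can still pass through the opaque node.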
        
       | centimeter wrote:
       | One problem with CASes in general is that they either make
       | restrictive assumptions about the types of algebraic objects, or
       | they require you to provide extremely detailed type information
       | yourself, beyond what most people are capable of explicitly
       | expressing. People make so many ambiguities and domain-specific
       | assumptions when they perform algebraic manipulation that only a
       | very small fraction of them are actually equipped to express
       | those assumptions formally. This is especially problematic if
       | you mix multiple kinds of algebraic extensions - for example,
       | I've had a hard time getting CASes to correctly deal with
       | functions over tensors over Clifford (sub)algebras.
       | 
       | I think the only way you could get something like that to work
       | would be a type system so aggressive that it would turn off most
       | mathematicians, who tend to have a narrow understanding of type
       | theory.
        
         | reikonomusha wrote:
         | I think Axiom/FriCAS thought about this problem hard and mostly
         | solved it.
         | 
         | The dominant paradigm in most popular CASes is to assume
         | everything is a real number and see how far you get. Later many
         | major CASes bolted on features to say "well, except, this might
         | be a complex number" and the like. This of course flies right
         | in the face of the kind of thing you want to do. Axiom didn't
         | take that approach and instead created a robust system for
         | specifying and implementing mathematical objects.
         | 
         | I tried doing quaternion algebra in Maxima many moons ago and
         | it was painful. They've since added packages for doing Clifford
         | algebra but it's not exactly well integrated in the rest of the
         | machinery.
        
           | amkkma wrote:
           | Symbolics.jl is keeping things generic from the get-go.
        
         | amkkma wrote:
         | Julia solves this by using interfaces (functions defined over
         | one or more custom types that abstract over concrete
         | information).
         | 
         | Just overload those on your type, and voila, it works with
         | the CAS.
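That interface idea can be sketched in plain Julia with a toy type; the `Dual` struct and `poly` function below are illustrative inventions, not part of Symbolics.jl:

```julia
# A toy dual number a + b*eps with eps^2 = 0: overloading + and *
# is the whole "interface" it needs to implement.
struct Dual
    a::Float64
    b::Float64
end

Base.:+(x::Dual, y::Dual) = Dual(x.a + y.a, x.b + y.b)
Base.:*(x::Dual, y::Dual) = Dual(x.a * y.a, x.a * y.b + x.b * y.a)

# Generic code written with no knowledge of Dual:
poly(x) = x * x + x

# Evaluating at Dual(3, 1) yields poly(3) = 12 in the first slot
# and the derivative poly'(3) = 7 in the second, for free.
poly(Dual(3.0, 1.0))    # Dual(12.0, 7.0)
```

The same mechanism is how a symbolic type can slot into generic numerical code: overload the handful of functions the code calls, and the generic code never needs to change.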
        
           | centimeter wrote:
           | I'm skeptical that duck typing can really be considered a
           | full "solution", but at the very least it seems like a good
           | bet.
        
       | cobaltoxide wrote:
       | Can I use this now? Is there a getting-started guide or gallery
       | of examples?
        
         | ChrisRackauckas wrote:
         | Start with the first tutorial:
         | https://symbolics.juliasymbolics.org/dev/tutorials/symbolic_...
        
       | Roark66 wrote:
       | Is there a tutorial somewhere for absolute beginners with some
       | examples of use? I used Octave in the past and pandas/NumPy
       | before, but reading the documentation linked from the GitHub
       | site, I have no idea how this can be used.
        
         | ChrisRackauckas wrote:
         | Nice to see the interest!
         | https://symbolics.juliasymbolics.org/dev/tutorials/symbolic_...
         | is a tutorial that incorporates some nice features like
         | building parallel sparse matrix code from symbolic arithmetic.
        
       | junippor wrote:
       | I'll make one (probably unpopular) comment.
       | 
       | On HN every time the subject of language wars or platform wars or
       | browser wars comes up, the idea also comes up that "everything is
       | about the number of users". The self-reinforcing cycle is that
       | the language/platform/browser with the most users also has the
       | most development time put into it, which then attracts more
       | users, etc.
       | 
       | Fine, I don't deny that the phenomenon exists. But I think that
       | it's often overlooked that it's not just about the number of
       | users. It's also about the quality of the users. If we could
       | quantify "an excellent developer", I wouldn't claim that Julia
       | has more of those than Python. But I'm convinced that Julia has
       | more "excellent developers who are also excellent at numerics"
       | than Python. I think the idea that productivity as a function
       | of "developer excellence percentile" is a power law applies even
       | more strongly in multi-domain expertise situations, like
       | numerical computing. So forget about 100x coders. The
       | contributions of some people like Chris et al. are closer to
       | 10_000x as significant as those of an ok contributor.
       | 
       | It's not just about quantity; it's also about quality.
        
         | ihnorton wrote:
         | I won't comment on the relative numbers because there are top-
         | notch developers in many language communities.
         | 
         | I think the more important point is that Julia has attracted
         | _enough_ first-rate people to self-sustainably build out an
         | ecosystem -- and even more keep joining. Several aspects of
         | Julia's design and core tooling interact to provide compounding
         | leverage to this group. I think it's a similar situation to the
         | development of the NumPy ecosystem where standardizing on a
         | common array data structure and API led to an explosion of
         | interoperable libraries. Julia arguably takes that a step
         | further by allowing any code to interoperate with high
         | performance and fluent APIs. Julia's performance
         | characteristics also reduce the barrier to entry because people
         | can make deep contributions throughout the ecosystem without
         | needing to develop low-level programming expertise on top of
         | their domain-specific knowledge.
        
       | Robotbeat wrote:
       | This is very cool. Computer algebra systems are like a
       | superpower. I've been able to use the TI-89's arbitrary precision to
       | do pretty sophisticated cryptography work. Symbolics.jl has the
       | ability to modify some of the rules of the algebra, which should
       | allow even easier implementation of number theory related
       | cryptographic concepts (as well as compression and error
       | correcting codes... or quaternions or bra-ket quantum mechanics)
       | without needing specific libraries. I love this as I'm trying to
       | teach myself the fundamental concepts in mathematical terms and
       | not just something in a specialized black box library. (And
       | without paying the insanely high price of Mathematica if I ever
       | want to use it professionally.)
       | 
       | I've looked briefly into Julia in the past, but if stuff like
       | this becomes pretty standard (in the way numpy is for Python), I
       | think I could become pretty comfortable in Julia.
        
       | konjin wrote:
       | After playing around with them for a while, types seem like the
       | wrong paradigm for symbolic work.
       | 
       | Term rewriting systems based on rules capture most mathematical
       | manipulation far better than ones based on functions and classes.
       | 
       | Differentiation was a breeze to solve when I sat down and figured
       | out how to formulate it so everything had an understandable
       | normal form. Then I could feed the output of that into another
       | algorithm that brought simple algebraic equations into a normal
       | form.
       | 
       | It's not a way I've seen much programming done outside of niches
       | in academia, but it was extremely powerful.
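For what it's worth, this rule-based paradigm is exactly what SymbolicUtils.jl (the rewriting layer underneath Symbolics.jl) exposes. A minimal sketch, assuming the `@rule`/`@syms` API from its documentation; the double-angle rule is just an example:

```julia
using SymbolicUtils

@syms w z

# ~x is a pattern slot that matches any subterm, so this single
# rule captures the double-angle identity.
r = @rule sin(2 * ~x) => 2 * sin(~x) * cos(~x)

r(sin(2z))   # matches, rewriting to 2sin(z)cos(z)
r(sin(w))    # no match: a rule returns nothing when it fails
```

Chained and recursively applied rule sets (e.g. via the `Chain` and `Postwalk` combinators in `SymbolicUtils.Rewriters`) are how full simplifiers get built from such pieces.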
        
         | [deleted]
        
         | eigenspace wrote:
         | I disagree with this point of view. Fundamentally, types appear
         | all over the place in math.
         | 
         | If I give you a term                   x * y - y * x
         | 
         | and ask you to simplify it, you may tell me this term is just
         | 0, but if I had intended for x and y to represent quaternions
         | or square matrices, that'd be a very foolish thing to do!
         | 
         | The rulesets that apply to a term depend on the types of the
         | elements of that term. You could bundle this information into
         | an `Assumptions` object that carries around some rules like
         | commutes_under_multiplication(x) => false
         | commutes_under_multiplication(y) => false
         | 
         | but imo, leveraging types is a very natural way to carry
         | around this particular class of metadata.
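The point can be made concrete in plain Julia, no CAS required: the rewrite x * y - y * x => 0 is valid for scalars but not for square matrices, so a simplifier needs type information to know whether it may apply:

```julia
# For commuting scalars the commutator vanishes...
x, y = 2.0, 3.0
@assert x * y - y * x == 0

# ...but for square matrices (a noncommutative ring) it generally
# does not, so the rewrite is only safe when the operand types are
# known to commute.
X = [0 1; 0 0]
Y = [0 0; 1 0]
X * Y - Y * X    # [1 0; 0 -1], not the zero matrix
```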
        
       ___________________________________________________________________
       (page generated 2021-03-05 23:00 UTC)