[HN Gopher] Python Programming and Numerical Methods: A Guide fo...
       ___________________________________________________________________
        
       Python Programming and Numerical Methods: A Guide for Engineers and
       Scientists
        
       Author : happy-go-lucky
       Score  : 196 points
       Date   : 2021-02-17 10:32 UTC (10 hours ago)
        
 (HTM) web link (pythonnumericalmethods.berkeley.edu)
 (TXT) w3m dump (pythonnumericalmethods.berkeley.edu)
        
       | amitport wrote:
        | This looks like a great resource, thanks!
       | 
        | Note that in practice you can teach part II (numerical methods)
        | without 90% of part I (Python programming), given that your
        | audience has some minimal programming experience (not
        | necessarily with Python).
        
       | physicsguy wrote:
        | Not sure if this is more of a difference between UK/US
        | universities, as this says it's targeted at "serious
        | undergraduates or graduate students", but we covered all of this
        | material in 1st- and 2nd-year undergraduate courses at the
        | universities I've studied or taught at here.
        
         | analog31 wrote:
         | I was a math + physics major. "Numerical analysis" was one of
         | the math courses, but you could also get CS credit for it. We
         | also used numerical methods in our physics courses. The CS
         | curriculum at the university near my house requires two math
         | courses beyond basic calculus, but does not require
         | differential equations.
         | 
          | I work at a place that has a mid-sized engineering department.
         | The people who regularly handle math related programming tasks
         | tend to have degrees in the physical sciences, and
         | coincidentally are older than 50.
        
         | BeetleB wrote:
         | In the US, it is generally expected that incoming students have
         | no exposure to calculus.
         | 
         | Of course, many students do study calculus in high school and
         | can sometimes skip these courses in university, but the
         | curriculum assumes they haven't.
        
         | jrumbut wrote:
         | I was confused reading the table of contents until I got to the
         | final third which matched what I was hoping to see, and what I
         | think would be of interest to their target audience.
         | 
         | My understanding and limited experience is that US universities
         | focus more on software engineering and theoretical computer
         | science, with numerical methods (even very basic ones) left as
         | a later elective/graduate course.
        
         | mumblemumble wrote:
         | My graduate program was interdisciplinary, and attracted
         | students with a variety of backgrounds, including social
         | sciences, math, and CS.
         | 
         | So there was a huge need to have a survey class like this to
         | get everyone to a common baseline level of knowledge. Some knew
         | math but not programming, others knew programming but not math,
         | and still others knew a little programming and a little math,
         | but not quite enough of either.
        
         | dataflow wrote:
         | It matters what you're majoring in and focusing on. There's
         | only so much time to teach so much material in CS, for example,
         | and there's not much desire from the faculty to teach numerical
         | diff eq solving (and certainly none from most CS students to
         | learn it). The ones that do go to grad school tend to need
         | these more, but they haven't seen them in undergrad so they end
         | up learning what they're lacking in grad school (or late
         | undergrad).
         | 
         | That said, do you guys really learn/teach the shooting method,
         | Lagrange interpolation, predictor-corrector methods, QR
         | factorization, etc. as 1st/2nd-year undergrads? For what
         | majors? I feel like this would come at the cost of deferring
         | _other_ topics to later years.
        
           | the_svd_doctor wrote:
            | I learned those (not QR, but ODE methods, basic numerical
            | PDE, non-linear equation solving methods, interpolation,
            | etc.) in the first semester of my 2nd year of my engineering
            | bachelor's. In Belgium.
        
           | capeterson wrote:
            | I'm about to graduate from a 4-year university in CA (edit:
            | CS major), and we are taught a fairly in-depth numerical
            | methods class as part of our 3000 series of classes (i.e.
            | it's intended for third-year students).
            | 
            | It's certainly possible to take it in your second year where
            | I am, but usually students opt for other classes that will
            | help them get an internship before they go to numerical
            | methods, so it's commonly a third-year class IMO. It doesn't
            | normally get pushed off further than that though, since it's
            | a common prerequisite for fourth-year classes.
        
           | physicsguy wrote:
            | We don't generally have majors/minors in the UK, so perhaps
            | this is part of it. You do a degree in, e.g.,
            | Maths/Physics/Mech Eng, and all of your courses are in that.
            | 
            | Yes, we also covered other stuff like splines and the finite
            | difference method. I think Finite Element was third year
            | where I last taught.
        
             | dataflow wrote:
              | > You do a degree in, e.g., Maths/Physics/Mech Eng, and
              | all of your courses are in that.
             | 
             | "Major" refers to the same things you mentioned (physics,
             | math, EE, etc.), it's not any different in the US.
             | 
              | > Yes, we also covered other stuff like splines and the
              | finite difference method. I think Finite Element was third
              | year where I last taught.
             | 
             | Sorry to repeat the question, but again -- in what major
             | (or "for what degree" in your terminology) did you study
             | this? (CS? MechE? Physics? EE?)
        
               | physicsguy wrote:
               | My background is physics, but most recently was a TA for
               | classes for people in Aero/Mech/Astronautical Engineering
               | (courses were shared between all three programmes) doing
               | this stuff.
               | 
               | In that department, beyond the courses covering this
               | stuff which were compulsory, there were additional
               | optional courses for 3rd and 4th year students to do C
               | programming and more advanced numerical methods and stuff
               | like programming paradigms for HPC.
        
               | dataflow wrote:
                | Yeah, so I think this clears up why. I don't think that
                | "serious undergraduates or graduate students" statement
                | was aimed at physics folks, but more at CS (and possibly
                | EE) folks, who would also normally learn (say)
                | programming paradigms, algorithms, data structures, low-
                | level programming like C, etc. in their first 1-2 years,
                | topics I'm guessing you wouldn't be covering as
                | physicists. For students in CS (or to some extent EE or
                | nearby fields) to go in this direction in their first
                | 1-2 years, they'd need an unusual amount of interest in
                | numerical techniques to defer other basic topics to
                | later years (or just overload on technical courses and
                | defer humanities to later).
        
       | sireat wrote:
       | This looks like a nice introductory text (for non-programmers).
       | 
        | Is there a GitHub link to the notebooks?
       | 
       | I've been teaching Python to adult non-programmers for a few
       | years now.
       | 
        | One issue is the order in which to teach things.
       | 
       | What do you teach after you teach variables and basic data types?
       | 
       | There is no one perfect solution.
       | 
       | This particular book teaches dictionaries, lists, sets (and NumPy
       | arrays!) before branching and iteration.
       | 
       | Good news - the examples in this book do not use L or list as
       | variable names, which even some seemingly respectable books do.
       | 
        | Personally I feel that branching and iteration (over strings)
        | should come before progressing to more complex data types.
        | 
        | If you learn about lists before you know about iteration, you
        | can't really do much with them.
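        | 
        | For example, once students know iteration, a list immediately
        | becomes useful. A minimal sketch:
        | 
        |     grades = [72, 85, 90, 66]
        |     total = 0
        |     for g in grades:  # iteration is what makes the list useful
        |         total += g
        |     print(total / len(grades))  # average: 78.25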
        
       | gostsamo wrote:
       | Time to first "use julia instead" comment: 3, 2, 1...
        
         | eternauta3k wrote:
         | You'll have to wait a bit longer, the interpreter is loading :)
        
         | [deleted]
        
         | cambalache wrote:
          | Now that people are "writing games and web servers" in Julia
          | (per some commenters) and the use-case list for Rust grows
          | exponentially, I cannot wait for the most epic of collisions,
          | Julia vs Rust; astronomers must be rubbing their hands for
          | this one.
        
           | gostsamo wrote:
           | It is funny, because those kinds of arguments are totally
           | meaningless. Languages are tools and a true craftsman would
           | master many of them and use them when they are the right tool
           | for the job. Instead, people who know only one tool try to
           | convince themselves and everyone else that the tool they have
           | is the only one they need and thus have it all backward.
        
         | Mauricebranagh wrote:
          | If it's a pure numerical problem, just use Fortran and buy the
          | NAG libraries.
          | 
          | Oooh, just checked, and NAG does Python libraries now as well,
          | so that might be the best solution.
        
           | gostsamo wrote:
           | I'm actually a python developer, but I might be the only one
           | appreciating the identity investment that people on both
           | sides put in their chosen language.
        
       | hahahahe wrote:
       | I would like to see a polished Python package for data, math, and
       | science, similar to Mathematica. Or perhaps Wolfram can start
       | supporting Python? I'd actually use that over Pandas/Conda.
        
         | leephillips wrote:
         | Do you know about Sage? Or are you thinking about something
         | different?
        
         | gatestore wrote:
         | It is now possible to use Python inside Mathematica. Take a
         | look at:
         | 
         | https://blog.wolfram.com/2019/05/16/announcing-the-wolfram-c...
         | https://reference.wolfram.com/language/WolframClientForPytho...
         | 
         | It is also possible to run the Wolfram Kernel inside a Jupyter
         | Notebook:
         | https://github.com/WolframResearch/WolframLanguageForJupyter
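          | 
          | For the client library direction, the Python side looks
          | roughly like this (a sketch assuming a local Wolfram Engine
          | is installed):
          | 
          |     from wolframclient.evaluation import WolframLanguageSession
          |     from wolframclient.language import wlexpr
          | 
          |     session = WolframLanguageSession()  # starts a local kernel
          |     print(session.evaluate(wlexpr("Integrate[x^2, x]")))
          |     session.terminate()  # shut the kernel down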
        
       | musingsole wrote:
       | Looks like a solid textbook. Skimming the ToC, I thought it was
       | mostly basic stuff...and then I kept going. It goes deeper into
       | many topics than I would've guessed and that's before getting to
       | the numerical analysis stuff. The code examples look solid for
       | referring back to.
       | 
       | tl;dr: I bought a copy
        
       | ngcc_hk wrote:
        | Ch 15: I tried the code. I forgot to do plt.close() and
        | Pythonista gave a warning.
        | 
        | Btw, not sure how to get the labels to work.
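        | 
        | For reference, a minimal sketch of what seems to work (the
        | label only renders after calling legend(), and closing the
        | figure avoids the warning):
        | 
        |     import matplotlib.pyplot as plt
        | 
        |     fig, ax = plt.subplots()
        |     ax.plot([0, 1, 2], [0, 1, 4], label="y = x**2")
        |     ax.legend()     # without this, the label never appears
        |     plt.show()
        |     plt.close(fig)  # frees the figure; silences the warning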
        
       | reikonomusha wrote:
        | I've been a sort of oddball in the numerical/scientific computing
        | world for the past 10 or so years, in that I mostly reject using
        | Python for serious numerical programming. This seems to be
        | against the common wisdom of my PhD scientist colleagues, who
        | grab for Python any and every chance they get.
       | 
       | The Python community has done an _amazing_ job making a suite of
       | scientific software. Numerical libraries and plotting are
       | excellent in Python. Sympy and Sage are both very ambitious, and
       | in my opinion, quite successful projects. So what's the problem?
       | 
       | Longevity and stability. Make fun of it all you want, but old
       | crusty FORTRAN code, and in the past 20 years C and C++, are the
       | bedrock of serious scientific applications. Python usually isn't.
       | I'm not confident enough to say why with certainty, but maybe it
       | has to do with the fact that Python APIs and type signatures
       | (whether written out or not) are too handwavy and easy to change
       | without detection.
       | 
       | In addition to this, I rarely find that people, scientists
       | especially, are writing _applications_ in Python. Python is too
       | hard to deploy as an application sane people can use, is a mess
       | to configure, and difficult to make efficient if your slow parts
       | aren't rewritten. Most scientists slop code around in Jupyter
       | notebooks which bitrot and are forgotten about in 6 months.
       | 
       | The world is clearly in need of good numerical code and
       | environments, even with Python's amazing ecosystem. Julia has
       | been popular among my savvier programming colleagues and it has
       | attracted very high quality contributions, rivaling and exceeding
       | Python in some cases.
       | 
       | Back to me: what do I do? I write my numerical _applications_ in
       | Common Lisp. It's probably as unpopular of a choice as it gets in
       | the scientific world. I use Lisp because:
       | 
       | - The language is standardized and my code won't break when the
       | compiler updates
       | 
       | - I can write very high performance floating point code, using
       | SIMD/AVX, and readily inspect the assembly code at the function
       | level
       | 
       | - I can easily bind to C and Fortran libraries
       | 
       | - I can deploy applications as static executables
       | 
       | - I can avoid garbage collection and heap allocations with
       | careful programming
       | 
       | - I can write generic, extensible interfaces
       | 
       | - I can debug algorithms live--in the depths of stack traces--
       | without the compile-printf-edit cycle, which is _insanely useful_
       | when encountering tricky floating point accuracy bugs.
       | 
       | Clearly I'm a "programmer's programmer" though, which may be a
       | hint as to why Lisp remains unsuitable for scientist use. Most
       | scientists don't know or don't care what a stack allocation is,
       | and that's fine, because the tools have evolved around them in
       | such a way they can be hugely successful _without_ that
        | knowledge. It would also be remiss not to mention that scientists
        | typically have 8+ years of training in their (non-software)
        | discipline, and "ain't nobody got time for [programming],"
        | despite its growing importance and necessity.
       | 
       | The biggest negative of using Lisp is that a lot of work you'd
       | take for granted in Python or Julia is _not_ done. It took me
       | several hours to get a certain complex 2^n dimensional matrix
       | factorization routine working from LAPACK in Lisp, whereas in
       | Python it would take me 30 seconds flat to look it up, read the
       | documentation, import it, and call it. So it's definitely the
       | case that Lisp's "startup overhead" is high. But at the end of
       | the day, if I'm building an electromagnetic simulator, I feel
       | some pre-baked math routines are the least of my concerns, and
       | the overall code structure and quality is of the highest concern.
       | Python is suboptimal in that category, in my opinion.
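        | 
        | (To illustrate with a stand-in routine -- not the actual
        | factorization I needed -- those 30 seconds in Python look like:
        | 
        |     import numpy as np
        |     from scipy.linalg import lu  # wraps LAPACK's getrf
        | 
        |     A = np.random.rand(8, 8)
        |     P, L, U = lu(A)  # one documented call
        |     assert np.allclose(P @ L @ U, A)
        | 
        | whereas in Lisp I had to write the binding myself.)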
       | 
       | One good thing about college courses like these is that they
       | (slowly) displace numerical programming courses that are MATLAB
       | tutorials in disguise.
        
         | gautamdivgi wrote:
         | Most of your points about language deficiencies are valid. But
          | you still have the same problem of longevity and bitrot. Unless
          | you have teams that are willing to pick up Common Lisp and
          | contribute, you will end up being the only maintainer of the
          | code. Case in point: my advisor from grad school used C and did
          | all his numerical work from scratch. If I'd not found Python,
          | numpy, and scipy, I'd still be in grad school.
         | 
          | If you're looking for a way out of Python because of specific
          | performance issues you face, I'm not sure why you did not just
          | go with Julia (unless you really wanted to do Common Lisp).
         | 
          | Try pyinstaller if you need to bundle Python into a self-
          | contained binary.
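          | 
          | Something like (assuming your entry point is script.py):
          | 
          |     pip install pyinstaller
          |     pyinstaller --onefile script.py  # single binary in dist/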
        
           | reikonomusha wrote:
            | I agree that code needs to be maintained, but I would say
            | that Common Lisp code has a tendency to bitrot a _lot_ more
            | slowly than other languages, because implementations of
            | Common Lisp don't break code.
           | 
            | So there's definitely a valid maintenance point if code needs
            | to continue to be extended (though I've had no trouble
            | finding or training Common Lisp programmers), but Common Lisp
            | code simply doesn't stop working on the timespan of years.
        
             | ericbarrett wrote:
             | C can bitrot too, especially academic C. A few things I've
             | seen are missing headers, builds that only ever worked with
             | what the prof had in /usr/local, conventions that become
             | warnings that become errors (500 invocations of strcpy!),
             | and "beginner" stuff like assuming int's size or that you
             | can poke hard-coded memory ranges. I do think both
             | languages will be better than Python over decadal time
             | scales.
        
         | SiempreViernes wrote:
         | A list of reasons for using _any_ programming language has to
         | include either  "I already knew and liked the language" _or_
         | "management forced me" to be plausible, otherwise you're just
         | trying to rationalise a choice without admitting which of the
         | big two it was ;)
        
           | reikonomusha wrote:
           | I do think there's more nuance to the matter for scientific
           | software since choices are consistent even under
           | perturbations of "what management will accept". The nuance
           | typically comes from _how_ scientists are first exposed to
           | programming at all, especially late in their career. It's
           | often a survey course like this so they can catch up and be
           | productive members of their lab. And a course needs to fit in
           | a semester, so any language that's not Python, R, or MATLAB
           | will be wholly inadequate for that purpose.
        
             | Enginerrrd wrote:
              | More than that even... I spent 4 semesters learning
              | scientific computing in FORTRAN in school. As soon as I
              | graduated, my go-to language for prototyping stuff and
              | frankly 99% of projects became, and still is, Python (or
              | C++ for microcontrollers).
             | 
             | The reasons are simple:
             | 
              | 1. Lots more help and examples available online for Python
              | compared to FORTRAN.
             | 
              | 2. Python is performant enough 99% of the time if you make
              | even a vague attempt to appropriately vectorize your code
              | and/or use the right libraries and techniques (see the
              | sketch after this list).
             | 
              | 3. Solving a well-defined problem in any language is easy
              | enough. What's hard is getting to that well-defined
              | problem, and that often involves playing with and tweaking
              | things until you wrap your head around the problem space.
              | Python saves me a lot of time so I can iterate faster, by
              | avoiding stupid errors and having much lower boilerplate
              | overhead, where stupid errors can really propagate into
              | hard-to-follow error messages down the line.
             | 
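              | As a stand-in for what I mean by a "vague attempt to
              | vectorize" (made-up sizes):
              | 
              |     import numpy as np
              | 
              |     xs = np.random.rand(1_000_000)
              | 
              |     # slow: explicit Python loop over a million floats
              |     total = 0.0
              |     for x in xs:
              |         total += x * x
              | 
              |     # fast: the same reduction in one vectorized call
              |     total = np.dot(xs, xs)
              | 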
              | Python is just a lot more intuitive. I don't have to waste
              | nearly as much time on off-by-one and other stupid errors
              | because my indexing was fucked up. So I can spend most of
              | my time thinking about the stuff that really matters
              | rather than implementation details.
             | 
              | That said, I can write some lean and _mean_ FORTRAN if I
              | really need to, and I'm OK with C++ when I need to be too.
              | In reality though, most of my workload isn't that
              | computationally intensive, and even when it is, most of the
              | hard parts have been outsourced to C++ behind the scenes
              | anyway. I can't even remember the last time my
              | computational need was bad enough that I considered writing
              | my own FORTRAN.
             | 
              | Microcontrollers are a different story. Trying to use
              | Python for that seems like a dumb idea. I know MicroPython
              | is a thing, though I'm skeptical of the whole concept, to
              | be honest. You're so close to the metal at that point --
              | why would you want to abstract it away?
        
               | PartiallyTyped wrote:
                | There are cases in ML where Python isn't performant
                | enough, so much so that the state of the art for a while
                | converged on how to circumvent Python. This happens when
                | the model interacts with the outside world (environment)
                | a lot or when you are limited to non-vectorizable
                | environments. Python simply adds too much friction to
                | the mix.
        
         | jpeloquin wrote:
         | > The biggest negative of using Lisp is that a lot of work
         | you'd take for granted in Python or Julia is not done.
         | 
         | Any hard-earned advice on how best to work around this in a
         | scientific context without wasting too much time? That is,
         | which areas of numerics / plotting / stats have a useful Lisp
         | library vs. when it's better to write your own? There are a lot
         | of apparently-abandoned numeric libraries on github. Do you
         | have to learn LAPACK & SIMD/AVX directly to be productive?
         | 
         | For context, I escaped to Python from Matlab 7 years ago, but
         | have grown to share many of your opinions on Python. Looking
         | for where to jump next. The short feedback loop of Lisp
         | (condition system / restarts) is appealing.
        
           | reikonomusha wrote:
           | I guess my number one piece of advice is to estimate time
           | accordingly. Most things can be solved using pre-existing
           | solutions with a bit of work, if you're patient and you can
           | afford to put in the time to do it.
           | 
           | Secondary to that:
           | 
            | - Learn to use FFI very well, and try hard to find libraries
            | written in C.
           | 
           | - Familiarize yourself with the structure of LAPACK and what
           | it offers.
           | 
           | - Learn to use a profiler and debugger (if using Lisp: SB-
           | SPROF, TIME, SLIME, and SLDB).
           | 
           | - (if using Lisp) Contribute useful things back to existing
           | libraries, like MAGICL [0].
           | 
           | In my opinion, Lisp has no good libraries for plotting. I
           | always have to plot by using another tool.
           | 
           | SIMD/AVX are things you use directly in SBCL if you want to
           | achieve very high FLOPS.
           | 
            | Maybe it's not the best analogy, but scientific programming
            | in Lisp is currently like woodworking (compared to
            | assembling IKEA furniture with Python).
           | 
           | [0] https://github.com/rigetti/magicl
        
         | atsider wrote:
          | Wrt old crusty FORTRAN code: scipy uses many of those popular
          | libraries; it is built on them.
          | 
          | Wrt type signatures: many of those ancient FORTRAN libraries
          | are written with implicit interfaces, so bugs are likely to
          | show up. I came to learn this when I compared some versions
          | floating around with the patched versions supplied with scipy.
          | 
          | My aim is not to bash, but to point out that scipy is a solid
          | piece of software, based on proven work, not just a shiny
          | "new" thing.
        
           | reikonomusha wrote:
           | I don't mean at all to imply scipy and co are flashy but
           | rickety pieces of software. I think it's a testament to their
           | quality that such libraries have reached a broad and diverse
           | audience.
           | 
           | I think the foundational libraries of the scientific Python
           | ecosystem are definitely well taken care of. I think a lot of
          | that care comes from "brute forcing" the ecosystem to make it
          | work, e.g. distributing native compiled C/Fortran code
          | seamlessly on a bunch of platforms with entirely new (at the
          | time) package managers like conda, or wholly new scientific
          | distributions of Python. My observations have more to do with
          | what's built atop them.
        
         | craftinator wrote:
          | Excellent comment! I've spent countless hours trying to get my
          | Python environment into just the right state so I can run code
          | associated with scientific papers, and have come to the
          | conclusion that it's just a big, unmaintainable mess. There
          | are, of course, many examples of well-written Python in the
          | scientific world, but there are many times when I simply
          | haven't been able to get it into a running state.
         | 
         | Julia is great in this regard, but it seems that over time, any
         | language with a centralized library repository will become a
         | graveyard for unmaintained legacy code, and this is bad for
         | long term scientific use.
         | 
          | The downsides of rolling your own mathematical functions are
          | that it's very time-consuming and there's always a chance
          | you'll get it _mostly right_. These are the reasons that
          | Python, with numpy, scipy, and the huge quantity of other
          | libraries available, tends to draw scientific use.
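          | 
          | A classic example of getting it _mostly right_ is the
          | hand-rolled one-pass variance (hypothetical, for
          | illustration):
          | 
          |     def naive_var(xs):
          |         # E[x^2] - E[x]^2: algebraically correct, but cancels
          |         # catastrophically when the mean dwarfs the spread
          |         n = len(xs)
          |         return sum(x * x for x in xs) / n - (sum(xs) / n) ** 2
          | 
          | Library implementations like numpy.var avoid that
          | formulation.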
        
       | mud_dauber wrote:
       | Thank you for posting this.
        
       | unnouinceput wrote:
        | I love Python; it gives me the best jobs of my life.
        | 
        | It usually goes like this. A customer stumbles on some easy
        | 3-line program that solves the problem for an easy subset, and
        | after that he builds a full program around it with all the
        | bling. A month later and 15k USD lighter, it goes into
        | production, its users run it at full scale, and the solving time
        | is now in hours. Then he searches for an expert to solve the
        | impossible problem of speeding up Python, and curses at all the
        | Indians who are scamming him and all the experts who ask for
        | another 15k USD to actually implement the program correctly.
        | 
        | I shit you not, the last one was a data scientist who paid some
        | Indian dude like $3/h; the dude got the job done in 2 days, took
        | the money, and the contract was closed under those terms. Then
        | this data scientist was all over his job posting, crying that
        | while the initial matrix was dozens of columns/rows, the same
        | Python program would now take 3 days when thrown a matrix with
        | millions of columns/rows. I mean, that's several orders of
        | magnitude bigger; I was amazed the program actually finished the
        | job after 3 days instead of just crashing.
        | 
        | So I had to drill into his thick skull the idea that while he
        | initially went to war against Liechtenstein and won, he
        | definitely cannot win against the US army using the same weapon.
        | Only after that did he agree to scrap the Python program
        | altogether and finally go with C++, because speed does matter in
        | the end.
        | 
        | Like I said, I love Python, especially for anything math-related.
        
         | alokrai wrote:
         | I wonder why it is necessary to mention "Indians that are
         | scamming" him as if scamming is something only Indians do?
         | Should I be careful only when dealing with Indian consultants
         | and presume others won't lie or cheat?
         | 
         | edit: grammar
        
           | unnouinceput wrote:
           | Probably because Indians are the ones that are leaders in
            | scamming? Also, if you paid attention, I mentioned that the
            | initial Python dev was an honest Indian; he did his job
            | within the terms of the contract. The data scientist, who in
            | this case was a Canadian, is solely at fault for not
            | disclosing the full scale to the Indian one.
        
             | alokrai wrote:
             | >Probably because Indians are the ones that are leaders in
             | scamming?
             | 
             | I could respond to it but there is not much to be said. I
             | think you have made my point.
        
           | mlN90 wrote:
            | > as if [bad thing] is something only [group] do?
            | 
            | This is such a nonsense statement, and it comes up every
            | time someone describes someone's nationality in correlation
            | with something awful.
            | 
            | Honest question: when you read the line "[...] Chinese that
            | are good at math [...]", do you read that all Chinese people
            | are math wizards? Because that is not what it says.
            | 
            | What about "[...] black people that are fast runners [...]"?
            | 
            | All 3 examples, including the one that turned on your torch
            | of virtue, describe a subset of a group and the primary
            | attribute of said subset, nothing more.
            | 
            | If anything, the implication that the guy you replied to is
            | somehow biased and "racist" against the absolute plague-tier
            | of disproportionate scammers coming out of India is based on
            | nothing but your inability to differentiate between "broad-
            | spectrum racism" and "criticism of a subset of a group".
        
             | alokrai wrote:
             | This is a very charitable reading of the comment, and the
             | examples stated seem somewhat unrelated.
             | 
             | A closer analogy will be: "He was lost in New York City.
             | Later, he cursed at all the Blacks who robbed him." Or "He
             | had an intense negotiation with the financiers. He later
             | cursed at all the Jews who were scamming him."
             | 
              | As you may note, the term "Jews" or "Blacks" or "Indians"
              | (in the original comment) is not merely stated as an
              | adjective to describe the individuals; rather, it is used
              | in a pejorative sense to denote a cultural trait within
              | the group that makes them act in a particular manner. A
              | child comment by the original poster makes his prejudice
              | quite clear: "Probably because Indians are the ones that
              | are leaders in scamming?"
              | 
              | I get your whole point about talking about individuals,
              | subsets, and groups, but it looks like just a defence for
              | calling Indians "world leaders in scamming", rather than
              | some data-based, dispassionate description of the
              | situation.
             | 
             | Edit: grammar
        
               | mlN90 wrote:
                | You have to resort to analogies when the actual sentence
                | in question transfers very well into my examples?
                | 
                | I'm making extreme examples out of the sentence, but
                | pairing it with something 'awesome' -- being good at
                | math, fast runners, etc. -- to make the point very
                | concise.
                | 
                | Had I run with the theme and gone with "White people who
                | shoot up schools [...]" or "Black people who sell crack
                | cocaine", you would likely have missed the point
                | entirely, because I'd be using negative stereotypes.
                | 
                | That the child comment elaborates his thoughts into
                | racist ramblings is frankly irrelevant to me. The guy is
                | clearly illiterate, insensitive, and likely on the silly
                | end of the bell curve.
        
         | analognoise wrote:
         | Where can I hire somebody for $3/hour?
         | 
         | Surely this is an exaggeration?
        
         | objektif wrote:
          | This is a horribly written post; you may want to edit it.
        
         | alejoar wrote:
         | Libraries like numpy are implemented in C under the hood, and
         | you can also easily extend your Python code with C.
        
           | enriquto wrote:
           | > you can also easily extend your Python code with C
           | 
           | Or, more precisely, you can easily wrap your algorithm
           | written in C by a bit of boilerplate Python code (if that is
           | your fancy).
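            | 
            | A minimal sketch of that boilerplate with ctypes, wrapping
            | libm's cos as a stand-in for your own shared library:
            | 
            |     import ctypes, ctypes.util
            | 
            |     libm = ctypes.CDLL(ctypes.util.find_library("m"))
            |     libm.cos.argtypes = [ctypes.c_double]
            |     libm.cos.restype = ctypes.c_double
            | 
            |     print(libm.cos(0.0))  # 1.0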
        
             | fauigerzigerk wrote:
             | Or you could use Cython to do a bit of both at the same
             | time.
        
               | enriquto wrote:
               | I don't see the advantage. If you use cython then your
               | code becomes quite difficult to use in the rest of the C
               | ecosystem (where your algorithm belongs).
        
               | counters wrote:
               | Usually if you use Cython, f2py, numba, or something
               | similar, the goal isn't to create code that fits into the
               | ecosystem of the compiled language - it's to optimize a
               | kernel that fits into the Python ecosystem with minimal
               | baggage.
        
         | shepardrtc wrote:
         | I'm sure that Python program could have been rewritten to
         | complete in an acceptable amount of time. Yes, the C++ program
         | will be faster, but a good Python dev could probably have fixed
         | that scientist's code with numpy, proper BLAS libraries, and
         | maybe a quick dash of Cython.
        
           | galangalalgol wrote:
            | I post this whenever it comes up, but with lots of small
            | matrices, any operation gets hit too hard by the FFI
            | overhead into numpy.
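            | 
            | Easy to see for yourself (timings vary by machine):
            | 
            |     import timeit
            | 
            |     setup = "import numpy as np; a = np.ones(3); b = [1.0] * 3"
            |     print(timeit.timeit("a + a", setup=setup))
            |     print(timeit.timeit("[x + x for x in b]", setup=setup))
            | 
            | At this size the plain list often wins, because per-call
            | overhead dominates the actual arithmetic.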
        
             | shepardrtc wrote:
             | Yes, for smaller stuff, numpy really does add a lot of
             | overhead. It would make life much simpler if you could copy
             | things into a numpy array more quickly, but oh well. In any
             | case, you can find pretty tight pure Python code for most
             | things. For instance, I needed to drop a relatively small
             | Linear Regression calculation from like 500 microseconds
             | using numpy or scipy (I don't remember) to double-digit
              | microseconds somehow. I googled for a little bit, and
              | after adapting some pure Python code using regular lists,
              | I got it into double digits. Then, after converting the
              | function rather easily into Cython (just the function,
              | not the entire program), it's single-digit microseconds
              | now.
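              | 
              | Roughly the shape of the pure-Python version, for the
              | simple one-variable case (my actual function differed):
              | 
              |     def linreg(xs, ys):
              |         # closed-form least-squares fit for y = a + b*x
              |         n = len(xs)
              |         sx, sy = sum(xs), sum(ys)
              |         sxx = sum(x * x for x in xs)
              |         sxy = sum(x * y for x, y in zip(xs, ys))
              |         b = (n * sxy - sx * sy) / (n * sxx - sx * sx)
              |         a = (sy - b * sx) / n
              |         return a, b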
        
           | dataflow wrote:
           | Don't forget Numba.
        
       | tralarpa wrote:
        | Some of the chapters look a bit superficial. I have seen other
        | books (with the same target audience) with deeper discussions of
        | the impact of algorithmic/implementation choices on the accuracy
        | of the results, etc. Here, they sometimes just refer to a Python
        | function with a link to its documentation.
        
         | Scipio_Afri wrote:
         | Which books?
        
       | [deleted]
        
       | saruken wrote:
       | This looks like a great resource, but the vast number of
       | grammatical mistakes is really distracting. Some issues are to be
       | expected with a self-published book, but at least in the sections
       | I read, a majority of sentences are grammatically incorrect.
       | Maybe I'm just naive, but that surprises me, especially with this
       | being affiliated with an institution like Berkeley.
        
       ___________________________________________________________________
       (page generated 2021-02-17 21:01 UTC)