[HN Gopher] Speeding up Go's builtin JSON encoder up to 55% for ...
       ___________________________________________________________________
        
       Speeding up Go's builtin JSON encoder up to 55% for large arrays of
       objects
        
       Author : eatonphil
       Score  : 33 points
       Date   : 2022-03-03 21:12 UTC (1 hours ago)
        
 (HTM) web link (datastation.multiprocess.io)
 (TXT) w3m dump (datastation.multiprocess.io)
        
       | jjtheblunt wrote:
        | I wonder what % of CPU cycles on AWS are spent encoding and
        | decoding JSON.
        
         | hhh wrote:
         | I wonder what % are just idle.
        
       | djanogo wrote:
       | What is "large"?
        
         | eatonphil wrote:
         | > And generated two datasets: one with 20 columns and 1M rows,
         | and one with 1K columns and 10K rows.
         | 
          | And if that's not big enough for you, the observed effects
          | grew as I increased columns and rows.
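          | 
          | For a sense of scale, here is a rough sketch (not the post's
          | actual generator; names are made up) of building dynamic rows
          | of that shape in Go:
          | 
          |   package main
          |   
          |   import "fmt"
          |   
          |   // makeRows builds `rows` rows with `cols` columns of
          |   // dynamic, schema-less data, roughly the shape of the
          |   // benchmark datasets. Column names and values are
          |   // placeholders for illustration.
          |   func makeRows(rows, cols int) []map[string]interface{} {
          |       out := make([]map[string]interface{}, rows)
          |       for i := 0; i < rows; i++ {
          |           row := make(map[string]interface{}, cols)
          |           for j := 0; j < cols; j++ {
          |               row[fmt.Sprintf("col%d", j)] = i * j
          |           }
          |           out[i] = row
          |       }
          |       return out
          |   }
          |   
          |   func main() {
          |       // Scale up to 1M rows x 20 cols (or 10K x 1K) to
          |       // match the datasets described above.
          |       rows := makeRows(1000, 20)
          |       fmt.Println(len(rows), "rows generated")
          |   }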
        
       | latchkey wrote:
        | I've been using jsoniter (another 'fastest' json lib) for ages.
        | It is much faster than the built-in encoder, so 55% faster than
        | slow isn't really meaningful.
       | 
        | I'm surprised a more comprehensive benchmark evaluation was not
        | done, since this tends to be a pretty sensitive topic.
        
         | eatonphil wrote:
          | Unlike other implementations, as far as I can tell, this one
          | composes encoders. Under the hood it can use
          | encoding/json.Marshal or goccy/go-json.Marshal (another
          | existing very fast library) or any other library that
          | implements the json.Marshal call.
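          | 
          | Roughly, the idea looks like this (a minimal sketch with
          | made-up names, not the actual DataStation API): stream the
          | outer array syntax directly and hand each element to whatever
          | Marshal you plug in.
          | 
          |   package main
          |   
          |   import (
          |       "encoding/json"
          |       "io"
          |       "os"
          |   )
          |   
          |   // MarshalFunc is any element encoder with the shape of
          |   // json.Marshal (encoding/json, goccy/go-json, etc.).
          |   type MarshalFunc func(v interface{}) ([]byte, error)
          |   
          |   // encodeRows writes the surrounding array syntax itself and
          |   // delegates each row to the injected marshal function.
          |   func encodeRows(w io.Writer, rows []map[string]interface{},
          |       marshal MarshalFunc) error {
          |       if _, err := w.Write([]byte{'['}); err != nil {
          |           return err
          |       }
          |       for i, row := range rows {
          |           if i > 0 {
          |               if _, err := w.Write([]byte{','}); err != nil {
          |                   return err
          |               }
          |           }
          |           b, err := marshal(row) // delegate the element
          |           if err != nil {
          |               return err
          |           }
          |           if _, err := w.Write(b); err != nil {
          |               return err
          |           }
          |       }
          |       _, err := w.Write([]byte{']'})
          |       return err
          |   }
          |   
          |   func main() {
          |       rows := []map[string]interface{}{
          |           {"a": 1, "b": "x"}, {"a": 2, "b": "y"},
          |       }
          |       // Swap json.Marshal for goccy/go-json's Marshal to
          |       // compose with that library instead.
          |       _ = encodeRows(os.Stdout, rows, json.Marshal)
          |   }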
         | 
         | This implementation is competitive (sometimes faster, sometimes
         | slower) with good implementations like goccy/go-json on its
         | own, and beats goccy/go-json when composing this library with
         | goccy/go-json.
         | 
          | Benchmarks of this composition and of goccy/go-json on its
          | own are included in the post.
         | 
         | Maybe in a followup post I'll do more benchmarks against other
         | fast libraries. But for this one I wanted to show the process
         | and then just pick one fast library for comparison.
         | 
          | Edit: also, I forgot to mention in the post, but some
          | libraries speed up encoding by requiring a fixed schema.
          | DataStation/dsq is extremely dynamic and I'll never know the
          | schema up front. Just another reason why I couldn't use some
          | existing faster libraries.
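          | 
          | To make that concrete (illustrative types, not from the post):
          | a schema-requiring library wants something like the struct
          | below known at compile time, while dsq only ever has the
          | dynamic map form.
          | 
          |   package main
          |   
          |   import "fmt"
          |   
          |   // A fixed-schema encoder can lean on a struct defined up
          |   // front:
          |   type Row struct {
          |       ID   int    `json:"id"`
          |       Name string `json:"name"`
          |   }
          |   
          |   func main() {
          |       // dsq's data only exists as dynamic values at runtime,
          |       // so the schema-free form is all an encoder gets:
          |       dynamic := map[string]interface{}{"id": 1, "name": "x"}
          |       fmt.Println(Row{ID: 1, Name: "x"}, dynamic)
          |   }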
        
           | latchkey wrote:
           | All I am saying is that the 'fastest' json libs all do
           | comparisons against other 'fastest' json libs.
           | 
            | I'd expect any other json lib trying to be faster to do
            | those same comparisons and not produce arguably clickbait
            | "55% faster" titles, since your library isn't really that
            | much faster than goccy.
           | 
           | Pick the ones that match (dynamic) to compare against.
        
             | eatonphil wrote:
              | If you're not a fan of the blog post, that's cool. But I
             | posted the blog post (and wrote the post in the first
             | place) rather than a Show HN link to the project itself
             | because I thought the process was worth showing as much as
             | the result.
        
               | mh- wrote:
               | > I thought the process was worth showing as much as the
               | result
               | 
                | It was, thanks for sharing.
        
       ___________________________________________________________________
       (page generated 2022-03-03 23:00 UTC)