[HN Gopher] Launch HN: Vimmerse (YC W22) - Platform and SDKs to ...
       ___________________________________________________________________
        
       Launch HN: Vimmerse (YC W22) - Platform and SDKs to create and play
       3D video
        
       Hello HN! We are Jill and Basel, founders of Vimmerse
       (https://vimmerse.net). Vimmerse is a platform and SDKs for creating
       and playing 3D video. We make it easy for businesses, developers,
       and creators to provide 3D immersive experiences for their viewers
       using our content creation APIs and player SDKs.  We have been
       watching video in two dimensions for too long! Most 3D content
       people experience today is computer generated, such as VR games.
       Diverse use cases can benefit from real-world 3D video, such as
       music performances, training, family memories, and the metaverse.
       Why isn't there more real-world 3D video? Because it has been
       difficult and expensive to create, stream, and play back real-world,
       camera-captured 3D video content.  I am an IEEE Fellow and inventor
       of over 200 patents. Basel has a PhD in electrical engineering with
       deep experience in VR/AR/3D video. While at Intel (as Chief Media
       Architect and Intel Fellow), I led an MPEG standards workgroup on
       360/VR video. I found that 360/VR video's limitation to 3 Degrees
       of Freedom (DoF) caused discomfort or even nausea in some viewers,
       because we experience the real world in 6 DoF (controlling both
       position + orientation), not in 3DoF (just orientation). I
       initiated an activity in MPEG to develop the MPEG Immersive Video
       (MIV) standard, which provides 6DoF. I became the lead editor of
       the MIV standard, and Basel was the lead editor of the test model.
       While at Intel, we developed a MIV 3D video player for Intel GPUs
       and observed the greater engagement that 3D video provides to
       viewers. However, there was no content available for the new MIV
       standard, and creation of 3D video content was a very difficult and
       expensive process. We realized that if 3D video were to become
       widely used, the creation and distribution processes needed to be
       simplified. We founded Vimmerse with a mission to greatly expand
       access to 3D video.  Businesses can build their own services using
       our APIs to upload captured content and prepare 3D video on our
       platform. Our platform is capture agnostic, meaning it can work
       with any video device suitable for 3D capture, such as iPhones or
       Microsoft Azure Kinect depth sensors. More than 60% of iPhone 12 and
       13 units sold are Pro or Pro Max models, which have LiDAR depth
       sensors that can be used to capture 3D video content.  The Vimmerse
       platform prepares 3D content from the uploaded capture files. Our
       approach is built on top of industry standard video streaming
       protocols and codecs, so existing video streaming servers and
       hardware video decoders can be utilized. The content preparation
       platform creates two types of output bitstreams from the uploaded
       captures: bullet video and 3D video. Bullet video (named after the
       Matrix movie's bullet-time effect) is a 2D video representation of the
       3D video, following a predetermined navigation path selected by the
       content creator. 3D video gives viewers the ability to control
       navigation with 6 Degrees of Freedom (6DoF), where they can pan
       around or step into the scene. Bullet video may be streamed (HLS)
       or downloaded (MP4) for playback on any device. 3D video playback
       may be streamed (HLS) to the Vimmerse 3D video player.  Services
       may use the Vimmerse 3D video player app, or developers can use our
       player SDK inside their own apps. Viewers have the ability to
       control navigation using any viewer input method: device motion,
       mouse/keyboard, touch controls, head/gesture tracking. The player
       SDK renders views for the selected 6DoF position and orientation.
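
       As a rough sketch of that developer flow (the endpoint paths,
       field names, and player class below are illustrative placeholders
       rather than our published API), uploading a capture and wiring up
       playback might look something like this in TypeScript:

         // Hypothetical sketch only: routes and names are placeholders.
         const API = "https://api.example.com/v1";

         async function prepareCapture(file: Blob, apiKey: string) {
           // 1. Upload the raw capture (e.g. an iPhone LiDAR recording).
           const form = new FormData();
           form.append("capture", file);
           const upload = await fetch(`${API}/captures`, {
             method: "POST",
             headers: { Authorization: `Bearer ${apiKey}` },
             body: form,
           });
           const { captureId } = await upload.json();

           // 2. Poll until the platform has prepared both output types:
           //    bullet video (HLS/MP4) and 3D video (HLS).
           let status = "processing";
           let outputs: any = null;
           while (status === "processing") {
             await new Promise((resolve) => setTimeout(resolve, 5000));
             const res = await fetch(`${API}/captures/${captureId}`, {
               headers: { Authorization: `Bearer ${apiKey}` },
             });
             ({ status, outputs } = await res.json());
           }
           return outputs; // e.g. { bulletHls, bulletMp4, immersiveHls }
         }

         // 3. The bullet video plays in any standard video player. The
         //    3D stream would go to the player SDK, which renders views
         //    for the viewer's 6DoF pose, e.g. (placeholder class name):
         //    const player = new VimmersePlayer(videoElement);
         //    await player.load(outputs.immersiveHls);
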
       We haven't published pricing yet, but our plan is to charge for our
       content preparation APIs based on usage (e.g. minutes of video
       processed and streamed) and player SDKs based on number of units.
       The Vimmerse website (https://vimmerse.net) provides a no-code way
       test out our platform or view featured content. We invite the
       community to upload their own test content. Instructions for
       preparing content are available at
       https://blog.vimmerse.net/freeport-platform-usage-instructio....
       Sign up for an account to upload content, or use the guest account
       (login: guest, password: Guest#123). The Vimmerse 3D player for
       Android is available in the Google Play Store at
       https://play.google.com/store/apps/details?id=net.vimmerse.p....
       Please share your thoughts and experiences with 3D video, and your
       ideas for use cases that would benefit the most from 3D video. Are
       there any features we should add, or capture devices that you would
       like to have supported? Looking forward to getting your feedback.
        
       Author : jillboyce
       Score  : 53 points
       Date   : 2022-03-15 14:49 UTC (8 hours ago)
        
       | jzer0cool wrote:
       | Intrigued by the number of patents you have, 200! I'd like
       | to understand the process of submitting a simple one. Could
       | you share the costs and the process, so I can get into the
       | habit of attempting to submit one? Have any of your patents
       | come to fruition in terms of royalties, $$, etc.? Do you
       | mind sharing?
        
         | jillboyce wrote:
         | All of my granted patents are owned by my previous employers,
         | who also paid all expenses involved in submitting the patent
         | applications, and collect any royalties. Some of my patents are
         | essential for video codec standards, including H.264/AVC and
         | HEVC.
         | 
         | It was a change of mindset to apply for my first patent for
         | Vimmerse.
         | 
         | If you are in the US, the USPTO has a good overview here:
         | https://www.uspto.gov/patents/basics/patent-process-overview
         | 
         | Application fees are lower for small entities or micro
         | entities. https://www.uspto.gov/learning-and-resources/fees-
         | and-paymen...
         | 
         | But the biggest cost is paying for a patent attorney to prepare
         | the application. If you want to try to do it on your own, I
         | found this book very helpful: https://www.amazon.com/Invention-
         | Analysis-Claiming-Patent-La...
         | 
         | Good luck!
        
       | ZhangSWEFAANG wrote:
        
         | [deleted]
        
         | ddlutz wrote:
         | Please don't bring this toxic culture here.
        
         | jillboyce wrote:
         | https://www.linkedin.com/in/jillboyce/
         | 
         | Lots of documentation available on the public Joint Video
         | Experts Team (joint group between MPEG and ITU-T) website
         | https://jvet-experts.org/ about the activities I led in 360
         | video.
         | 
         | Here is an example: https://jvet-
         | experts.org/doc_end_user/current_document.php?i...
        
       | pavlov wrote:
       | Congrats on the launch!
       | 
       | I've been very interested in this field since the original Kinect
       | came out and it was discovered you can easily get the raw RGB+D
       | data over USB. I made a bunch of prototypes and had ambitious
       | plans to make a depth-enabled short movie for my film school MA
       | degree project, with a player on the original iPad and head
       | tracking for a parallax effect. (A project much too ambitious; I
       | never finished the player or the movie or the degree.)
       | 
       | IMO you should hire an in-house artist right away, assuming you
       | have funding. Right now your Featured Content section looks like
       | demo footage from academic papers. For someone uninitiated, this
       | content doesn't really convey the potential of immersive video.
       | 
       | You should have someone with enough artistic vision and technical
       | competence whose primary job is to come up with new exciting
       | demos of your technology. That person will need patience and a
       | start-up mentality because the content production pipeline for
       | something this cutting-edge is probably very foreign to most
       | artists. But it would make a huge difference in how people can
       | understand your project.
       | 
       | (If I were twelve years younger I'd send you a job application
       | right away. Working on this would have been my dream job.)
        
         | jillboyce wrote:
         | I agree that we need better demo content that illustrates the
         | potential of 3D immersive video. We are working on that right
         | now. I encourage anyone who has interesting content, or can
         | make some, to contact me.
        
       | opan wrote:
       | No relation to vim then? I guess it's v for virtual and then
       | "immerse".
        
         | almog wrote:
         | "Ah, finally a VR platform where decent people (and otherwise)
         | can immerse themselves in vim..." I thought to myself while
         | deep inside me I knew it was destined to be a short road to
         | disappointment and heartbreak.
        
         | jillboyce wrote:
         | No relation to vim. Yes, Vimmerse = V + immerse. V is for video
         | or volumetric or virtual...
        
         | xcambar wrote:
         | As a typical HN member, I too saw Vim first.
         | 
         | But that's a very biased population for sure!
        
       | DonHopkins wrote:
       | Will there be an Emacsmerse compatibility mode? ;)
       | 
       | But more seriously (though still in the spirit of Emacs), how will
       | your 3D video player be scriptable and extensible at runtime?
       | 
       | I have some positive experience with extensible 3D panoramic
       | stereo video players, which I'd love to share with you:
       | 
       | I developed a system for JauntVR that enables cross-platform
       | scripting of Unity3D applications in JavaScript, for Jaunt's 3D
       | panoramic video player. It also greatly eases and accelerates
       | development and debugging, which is otherwise extremely slow and
       | tedious with Unity.
       | 
       | Unfortunately JauntVR ran out of money for that project and
       | pivoted to other stuff before publishing it, but Arthur van Hoff
       | generously let me open source the general purpose
       | Unity/JavaScript scripting system I'd developed for it, which I
       | call "UnityJS".
       | 
       | It's useful for all kinds of other things beyond scripting 3D
       | videos, and it works quite nicely not just on mobile devices but
       | also with WebGL, but scripting interactive 3D video was the
       | problem I originally created it to solve.
       | 
       | I've used it for several applications and integrated it with
       | several
       | libraries, including financial data visualization on WebGL,
       | TextMeshPro in Unity, ARCore on Android, and ARKit on iOS.
       | 
       | UnityJS also makes developing and debugging scriptable Unity apps
       | much easier and quicker (by orders of magnitude) because you can
       | use the standard JavaScript debuggers (even on mobile devices)
       | and just restart the app to reload new code in seconds, instead
       | of waiting the 10-60 minutes it takes to totally rebuild the app
       | in Unity.
       | 
       | By using the same standard built-in JavaScript engine of the web
       | browser on each platform, you can implement cross-platform
       | interfaces in JavaScript instead of re-tooling the interface for
       | each platform in different languages, and incorporate standard
       | up-to-date off-the-shelf JavaScript libraries.
       | 
       | I believe the 3D video player should be much more like an
       | extensible scriptable browser (and like Emacs and NeWS!), instead
       | of just a dumb fixed-function play/pause/rewind/fast-forward
       | video player (like piping a text file through "more" on a VT100).
       | 
       | So you can download dynamic content and scripts that implement
       | custom interfaces for 3D videos, like interactive titles and
       | credits, custom menus, theater lobbies, even networked
       | multiplayer environments for watching 3D videos together.
       | 
       | It's great fun to overlay live 3D graphics, text, physics
       | simulation, interactive kinetic typography, particle systems,
       | etc, into 3D videos.
       | 
       | It's extremely useful for selecting and navigating content,
       | displaying interactive titles and credits, or anything else you
       | might imagine, like games and applications embedded in and
       | orchestrating 3D video.
       | 
       | And of course you can publish new content, hot updates, and
       | extend the experience by downloading new code and data over the
       | internet, instead of pushing out a new version of the app through
       | the app store every time you want to improve the experience or
       | publish a new video or series.
       | 
       | Some obvious and useful applications for 3D videos are designing
       | compelling opening and closing credits, captions, subtitles, pop-
       | up notifications, pie menus, player controls, navigation,
       | bookmarking, tagging, and any other kinds of user interfaces,
       | that can adapt to the content and environment in whatever
       | direction you're currently looking, so you don't miss them and
       | they're not unusable, just because you were looking in the wrong
       | direction at the wrong time.
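       | 
       | To make that concrete, here's a toy sketch (not tied to any
       | particular player's real API) of the kind of per-frame logic a
       | scriptable overlay could run, so a credit panel follows the
       | viewer's gaze instead of staying pinned to a fixed world
       | direction they may never look at:
       | 
       |   // Toy TypeScript sketch: re-anchor an overlay when the viewer
       |   // looks away from it by more than half the field of view.
       |   type Vec3 = { x: number; y: number; z: number };
       | 
       |   function dot(a: Vec3, b: Vec3): number {
       |     return a.x * b.x + a.y * b.y + a.z * b.z;
       |   }
       | 
       |   function normalize(v: Vec3): Vec3 {
       |     const len = Math.hypot(v.x, v.y, v.z) || 1;
       |     return { x: v.x / len, y: v.y / len, z: v.z / len };
       |   }
       | 
       |   // Called once per frame with the viewer's look direction and
       |   // the direction from the viewer to the overlay panel.
       |   function updateOverlay(look: Vec3, overlay: Vec3, fovDeg: number): Vec3 {
       |     const l = normalize(look);
       |     const o = normalize(overlay);
       |     const cos = Math.min(1, Math.max(-1, dot(l, o)));
       |     const halfFov = (fovDeg / 2) * (Math.PI / 180);
       |     // If the overlay has drifted out of view, snap it back toward
       |     // the center of the viewer's gaze so it can't be missed.
       |     return Math.acos(cos) > halfFov ? l : o;
       |   }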
       | 
       | 3D videos suffer practically and creatively from hard-coded
       | static title sequences and credits, because you might not be
       | looking in the right direction at the right time to see them.
       | It's like pointing a movie camera at a stage to film a play,
       | instead of inventing a whole new expressive cinematic language
       | appropriate for the new medium. And I'm not just talking 3D
       | scrolling Star Wars credits!
       | 
       | Of course you could just build a dumb non-extensible 3D video
       | player with just one hard-coded style of menus and scrolling
       | credits, or burn the title sequence and credits into the 3D video
       | itself, but that would be static, boring, and inflexible, because
       | the title sequence itself is an important thematic art form,
       | which should ideally be designed for each different video,
       | collection or channel.
       | 
       | The point is to enable the infinite creative possibilities of
       | interactive titles and credits that respond to the viewer's
       | attention and interest: scrolling back and forth, revealing more
       | information about the names you focus on in the credits, and all
       | kinds of other cool innovations that run-time scripting, dynamic
       | downloadable content, and interactive 3D graphics enable.
       | 
       | Saul Bass realized that 2D movie title sequences could be so much
       | more than simply scrolling a list of names up the screen, and he
       | designed the groundbreaking opening credits to movies by
       | Hitchcock, Scorsese, Kubrick, and Preminger. Imagine how intense
       | the interactive panoramic opening titles of a 3D film like Psycho
       | or North by Northwest could be, to set the mood for the rest of
       | the movie!
       | 
       | https://en.wikipedia.org/wiki/Saul_Bass
       | 
       | >During his 40-year career, Bass worked for some of Hollywood's
       | most prominent filmmakers, including Alfred Hitchcock, Otto
       | Preminger, Billy Wilder, Stanley Kubrick and Martin Scorsese.
       | Among his best known title sequences are the animated paper cut-
       | out of a heroin addict's arm for Preminger's The Man with the
       | Golden Arm, the credits racing up and down what eventually
       | becomes a high-angle shot of a skyscraper in Hitchcock's North by
       | Northwest, and the disjointed text that races together and apart
       | in Psycho.
       | 
       | Psycho's opening credits
       | 
       | https://www.youtube.com/watch?v=hwq1XHtJEHw
       | 
       | Saul Bass: North by Northwest (1959) title sequence
       | 
       | https://www.youtube.com/watch?v=1ON67uYwGaw
       | 
       | The Art of Movie Title Design | Saul Bass and Beyond
       | 
       | https://www.youtube.com/watch?v=Q_Mo0MqICXI
       | 
       | Saul Bass- Style is Substance
       | 
       | https://www.youtube.com/watch?v=frWLpyI3lXY
       | 
       | >"I have felt for some time that the audience involvement with
       | the film should really begin with the first frame. You have to
       | remember that until then, titles tended to be just some dull
       | credits, mostly ignored, or used for popcorn time. So there seems
       | to be a real opportunity to use titles in a new way. To actually
       | create a climate for the story that was about to unfold." -Saul
       | Bass
       | 
       | UnityJS is useful for other stuff too: Here's a great example of
       | an interactive WebGL based business model and structured
       | financial data visualization system that I developed with UnityJS
       | for Jen van der Meer of Reason Street.
       | 
       | The lion's share of the code is written in JavaScript, plus a
       | bunch of generic and bespoke modular Unity components and prefabs
       | that JavaScript plugs together and orchestrates. On top of
       | Unity's 3D graphics, it uses the canvas 2D API and D3 to draw the
       | user interface, info overlay panels, and translucent pie menus
       | with live tracking feedback, all written in JavaScript so it's
       | easy to iteratively develop and debug!
       | 
       | It's totally driven by JSON data and CSV spreadsheets. So Jen
       | can just enter fresh data into Google Sheets, turn the crank,
       | and play around with the interactive visualizations within
       | seconds. And I can just as quickly and easily modify the code or
       | data, then quickly run and debug the new version!
       | 
       | Amazon FY2018 Visualization:
       | 
       | https://www.youtube.com/watch?v=J8uOjelouUM
       | 
       | >"I created this tool as a way to kind of tell the story of how
       | companies shift from e-commerce, and build these combinatorial,
       | complex, technical system type business models. It's a little bit
       | of an obsession of mine. If you're curious about this project you
       | can find me in the links below. I'll be collaborating with my
       | long time collaborators here to see how complex companies and
       | ecosystems [work], and help us all better understand how to think
       | about these technology business models, and how they so
       | substantially shape our world." -Jen van der Meer
       | 
       | Apple Services Pivot:
       | 
       | https://www.youtube.com/watch?v=TR_4w7OCW4Y
       | 
       | Facebook FY 2018 Financials Visualization:
       | 
       | https://www.youtube.com/watch?v=cMQgjmj_mnQ
       | 
       | Here are some links to docs, notes, and posts about UnityJS
       | I've written. And if you're interested in using it, I have a more
       | recent, modular, up-to-date version that I'm refactoring to use
       | the Unity package system instead of git submodules and symbolic
       | links, and it also has some nice improvements for WebGL, and much
       | better JSON.net<=>Unity integration:
       | 
       | https://github.com/SimHacker/UnityJS/blob/master/doc/Anatomy...
       | 
       | https://github.com/SimHacker/UnityJS/blob/master/notes/talk....
       | 
       | https://github.com/SimHacker/UnityJS/blob/master/notes/unity...
       | 
       | https://news.ycombinator.com/item?id=21932984
       | 
       | https://news.ycombinator.com/item?id=22689008
       | 
       | https://news.ycombinator.com/item?id=22691004
        
         | jillboyce wrote:
         | Thanks for the info about the open sourced UnityJS. I'll take a
         | look.
         | 
         | We hadn't thought about making the 3D video player scriptable
         | and extensible at runtime. We will give it some thought.
         | 
         | Being able to overlay 3D graphics (including titles) onto the
         | 3D video is on our roadmap. Glad to hear confirmation that it
         | will be a useful feature to add.
        
           | DonHopkins wrote:
           | You're welcome, and I'm happy to discuss it further and point
           | you to the newer code that you can use or pick over for tips
           | and tricks as you desire.
           | 
           | Once you can overlay 3D graphics on 3D video, you'll
           | definitely want runtime scriptability!
           | 
           | Because of its name "UnityJS", people sometimes mistake it for
           | something like Unity's old and now thankfully deprecated
           | "UnityScript", a compiled (not runtime interpreted or JITted)
           | ersatz JavaScript-ish language that was really just a thin
           | wrapper around the CLR C# APIs, without most of the standard
           | JavaScript APIs, plus its own weird syntax and object system
           | to make it even more uncanny.
           | 
           | But UnityJS is a totally different approach for a very
           | different purpose, and I wish I could think of a better, less
           | confusing and less loaded name for it.
           | 
           | Each platform has its own APIs and quirks for efficiently
           | integrating and exchanging JSON messages with its own browser
           | and Unity, in Objective C for iOS, Java for Android, and
           | JavaScript for WebGL.
           | 
           | UnityJS abstracts those platform differences like (and by
           | using) a web browser, so you can write cross-platform
           | JavaScript code and communicate with Unity via JSON messages;
           | on the C# side it uses JSON.net to convert back and forth
           | between JSON and Unity objects.
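           | 
           | To give a feel for the shape of it (a simplified toy, not
           | UnityJS's actual API), the JavaScript side just exchanges
           | small JSON envelopes, and all of the platform-specific glue
           | hides behind one tiny transport interface:
           | 
           |   // Toy TypeScript sketch of the JSON message bridge pattern.
           |   // Each platform (WkWebView, Android WebView, WebGL) supplies
           |   // its own transport; application code never sees it.
           |   interface UnityTransport {
           |     sendToUnity(json: string): void;
           |     onMessageFromUnity(handler: (json: string) => void): void;
           |   }
           | 
           |   class Bridge {
           |     private pending = new Map<string, (result: unknown) => void>();
           |     private nextId = 0;
           | 
           |     constructor(private transport: UnityTransport) {
           |       // Replies from C# arrive as JSON (serialized by JSON.net
           |       // on the Unity side) and resolve the matching promise.
           |       transport.onMessageFromUnity((json) => {
           |         const msg = JSON.parse(json);
           |         this.pending.get(msg.id)?.(msg.result);
           |         this.pending.delete(msg.id);
           |       });
           |     }
           | 
           |     // e.g. call("CreateObject", { prefab: "TitleCard" })
           |     call(method: string, params?: unknown): Promise<unknown> {
           |       const id = String(this.nextId++);
           |       return new Promise((resolve) => {
           |         this.pending.set(id, resolve);
           |         const msg = JSON.stringify({ id, method, params });
           |         this.transport.sendToUnity(msg);
           |       });
           |     }
           |   }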
           | 
           | It's better to rely on the built-in JavaScript engine in each
           | platform's standard web browser than to roll your own
           | scripting language from scratch, bundle your own copy of
           | Chrome, or use a less ubiquitous language than JavaScript
           | (as much as I love Lua and Python).
           | 
           | What's great about that approach is that it lets you use
           | standard development tools: you can live code and debug
           | WkWebView based iOS apps with the desktop Safari developer
           | tools, and Android Chrome component based apps with the
           | desktop Chrome developer tools.
           | 
           | And it works seamlessly with the entire ecosystem of browser
           | based JavaScript libraries. (Which is a relief if you've ever
           | tried to get a Windows C# library to work with Unity's
           | version of C#, let alone trying to port any JavaScript
           | library to UnityScript, which was practically futile).
           | 
           | On iOS, using the standard WkWebView browser as a scripting
           | engine also avoids the infuriating non-technical Dumb Apple
           | App Store Rules Problem, because they prohibit apps that
           | dynamically download and update and interpret any kind of
           | dynamic code, UNLESS you use their browser and JavaScript
           | engine.
           | 
            | Consequently, WkWebView's JavaScript is the only runtime
           | scripting language you're allowed to use on iOS (it's much
           | better than the old UIWebView because it has a great JIT
           | compiler). Fortunately it works quite well! So be it.
        
             | jillboyce wrote:
             | Let's follow up offline.
        
       | hyferg wrote:
       | This seems like a step in the right direction away from sphere-
       | projected 'immersive' video. Very cool and can't wait for the
       | 6dof VR player to be ready!
        
         | jillboyce wrote:
         | Yes, sphere-projected VR/360 video isn't immersive enough given
         | its limitation to 3 DoF. 6DoF video feels so much more
         | immersive.
        
       ___________________________________________________________________
       (page generated 2022-03-15 23:01 UTC)