<?xml version="1.0" encoding="UTF-8"?><rss xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:atom="http://www.w3.org/2005/Atom" version="2.0" xmlns:cc="http://cyber.law.harvard.edu/rss/creativeCommonsRssModule.html">
    <channel>
        <title><![CDATA[Stories by Normal VR on Medium]]></title>
        <description><![CDATA[Stories by Normal VR on Medium]]></description>
        <link>https://medium.com/@normalvr?source=rss-a4e5b23e6714------2</link>
        <image>
            <url>https://cdn-images-1.medium.com/fit/c/150/150/1*SOW2wg-fdvTqLQj9vrCMYQ.jpeg</url>
            <title>Stories by Normal VR on Medium</title>
            <link>https://medium.com/@normalvr?source=rss-a4e5b23e6714------2</link>
        </image>
        <generator>Medium</generator>
        <lastBuildDate>Sat, 16 May 2026 01:58:24 GMT</lastBuildDate>
        <atom:link href="https://medium.com/@normalvr/feed" rel="self" type="application/rss+xml"/>
        <webMaster><![CDATA[yourfriends@medium.com]]></webMaster>
        <atom:link href="http://medium.superfeedr.com" rel="hub"/>
        <item>
            <title><![CDATA[Half + Half at Brooklyn Academy of Music & MIT Reality Hack]]></title>
            <link>https://normalvr.medium.com/half-half-at-brooklyn-academy-of-music-mit-reality-hack-43696569a97f?source=rss-a4e5b23e6714------2</link>
            <guid isPermaLink="false">https://medium.com/p/43696569a97f</guid>
            <category><![CDATA[brooklyn-academy-of-music]]></category>
            <category><![CDATA[normal]]></category>
            <category><![CDATA[half-half]]></category>
            <category><![CDATA[mit]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <dc:creator><![CDATA[Normal VR]]></dc:creator>
            <pubDate>Thu, 20 Feb 2020 17:43:07 GMT</pubDate>
            <atom:updated>2020-02-20T19:39:06.834Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*HE5roRFF6VxcpnhLHGssfA.png" /></figure><p>Hey everyone,</p><p>I’ve got a few quick updates to share today, so let’s dive right in.</p><h3>MIT Reality Hack</h3><p>Last month Normal got to sponsor the <a href="https://www.mitrealityhack.com/">MIT Reality Hack</a> hackathon and had an absolute blast working with everyone there. I just about cried when everyone cheered for us during the opening ceremony. We rarely get to see the developers using Normcore in person, so it was incredibly encouraging to hear how many people have used Normcore and were excited about the work we’re doing :’)</p><p>We believe in multiplayer VR because of its potential to connect users across physical distances, but we were particularly fond of a project called Duet that brings people into a shared physical space.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/569/1*VZbJdFb-T1j3phjh3g4KQA.gif" /><figcaption>Check out the <a href="https://vimeo.com/389359317">full video</a> on Vimeo.</figcaption></figure><p>The concept is gorgeous. You start off alone and, through the use of dance, you’re guided along a path illuminated only by the steps you need to take next. As you approach the end of your path, you reach out to discover your partner making contact with you. Guided by a bioluminescent light, you move through a dance together as your bodies paint brush strokes through the environment.</p><p>I think it’s easy to see how many games translate to VR, but I’m always amazed to see how people build off of the things we already do in the real world. I can’t wait to see more content like this as the space continues to grow.</p><p>We learned a lot from seeing how developers are using Normcore in practice.
We’re excited to fold all of that into future updates to the Normcore SDK :)</p><p>Congratulations to all the participants, sponsors, and organizers for hosting such an awesome hackathon! We loved it and couldn’t recommend it more highly. If you were a participant at the hackathon, you’ll be receiving your free year of Normcore Pro from the MIT Reality Hack team in the coming weeks.</p><h3>Half + Half at the Brooklyn Academy of Music</h3><p>We’re excited to announce that beginning on February 22, <a href="https://halfandhalf.fun/"><em>Half + Half</em></a> will be presented as part of the <em>Teknopolis</em> exhibit at the Brooklyn Academy of Music!</p><p><em>Teknopolis</em> is an interactive technology showcase hosted by BAM, which explores how artists are using virtual and augmented reality, projection mapping, and other tools to create user-centered interactive, multisensory installations.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*ZS1ISZ57L93jDPBYpqiFmg.png" /></figure><p>At Normal, we believe virtual reality has the ability to bring people together instead of isolating them further, but doing so requires careful thought and consideration. <em>Half + Half </em>was our first foray into creating a game that was designed from the beginning to help you connect with people you know, and also to connect with people you don’t, even for a short while.</p><p>If you haven’t had a chance to play it, we highly recommend checking out the installation! The exhibition runs from February 22 to March 8 at the Brooklyn Academy of Music. You can grab your tickets on <a href="https://www.bam.org/kids/2020/teknopolis-2020">BAM’s website</a> now! 
We hope to see you there :)</p><p>That’s it for now, until next time, follow us on Twitter and Instagram <a href="https://twitter.com/normalvr">@normalvr</a> and <a href="https://twitter.com/maxweisel">@maxweisel</a> for the latest.</p><p>Virtually Yours,<br>Max</p><p>Originally published at <a href="https://www.normalvr.com/blog/half-half-bam-mit-reality-hack/"><em>Normal</em></a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=43696569a97f" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Half + Half Starball Update]]></title>
            <link>https://normalvr.medium.com/half-half-starball-update-b952ec89eea3?source=rss-a4e5b23e6714------2</link>
            <guid isPermaLink="false">https://medium.com/p/b952ec89eea3</guid>
            <category><![CDATA[starball]]></category>
            <category><![CDATA[multiplayer]]></category>
            <category><![CDATA[half-and-half]]></category>
            <category><![CDATA[normal-vr]]></category>
            <category><![CDATA[oculus]]></category>
            <dc:creator><![CDATA[Normal VR]]></dc:creator>
            <pubDate>Tue, 24 Dec 2019 18:27:42 GMT</pubDate>
            <atom:updated>2019-12-24T18:53:30.659Z</atom:updated>
            <content:encoded><![CDATA[<p>Hey everyone,</p><p>We’ve got a quick holiday update for you. We’re making <a href="https://www.oculus.com/experiences/quest/2035353573194060/?locale=en_US">Half + Half</a> free for the holidays, and we’ve added a new game mode called Starball :)</p><h3>Starball</h3><p>With its smooth movement, serene environment, peaceful music, and the sound of wind rustling between your ears, it’s no wonder that Glider has been one of the most loved and played experiences in <em>Half + Half</em>. Knowing this, we wanted to expand the world of Glider in <em>Half + Half</em> and bring in another layer of fun and competition so you can enjoy it even more.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*vRtXfcsPe2axn1URQH_GPg.jpeg" /></figure><p>Today, we’re announcing <strong>Starball</strong>, a two-versus-two competitive, action-packed aerial soccer game. Starball takes the gliding mechanic and world we’ve all come to love and adds a new playing field in the sky for an addictive, insanely fun competitive game.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/576/1*O5lQAY3w1awgzjwG0Hssvg.gif" /></figure><p>Score by bumping the Starball into the other team’s goal, using your Glider to boost and strafe around opposing players to keep control of the ball and defend your team’s goal.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/576/1*DtAgp2T5QfFawiZsOgfxZQ.gif" /></figure><p>Matches last five minutes, and the team with the most points wins.
For a bonus point, try knocking the Starball into the small circle goal above the main goal!</p><p>Head over to Homeworld and find the new Starball door to try a match starting today!</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*UoQOz8Iyin4NFxOu.png" /></figure><h3>Rich Presence + Destinations</h3><p>Half + Half was made to be shared with others, which is why we’re excited to announce our support for Oculus’s new social APIs to help you find and meet up with your friends in Half + Half.</p><p>The Rich Presence API provides a way for your friends to see what game or application you’re running on your Oculus headset, and even what minigame or world you’re visiting. If you’re in a party with a friend on Oculus Home, head to the “Parties” tab under “Social” and search for their name. You’ll see their current status and location displayed.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/960/0*Msf3nwEgi98jf8oF.png" /></figure><p>If your friend is playing a minigame that needs more players or is an open world, you can select “Go To” to jump to their location in a shared app and join their match.</p><p>The Destinations API provides a way for you to send a URL over Facebook Messenger or another messaging platform that will allow you to launch into a world or minigame directly with your friends.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/993/0*ekEmcyLT2qSxBRWb.png" /></figure><p>All of these new social features are coming to Half + Half beginning today. Try inviting someone to play a round of Hide + Seek, or just hang out together in Swim. Just a few clicks, and you’re chilling!</p><h3>See you in Starball!</h3><p>That’s all for now. We hope we’ll catch you in the sky in a match of Starball!</p><p>Happy Holidays and Happy New Year from the team at Normal.</p><p>P.S. Let us know what you think of Starball and the new updates! 
Give us a shout on Twitter and Instagram @normalvr and @maxweisel</p><p>Learn more about Half + Half at <a href="http://halfandhalf.fun">halfandhalf.fun</a></p><p>Originally published at <a href="https://www.normalvr.com/blog/half-half-starball-update/"><em>Normal</em></a>.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=b952ec89eea3" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Half + Half Matchmaking Update]]></title>
            <link>https://normalvr.medium.com/half-half-matchmaking-update-51bb46db30d4?source=rss-a4e5b23e6714------2</link>
            <guid isPermaLink="false">https://medium.com/p/51bb46db30d4</guid>
            <category><![CDATA[half-and-half]]></category>
            <category><![CDATA[oculus-quest]]></category>
            <category><![CDATA[oculus-rift]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[matchmaking]]></category>
            <dc:creator><![CDATA[Normal VR]]></dc:creator>
            <pubDate>Fri, 18 Oct 2019 14:13:27 GMT</pubDate>
            <atom:updated>2019-10-18T15:20:03.123Z</atom:updated>
            <content:encoded><![CDATA[<p>Hey fam,</p><p>A few weeks ago we launched our multiplayer VR game Half + Half! We’re beyond thrilled with the release, and we wanted to share a few highlights as well as a few updates we’ve made today.</p><p>We made a big splash at the Oculus Connect 6 keynote. Half + Half will soon have support for Rich Presence + Destinations so your friends can see you’re in Half + Half and hop into your game.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*QgA6-1UhlHkuZJojAtRZDA.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*8uMofb7APkRwpLHE9-XRDg.png" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*hiHUeoP1JAHEA1KniMsVag.png" /></figure><p>My personal fave, though, was definitely the video montage of games that came to the Oculus Store recently. It’s wild to see just how different Half + Half is from the bulk of the VR experiences out there right now.</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FXSTDQNknvzI%3Ffeature%3Doembed&amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DXSTDQNknvzI&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FXSTDQNknvzI%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/68ece6491481d6ce17207c919a02fcf3/href">https://medium.com/media/68ece6491481d6ce17207c919a02fcf3/href</a></iframe><p>We definitely got a laugh from seeing our blobs among all of those first-person shooters. Half + Half is the first title the team at Normal would want to play in VR. And we’re not just saying that because we made it and we’re getting paid to say that ;P We’re saying it because it’s true, and we believe there are other people out there like us who are tired of more shooters &amp; violent games. We want fun, family-oriented stuff.
We want collaboration. We want positive experiences. We want things that aren’t just for gamers. Honk if you agree!</p><p>On top of the Oculus Connect love, NathieVR also gave us a glowing review on his channel here:</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FvbF939tL7Sk%3Ffeature%3Doembed&amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DvbF939tL7Sk&amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FvbF939tL7Sk%2Fhqdefault.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=youtube" width="854" height="480" frameborder="0" scrolling="no"><a href="https://medium.com/media/b474ec6632fc585cb4dbe25a3fd3915f/href">https://medium.com/media/b474ec6632fc585cb4dbe25a3fd3915f/href</a></iframe><p>We’re in love with this review and love that people understand our intentions with Half + Half. It’s what you make of the time you spend with others in the game, and it’s all about the connections you make while playing. Nice going, Nathie and team!</p><h3>Half + Half is about people</h3><p>I talk about this concept at length in my <a href="https://www.youtube.com/watch?v=pnIB1PQNVM0">fireside chat with Yelena Rachitsky at Oculus Connect 6</a>. VR is dominated by games and gaming culture right now, but I don’t believe that’s going to be the extent of what VR is capable of. Many people expected Half + Half to be a game you try to complete, but Half + Half is more about the people you hang out with than it is about completing each experience.</p><p>I prefer to think of Half + Half like going to the park. You don’t go to the park to complete it. You don’t try to sit on every bench or play on everything in the playground. No, the park is about the people and the space itself.</p><p>When we hang out with friends, we don’t hang out in an empty room. We go out. The park gives us something to calm the ADD part of our brains. 
We can people-watch and hang out, but at the end of the day, the experience comes from the people, not the park itself.</p><h3>Half + Half updates</h3><p>OK, let’s get into the good stuff. We’ve got a bunch of shiny updates for you all today!</p><h4>New Matchmaker</h4><figure><img alt="" src="https://cdn-images-1.medium.com/max/700/1*mNzG-3NP1GPYDYqRP2mogQ.gif" /></figure><p>We’ve received a lot of feedback about matchmaking in Half + Half. What good is a social experience if you can’t find anyone? Half + Half now shows you which doors have other people waiting for a match, as well as who is online and how many people are playing each of the games.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/640/1*zHJZLzCWWQmDgFPJSmTbzQ.gif" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/640/1*43VkOB66xXd-H1vg0DJGRw.gif" /></figure><h4>Shared Lobbies</h4><p>We now have shared lobbies in the Homeworld! You’ll see a handful of the other players online. If you want to play a game with someone, just gesture to the door and you can both hop in together.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*frMkHZruKpb2t0Blr5Dt-A.png" /></figure><p>While you’ll see these players in your space, we prevent anyone from getting in your personal space. If anyone gets too close, they’ll disappear and you’ll disappear for them too. We always want people to feel safe in the world of Half + Half.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/700/1*fBIZi-8pBN7uvWZYesIilg.gif" /></figure><h4>Open Worlds</h4><p>Swim and Glider were designed to be open-world experiences: places with no mechanics or rules. In the latest update, you no longer have to wait for a match to join either of these worlds.
When you enter a door, you’ll hop in instantly, and we’ll add up to 12 people before we create a new room.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*rtyFvsm-RKI9EsMfoAbGkA.jpeg" /></figure><h4>Bugfixes</h4><p>We’ve also fixed a ton of small bugs with invites and each of the experiences in Half + Half. We’ve been working closely with Oculus and have resolved issues around inviting other players too.</p><h3>Wrapping up</h3><p>I really think that we’ve created something special with Half + Half. It’s the first title of its kind, and there’s still so much for us to learn about the multiplayer VR space. If you’re enjoying Half + Half, <a href="https://www.oculus.com/experiences/quest/2035353573194060">leave us a review</a>! We’re doing our best to break the mold of what a typical VR title is. Those reviews help us out a lot.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/800/1*9_TOBNxCisTG3vPShPB4Qw.gif" /></figure><p>If you’ve had any issues with matchmaking or invites, give it another shot this weekend! I’ll be in there saying hi to people. See you all in Half + Half!</p><p>Virtually Yours,<br>Max</p><p>Follow along on Twitter <a href="https://twitter.com/MaxWeisel">@maxweisel</a> and <a href="https://twitter.com/normalvr">@normalvr</a> for the latest updates.</p><p><em>Originally published at </em><a href="https://www.normalvr.com/blog/half-half-matchmaking-update/"><em>Normal</em></a><em>.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=51bb46db30d4" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Working Remotely in VR & AR]]></title>
            <link>https://normalvr.medium.com/working-remotely-in-vr-ar-cadf467c0f39?source=rss-a4e5b23e6714------2</link>
            <guid isPermaLink="false">https://medium.com/p/cadf467c0f39</guid>
            <category><![CDATA[remote]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[collaboration]]></category>
            <category><![CDATA[remote-working]]></category>
            <category><![CDATA[augmented-reality]]></category>
            <dc:creator><![CDATA[Normal VR]]></dc:creator>
            <pubDate>Wed, 28 Aug 2019 18:47:27 GMT</pubDate>
            <atom:updated>2019-08-29T16:15:03.459Z</atom:updated>
            <content:encoded><![CDATA[<p>What’s up family!</p><p>Not too many people know this, but the entire team at Normal works remotely. Having a remote team has allowed us to hire people all over the world. However, it can make collaboration difficult. No one wants to open a VR prototype, try it out, and then spend time typing their comments into Slack. You also miss out on the organic impromptu discussions that can happen while working together in person. In my experience, the tangents we go on after trying a prototype usually lead us to our best ideas.</p><h3>Enter: Normcore</h3><p>If you haven’t heard of Normcore yet, it’s a multiplayer networking plugin that we created at Normal. Normcore allows you to easily add avatars and voice chat to any Unity VR/AR app in addition to any other multiplayer networking you might need.</p><p>If you’ve got a prototype that you’d like to share with someone, just import Normcore, add the “Realtime + VR Player” prefab to the scene, and you’re done. When your app opens up, it will automatically connect everyone to the same room and spawn simple avatars with working voice chat.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/710/1*x8YvpxhjTUMCti4DYIh97Q.gif" /></figure><p>If you have any dynamic elements in your project, Normcore offers a rich set of tools to synchronize that state between all clients. Common things like synchronizing transforms or rigidbody physics are handled automatically by our RealtimeTransform component, and <a href="http://normcore.io/documentation/guides/synchronizing-your-own-data.html">custom components can be created to synchronize everything else.</a></p><p>You can invite as many people to the room as you’d like; we’ll take care of spinning up a server in the ideal region for the group to reduce latency.</p><h3>Working Remotely</h3><p>We’ve integrated Normcore heavily into our team’s workflow.
With the ability to add multiplayer in a few minutes, we’re able to turn any prototype or design into a multiplayer space effortlessly. Whenever someone on the team makes something new, they can drop in Normcore, send a build to everyone on Slack, and get feedback from the team from within their design. Once we’re in, we don’t have to go back to Slack or email to guide each other through new features or bugs, or even sketch out visual ideas. We can do all of this inside the actual space we’re working on. The easiest comparison I can think of to help you contextualize the benefits is having the functions of Google Docs integrated directly into your website, enabling you to edit and make updates with other team members on the spot.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/819/1*P6qS5rXIiADYGgmi5BXxOg.gif" /><figcaption>Internally, we’ve created Wanda, our blobby avatar that we use across all of our projects, along with tools like brushes that we can use to sketch out ideas or annotate an existing prototype.</figcaption></figure><p>Normcore is especially helpful when it comes to working with different clients. Traditionally, we’d have to email them a prototype, wait for them to open it, try it out, and then have them email us feedback. It’s no fun and can be incredibly frustrating if there are technical difficulties along the way. But if you’re showing a client an unfinished project in multiplayer, it becomes easy to guide them through areas that require feedback or further work. You can be there with them, and they can ask questions as they review everything.</p><h3>Multiplayer Changes The Way You Think</h3><p>In the early days of Normal, I knew multiplayer was going to improve our workflow significantly, but I didn’t realize that it would change the way we think about VR design.</p><p>It’s a hard thing to really put down on paper, but there’s a big difference between trying to problem-solve on your own and with other people.
Especially if what you’re designing has yet to take shape. You shift from thinking about a VR space as another app for a single person, and start thinking about it as a space for multiple people to play, think, work, etc.</p><p>One of my favorite examples of this is <a href="https://museumor.com/">The Museum of Other Realities</a>.</p><p>If you haven’t seen it, it’s an incredible art museum in VR. Honestly, it’s still one of my favorite VR apps to date. It’s the only place where you can view artworks created entirely in VR, and its unique use of space, scale, and portals really takes advantage of what VR is great at.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/491/1*HWdsVaXbzVmiJCC_lUu7lA.gif" /></figure><p>When Colin Northway and I first talked about this project, it was originally going to be a quarterly VR magazine. The plan was to release an app each quarter with a few artworks in a museum space. The funds would go to the artists creating VR work, and a little bit of each issue would go towards funding the next.
It seemed like one of the few ways VR artists could potentially make money on their work.</p><p>You can hear Colin talk about it here in Az Balabanian’s <a href="https://researchvr.podigee.io/76-researchvr71">Research VR podcast</a>:</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fw.soundcloud.com%2Fplayer%2F%3Furl%3Dhttps%253A%252F%252Fapi.soundcloud.com%252Ftracks%252F672299858%26show_artwork%3Dtrue%26secret_token%3Ds-dafeg&amp;url=https%3A%2F%2Fsoundcloud.com%2Fuser-15906638%2Fresearchvr-podcast-514-excerpt%2Fs-dafeg&amp;image=https%3A%2F%2Fi1.sndcdn.com%2Fartworks-000588851636-vj77id-t500x500.jpg&amp;key=a19fcc184b9711e1b4764040d3dc5c07&amp;type=text%2Fhtml&amp;schema=soundcloud" width="800" height="166" frameborder="0" scrolling="no"><a href="https://medium.com/media/60fc9546ff947c8726b98ef1ce9170ee/href">https://medium.com/media/60fc9546ff947c8726b98ef1ce9170ee/href</a></iframe><p>As Colin mentions, in the beginning the goal was never to create a multiplayer space. However, after hanging out in the space with people just a few times, the multiplayer ideas started flowing. We started treating this space the same way we would a space in real life. The conversation switched from “Let me send this build to people” to “Let’s invite people over to the space to give us feedback”. Colin eventually started throwing monthly cocktail parties in VR, and it turned into the multiplayer experience it is today.</p><h3>The Future of the Workplace</h3><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*7YxcTPIQPBm_7njisc-9wA.gif" /></figure><p>Normal has been around for almost four years now. Throughout that time, I’ve worked from home. I’ve moved across the country. I’ve worked with brilliant people, some of whom I’ve never met in-person. 
Working remotely has allowed me to create a team of the brightest minds across the world.</p><p>While this workflow has been very successful for us as a VR and AR team, I fully expect this to translate to just about every industry that works on a computer. At the moment, this works for us because we can create our own tools, but I expect things will change as tools are made for every other type of computer work.</p><p>One of the many perks of working remotely is that I can live anywhere. So many people’s jobs dictate where they live. I think many people are worried about VR replacing their time in real life and people will become even more addicted to their computers. I like to think that VR will replace the time we already spend on computers. I envision a world where our residency isn’t limited to our place of work—VR could allow us to do the same things we do when we sit behind a screen for 8 hours a day, remotely, and ideally more effectively.</p><p>Anyway, I’m getting a little carried away ;P We’ve still got a while before that happens.</p><p>Want to try adding multiplayer to your own Unity app? Download Normcore <a href="https://normcore.io/">here</a>. We also have a few guides you might find useful like <a href="http://normcore.io/documentation/guides/adding-vr-ar-avatars.html">Adding AR/VR Avatars &amp; Voice Chat</a>, and <a href="http://normcore.io/documentation/guides/creating-a-multiplayer-drawing-app.html">Creating a multiplayer drawing app</a>.</p><p>Until next time</p><p>Max</p><p>Follow along on Twitter <strong>@</strong><a href="http://twitter.com/MaxWeisel"><strong>MaxWeisel</strong></a><strong> </strong>and <strong>@</strong><a href="http://twitter.com/NormalVR"><strong>NormalVR</strong></a> for the latest updates.</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=cadf467c0f39" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Using AR to see into the VR world]]></title>
            <link>https://normalvr.medium.com/using-ar-to-see-into-the-vr-world-a944bf3ef375?source=rss-a4e5b23e6714------2</link>
            <guid isPermaLink="false">https://medium.com/p/a944bf3ef375</guid>
            <category><![CDATA[augmented-reality]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[unity]]></category>
            <category><![CDATA[future]]></category>
            <category><![CDATA[ios]]></category>
            <dc:creator><![CDATA[Normal VR]]></dc:creator>
            <pubDate>Thu, 15 Aug 2019 14:28:27 GMT</pubDate>
            <atom:updated>2021-03-02T18:23:00.744Z</atom:updated>
            <content:encoded><![CDATA[<p>Yooooooo, I’ve been waiting to write this post for over two years now.</p><p>I originally got into VR because of how immersive it is. And while it can transport you to an entirely different world, it’s still a very isolating medium. In most cases, everyone else in the room is completely unaware of what the person in VR is experiencing. I believe phone AR is an excellent way to change that.</p><p>Forever ago, you may have seen this tweet that I posted from the Normal Twitter account:</p><blockquote><p>Blobbing in the studio today w/ the Vive + ARKit. Definitely some huge mixed reality potential here. 😍😍😍😍 #arkit #vr #indiedev #gamedev https://t.co/C1zANBuSrx</p></blockquote><p>In this video, I’m using a custom iPad app that can see into the VR space that Kirsty is in. This video was filmed in real time on an iPad using ARKit and <a href="https://normcore.io/">Normcore</a>.</p><p>I love this concept because it’s an app that anyone can just pick up and use. Anyone in the room can grab their iOS or Android device, and use it as a magic window to see into the VR world.</p><p>You can see what it looks like behind the scenes here:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/785/1*cBZWsBJ_hSIK1SUn-dRfXw.gif" /></figure><h3>How it works</h3><p>On paper, it’s actually pretty simple. We’re both running the same app, one on the VR headset and one on the iPad, and we’re using Normcore to synchronize the state of each copy of the app.</p><p>Unity makes it pretty easy to export one project that works across multiple platforms.
In this case, once I had the app running on iOS, all I had to do was import Unity’s ARKit plugin and use Normcore to synchronize the avatar and brush strokes.</p><p>That’s it :)</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/655/1*doGLfzaeYKvhN-kwnN6sGQ.gif" /></figure><p>With everything set up, someone on a mobile device like an iPad can launch the app, connect to the same room as someone who’s running the app in VR, and navigate around the same virtual space. By far my favorite detail is that VR players see an avatar for people in AR. For our own apps, we’ve created a cute floating iPad with a Wanda inside. VR players can see where the AR players are in the virtual space and can even talk to them using voice chat as they show them around the virtual world :)</p><p>Want to try this in your own app? We’ve written up detailed instructions in our docs on how to set this up. Check it out <a href="https://normcore.io/documentation/xr-guides/using-ar-as-a-spectator-view.html">here</a>.</p><p>Max</p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=a944bf3ef375" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Introducing Normcore: High Quality Multiplayer Networking for Unity]]></title>
            <link>https://normalvr.medium.com/introducing-normcore-high-quality-multiplayer-networking-for-unity-6a530a018912?source=rss-a4e5b23e6714------2</link>
            <guid isPermaLink="false">https://medium.com/p/6a530a018912</guid>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[augmented-reality]]></category>
            <category><![CDATA[normal]]></category>
            <category><![CDATA[multiplayer]]></category>
            <category><![CDATA[oculus]]></category>
            <dc:creator><![CDATA[Normal VR]]></dc:creator>
            <pubDate>Tue, 07 May 2019 15:31:35 GMT</pubDate>
            <atom:updated>2019-05-07T15:41:02.768Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*LDs-zbRTFcdS4prrMZO6zg.gif" /></figure><p>Yooooooooo, it’s ya boi, Max Weisel, chief blob officer here at Normal with some exciting news.</p><p><strong>Today we’re launching Normcore, a multiplayer networking plugin for Unity.</strong></p><p>If you’ve ever tried building a multiplayer game, you know it’s a lot of work. Even just getting to the point where you can pass data between two clients can be challenging. It’s so much work that many developers decide from the beginning not to create multiplayer games. We love multiplayer games and apps at Normal. Especially when it comes to VR, multiplayer turns what has the potential to be a very isolated experience into a shared one.</p><p>When we started implementing our own multiplayer titles, we realized the multiplayer aspect was going to be a lot of work: sending messages between clients, synchronizing &amp; smoothing the movement of objects, implementing voice chat, matchmaking, running servers, etc. The list piles up quickly, and there are many engineering challenges that aren’t obvious until you’re months or even years into a project.</p><p>We’ve spent the last three years working on Normcore, a Unity plug-in for our own internal use, implementing all the different pieces: state syncing, physics syncing, voice chat, persistence, fast serialization with versioning, delta compression, flow control, and much more. Through this process, we noticed a pattern: <strong>Everyone currently needs to implement each of these pieces from scratch</strong>.</p><p>We’re releasing Normcore in an effort not only to save developers time and encourage more multiplayer titles, but also to create the best multiplayer networking plugin available. Our goal is to refine and improve Normcore until it becomes so good that you wouldn’t ever dream of writing your own multiplayer networking. 
You should be spending that time on your game anyway.</p><p>Now, let me set the record straight. Normcore is far from perfect. We’ll continue to make massive improvements to it as time goes on, but we’ve hit a point where the only way to improve is to get it out in the wild so we can see where it falls short in practice.</p><p>Normcore is designed to be used for anything in Unity that you’d like to add multiplayer to (games, installations, apps, enterprise, etc). However, we’ve noticed the VR/AR community is in serious need of good multiplayer support, especially when it comes to voice chat (which we believe is paramount to achieving presence in multiplayer spaces). The <a href="https://en.wikipedia.org/wiki/Voice_over_IP">VOIP</a> community solved high-quality low-latency voice chat a long time ago, and we’ve incorporated the lessons they’ve learned into how audio works in Normcore.</p><p>We’ll be in beta for the next few months through the launch of our next title. During that time, all accounts will have a coupon applied for free hosting / bandwidth as we work out the kinks.</p><p>Want to learn more about Normcore? Head over to <a href="https://normcore.io/">https://normcore.io/</a> to create an account and download the SDK. I’ll be writing about cool things you can do with Normcore in the coming months, so keep an eye on this space to stay in the loop :)</p><p>Max</p><p><em>Originally published at </em><a href="http://normalvr.com/blog/normcore/"><em>Normal</em></a><em>.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=6a530a018912" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Oculus Quest is the only VR headset that you should care about]]></title>
            <link>https://normalvr.medium.com/oculus-quest-is-the-only-vr-headset-that-you-should-care-about-8e9af9374bde?source=rss-a4e5b23e6714------2</link>
            <guid isPermaLink="false">https://medium.com/p/8e9af9374bde</guid>
            <category><![CDATA[augmented-reality]]></category>
            <category><![CDATA[oculus]]></category>
            <category><![CDATA[normal]]></category>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[oculus-quest]]></category>
            <dc:creator><![CDATA[Normal VR]]></dc:creator>
            <pubDate>Mon, 08 Oct 2018 18:11:56 GMT</pubDate>
            <atom:updated>2021-12-15T22:33:08.624Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*PkKbvDJZhXlCa4Dw.png" /></figure><p>Hey fam,</p><p>We’ve been pretty quiet over here at Normal. We’re about to announce some cool stuff, but in the meantime, I wanted to put together a quick write-up about Oculus Quest.</p><p>Last week Oculus announced <a href="https://www.oculus.com/blog/introducing-oculus-quest-our-first-6dof-all-in-one-vr-system-launching-spring-2019/">Oculus Quest</a>. A $399 standalone VR headset with room-scale head and hands tracking. <strong>As of right this moment, I believe Oculus Quest is the only VR headset you should care about as a developer.</strong></p><p>It also happens to be the first headset I could see my parents wanting to use.</p><p>I started Normal with one singular goal, which was to build the app that would get my parents to use VR. That’s not to say I wanted to build apps for an older generation, but I knew if I found something my parents would use, it would be an elemental technology, applicable to almost everyone. I picture them buying a VR headset that they leave on their coffee table. A headset they would occasionally pick up to use this app.</p><p>I didn’t know what that app would be, but before I could start, I needed to figure out the hardware requirements. After building a few prototypes, I realized that the hardware needed to have minimum viable tracking, be priced affordably, and function standalone.</p><h4>Minimum Viable Tracking</h4><p>I define minimum viable tracking as room-scale <a href="https://en.wikipedia.org/wiki/Six_degrees_of_freedom">6dof</a> tracking for your head and both hands. It’s a level of tracking that completes the hand/eye coordination feedback loop, allowing us to take advantage of skills we’ve spent our whole lives practicing. 
</p><p>For those of you not familiar with Normal, I wrote this on our <a href="http://www.normalvr.com/about/">about page</a> two years ago. I think it captures the importance of minimum viable tracking:</p><blockquote>Virtual Reality is compelling to me because it capitalizes on our sensory and motor skills. We’ve spent our whole lives developing these skills and (up until now) our digital devices haven’t taken advantage of that. With hand controllers, we can complete that feedback loop. VR without hand controllers is like living in the real world with your hands tied behind your back. Most of us have hands and are pretty good at using them, so I think it’s fair to say VR without hands is going to be a waste. (I know not everyone will agree with this and that’s fine. You do you.) I want to build for a world where room-scale VR with hand controllers is the norm, because I believe that one day it will be.</blockquote><h4>Standalone</h4><p>If your VR headset is tethered to a large, clunky, &amp; stationary computer, then you’ve already lost. A $1500 mega-computer might be ok if you’re targeting gamers / cryptocurrency folks, but you’ll be limited to the size of that demographic, which is pretty small, and certainly doesn’t include my parents or the coffee table they’re gonna put this thing on.</p><p>A stationary computer also means that a VR user’s movements will be limited to a desk. While this works for desktop PC gamers, it’s going to be important that this device can be used anywhere in order to be accessible to everyone. I expect most people will want the mobility that a device like an iPad provides — you can stand up, sit down, or bring it with you anywhere in your house.</p><h4>Affordability</h4><p>Packing the computer part into a standalone headset presents technical problems that have predictable solutions. 
Computers get smaller and smaller over time, so to solve that problem we just need to wait. However, on top of that, we also have to consider how much utility a device provides.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*bfKEnyTRgt66gszY.png" /></figure><p>I kind of imagine a graph like this in my head. Whenever a new technology is created, it’s very expensive, and it doesn’t provide a lot of utility. However, over time, the hardware gets cheaper to manufacture, and the software gets better and provides more utility. Sometimes cost moves faster than utility or vice versa. I’d imagine if VR hardware suddenly became $1, everyone would have a VR headset even if it doesn’t do all that much. If someone made a VR app that could cure cancer, people would be willing to pay the $2000 it costs to get a room-scale VR headset, all of the sensors, and a gaming PC to go along with it.</p><p>That said, with most new hardware, that sweet spot usually happens in the middle. It’s when the hardware gets cheap enough, and the software gets good enough that the amount you spend on a headset is equal to the value it will provide. A side-effect of this is that first generation is primarily bought by early adopters/people with more expendable income, but with each new generation the price becomes more accessible and the utility increases until it’s something almost everyone can justify buying. This is when the gold-rush starts, and the market for that device explodes.</p><p>For me, Oculus Quest is the first headset with all of the necessary ingredients, and I think Oculus knows this. $399 is an insane price for the amount of technology Oculus has packed into this headset. Which leads me to believe that the only reason this headset exists at this lower price point, is because Oculus is heavily subsidizing the cost of Quest in order to hit that sweet spot between cost and utility that I mentioned above. 
Oculus Quest might end up being accessible to a large number of people who will get more value than what it actually costs to buy the product. In other words, this is the best bang for your buck.</p><h4>Why is Oculus Quest the only VR headset I should care about?</h4><p>Let’s say you could go back in time to January 2007 and get in on the app gold rush before it started. You’re going to build a ride-sharing app (<a href="https://www.susanjfowler.com/blog/2017/2/19/reflecting-on-one-very-strange-year-at-uber">hopefully with a proper HR department this time</a>). If I had to guess, you’re probably not going to target the Blackberry, Motorola RAZR, or any of those Windows Mobile 6 phones. You’ll target the iPhone. It’s the device that introduced the entire-phone-is-a-touchscreen interface paradigm to the masses, and it’s the only one you would care about — at least until Android phones entered the market in the following year. I believe Oculus Quest is the iPhone of the VR industry. The iPhone marked the beginning of the <a href="https://en.wikipedia.org/wiki/Hype_cycle">slope of enlightenment</a> for smartphones, just as I believe Oculus Quest will do the same for VR.</p><p>Minimum viable tracking, standalone functionality, and affordability put Oculus Quest into the same category as the original iPhone or original Macintosh for me. Now don’t get me wrong, both devices were less than ideal at launch. They were underpowered and had almost no 3rd party apps. I had the original iPhone, and everyone would say to me <a href="https://www.youtube.com/watch?v=eywi0h_Y5_U">“You spent $600 on a phone?! My T9 keyboard phone is faster and it has 3G!”</a></p><p>As I mentioned earlier, when a new device is introduced, most people can’t justify the hefty price tag for the amount of utility they get out of it. That is, until app developers start creating apps. 
3rd-party apps are what transformed the iPhone from just another cell phone into something we use almost everywhere and for almost everything. App developers are what increase the utility of any device.</p><p>Initially we can end up with a weird catch-22. Developers won’t make apps for a device until there are enough people using it. And people don’t want to buy this new device until it has enough apps / utility.</p><p>However, as devices become cheaper to manufacture, we hit a point where someone makes a device that has yet to change the world, but is still good enough to be bought by a large number of early adopters. Once that happens, developers start to join the ecosystem and something magnificent happens.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*6VBQ8lB0zvdSfMKcg7IglQ.png" /></figure><p>The utility of the device starts to skyrocket. Developers add utility by creating more apps, which leads more people to buy devices, which leads more developers to create even more apps; wash, rinse, repeat. Then what? The industry explodes and everyone talks about how <a href="https://thehardtimes.net/harddrive/man-with-really-cool-idea-for-game-just-needs-volunteers-to-do-coding-and-art/">they’ve got a hit app idea they need you to help them build.</a></p><figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*4VOC3CqomxDbFCCr.png" /></figure><p>Oculus Quest will be the device that the early adopters get. It’s the device that hit app developers will make apps for and it’s the headset that will kick off the infinite loop of developers making apps -&gt; more people buying devices -&gt; more people making apps -&gt; etc. Just like with VR today, people were hopeful, but skeptical, but if we could go back in time, we’d know that the iPhone was the only device worth targeting. 
I think that will be true for Oculus Quest, too.</p><p>Virtually Yours,</p><p>Max</p><p><em>Originally published at </em><a href="http://www.normalvr.com/blog/oculus-quest-is-the-only-vr-headset-that-you-should-care-about/"><em>Normal</em></a><em>.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=8e9af9374bde" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[An open-source keyboard to make your own]]></title>
            <link>https://normalvr.medium.com/an-open-source-keyboard-to-make-your-own-4f8c531bb92d?source=rss-a4e5b23e6714------2</link>
            <guid isPermaLink="false">https://medium.com/p/4f8c531bb92d</guid>
            <category><![CDATA[virtual-reality]]></category>
            <category><![CDATA[open-source]]></category>
            <category><![CDATA[cutie-keys]]></category>
            <category><![CDATA[normal]]></category>
            <category><![CDATA[drum-keyboard]]></category>
            <dc:creator><![CDATA[Normal VR]]></dc:creator>
            <pubDate>Fri, 03 Feb 2017 17:53:28 GMT</pubDate>
            <atom:updated>2017-02-03T18:41:19.325Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/0*V6q248pZUhR-rVk6.jpg" /></figure><p>A while back, my friend Robbie let me try out a keyboard his team was working on at Google. It was something they were calling the ‘drum keyboard’ and, as you might suspect, it worked similarly to a set of drums. Using two hand controllers, you could bang out letters on large keys, spelling out words to your heart’s content. Here’s what it looked like:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/675/0*HU4NrZxB2gaLj10a.gif" /></figure><p>…And it was awesome. Like really awesome.</p><p>Granted, as a drummer myself, I might be slightly biased, but I loved how natural it felt and how easy it was to pick up. So, when we heard that Google had retired that project to focus on other dreams — specifically, Daydream (ba dum, cha!) — we thought, hey, let’s pick this back up. So, we made our own.</p><iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fplayer.vimeo.com%2Fvideo%2F202270443&amp;url=https%3A%2F%2Fplayer.vimeo.com%2Fvideo%2F202270443&amp;image=http%3A%2F%2Fi.vimeocdn.com%2Fvideo%2F616260633_1280.jpg&amp;key=d04bfffea46d4aeda930ec88cc64b87c&amp;type=text%2Fhtml&amp;schema=vimeo" width="1920" height="1080" frameborder="0" scrolling="no"><a href="https://medium.com/media/f7f6d5e03770199f3d4efef4a9b6513a/href">https://medium.com/media/f7f6d5e03770199f3d4efef4a9b6513a/href</a></iframe><p>Virtual reality is still a pretty new space. It’s a bit like the wild wild west. There’s no standardized toolkit or set of rules for how things should work. It just doesn’t exist. And since we’re all building our own tools and games, we’re making a lot of this up — interfaces, buttons, you name it.</p><p>As it turns out, that can get pretty confusing. The result is a bunch of interfaces that all work just a teensy bit differently. 
Not to mention that when you spend 99% of your time designing the rest of the experience, the last thing you want to think about is details like keyboards.</p><p>We’ve seen some pretty cool drum keyboards around town; Editor VR has one, and so does Fantastic Contraption. However, we know not everyone has the resources to make their own.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/675/0*txBRas9Mrk-oNEju.gif" /></figure><p>Inevitably, we needed a keyboard for a few of our own projects, so we took some inspiration from that early Google drum keyboard and made a version all our own. We thought through how long the mallets should be, how to trigger each letter that’s hit, and the best gestures to write out things like asdfghjkl. All in all, we’re pretty pleased with how it works.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/900/0*AgS-LZcuUqrTK4Tl.gif" /></figure><figure><img alt="" src="https://cdn-images-1.medium.com/max/900/0*TZC1-KsSk_AqALvk.gif" /></figure><p>And we decided to open-source it, so you can use it, too. After all, why reinvent the wheel each time? Our hope is that by open-sourcing it, we can save other folks some time by letting them grab it, tweak it, and ultimately make more rad stuff. Maybe they’ll open-source that stuff, too, and we can all build on top of it and push this fledgling community forward.</p><p>Our keyboard may not be the end-all, be-all keyboard — after all, what feels good to us might not feel the same to you — but it’s a starting point. We’d love it if others could give it a whirl, improve upon it, and contribute back to the repository. Then we can add even more cool features like different languages, autocorrect, speech-to-text, different keyboard layouts, emoji, sounds, etc., and make this better for everyone!</p><p>So, are you a developer? Get started with the code today. We’ve put it on GitHub <a href="https://github.com/NormalVR/CutieKeys/">here</a>.</p><p>More of a Vive enthusiast? 
Grab the <a href="http://cdn.normalvr.com/blog/03-cutie-keys/CutieKeys.zip">build</a> and let us know your thoughts!</p><p>I’d like to extend a big shout-out to our friends Annlie Huang &amp; Devin Kerr for the wonderful keyboard sounds, Mike Meyer for helping us make it as cute as possible, Gus Bendinelli for directing photography, and Isaac Cohen (aka Cabbibo (aka lil’ squish)) for the background tunes. Thanks for the help all 😘😘😘😘.</p><p>Max</p><p><em>Originally published at </em><a href="http://www.normalvr.com/blog/an-open-source-keyboard-to-make-your-own/"><em>Normal</em></a><em>.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=4f8c531bb92d" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Your Couch, Now in VR!]]></title>
            <link>https://normalvr.medium.com/your-couch-now-in-vr-8b5818c70abc?source=rss-a4e5b23e6714------2</link>
            <guid isPermaLink="false">https://medium.com/p/8b5818c70abc</guid>
            <dc:creator><![CDATA[Normal VR]]></dc:creator>
            <pubDate>Wed, 11 Jan 2017 17:26:32 GMT</pubDate>
            <atom:updated>2017-01-11T17:43:35.076Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*fp2GTB5q-NfGUzY17Oxo2Q.gif" /></figure><p>We at <a href="http://normalvr.com/">Normal VR</a> have been working in VR for a while now, and have come to notice a pattern…</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/480/1*G8dPDyLcgYWUJp13vPZA9g.gif" /></figure><p>…people really enjoy watching other people humiliate themselves.</p><p>Well, yes, but not the problem we’re talking about today, which is: <strong>People keep running into physical walls when in virtual reality.</strong></p><p>Clearly, it’s a problem. As VR developers, it’s our job to make experiences as immersive as possible. However, once the experience is immersive enough that people forget about their physical surroundings, we run into problems. Literally.</p><p>The solution is not to make things <em>less</em> immersive, <strong>but to improve the relationship between the virtual and physical worlds.</strong> Before we can tackle this problem, though, it’s important to understand why it’s happening:</p><p>Two parts of our brain are responsible for processing visual stimuli: the conscious and the subconscious. The conscious part is responsible for deliberate decision making. If those actions are repeated enough, however, they’re handed off to the subconscious. When learning to play a new sport, for example, dribbling a basketball might require full concentration. Over time, however, we develop muscle memory and delegate these actions to the subconscious — effectively allowing us to focus on other parts of the game.</p><p>It’s a good thing, really. Your subconscious steps in and tackles things like walking and breathing, all the while processing visual stimuli to help you navigate and avoid obstacles, enabling you to do fantastic feats like looking at your phone while walking. 
In virtual reality, however, these obstacles aren’t rendered at all, and the boundaries that do exist are often too subtle for your subconscious to pick up. The result is a jarring collision with reality.</p><p>We’ve seen Valve approach this problem through its Chaperone system in SteamVR. When you come close to the physical boundaries of your space, a grid of blue wireframe walls fades in to make you aware of your surroundings.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/480/1*J6iL6cQA7E1NF5u96-xSYA.gif" /></figure><p>The Chaperone functions much like a sliding glass door. If your conscious brain is active, it recognizes the door. If the conscious brain gets distracted, however, it fails:</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/900/0*x430SeG83GcCXudU.gif" /></figure><p>It’s sort of like an invisible fence, the kind used to keep dogs from running away. When we’re well within the fence, those physical boundaries are completely invisible. You might even forget they’re there altogether. But, much like our canine counterparts, crossing these boundaries can result in physical pain. While dogs get a zap from the shock collar, we might run into a wall, a desk, or an <a href="https://i.imgur.com/vtfRw2k.jpg">unsuspecting roommate</a>. And, just like our canine counterparts, these painful experiences train us to fear our boundaries and avoid crossing them. This fear keeps us from being able to completely relax and enjoy being in VR. Add teleportation and all bets are off.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/480/1*J4BIEb_TtjLVErhIZkRWhw.gif" /></figure><p>So, what’s the solution? In games like Budget Cuts, we found ourselves teleporting relatively often, forcing us to pause and reorient ourselves each time. Almost every time, we’d take off the headset to find that we were in a completely different place than we previously thought. 
We realized pretty quickly that the solution to our problem wasn’t to rely more on our conscious brain, but to create a system that enabled our subconscious to do the leg work. So, we created Deluxe Chaperone 3000, essentially mapping out all the objects in a space and placing them in our virtual environment, as well.</p><h3>Max Weisel on Twitter</h3><p>Mapped my apartment so I wouldn&#39;t bump into things while working in VR. The view isn&#39;t half bad either ;) https://t.co/3ju4OUS3Cr</p><p>When all of the furniture and large objects are visible in VR, it not only keeps us from running into things, but that anxiety and fear around boundaries completely subsides. Without this anxiety, we feel comfortable spending longer periods of time in VR. Bonus: we’re able to use our real furniture! Rather than standing in the middle of an empty room, we can sit on the couch while talking to friends. Ultimately, we’re able to relax — free from the crippling fear of faceplants and embarrassing VR faux pas. Liberated to frolic freely, tethered only by our wired headset.</p><p>This applies to multiplayer VR, too. With Deluxe Chaperone 3000, you can sit across from your friends in VR, but on your own physical couches. With enough finesse, we can even line up your furniture so you can sit next to your friends. Sitting next to someone on a couch or across from a friend at a table are actions that we take for granted in the physical world, so translating this basic social element into VR is a good step towards creating comfortable virtual social spaces.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/480/1*wDAamHnXpIL6lv0JFf2N6g.gif" /></figure><p>Headsets like the Hololens already map the physical world in real time. It’s fair to assume that the Vive will eventually include a depth camera that functions similarly. While showing real physical space in VR is obviously an important step, we can go much further. 
When designing your experiences, take physical furniture into account as you build out the virtual environment. Rather than letting the two fight for the same space, you can use furniture to your advantage by creating tactile obstacles and objects in its place.</p><p><strong>Blending the virtual and physical worlds does so much more than simply solve a problem; it provides a new realm of possibilities for both the user and the developer. It enables us to turn obstacles into advantages to create more immersive worlds and better experiences.</strong></p><p>Virtual Reality should be a place that lets you go beyond your own reality, whether sitting on a couch with a friend thousands of miles away or trying your hand as a gourmet chef. You shouldn’t have to live in fear of crashing into walls, tripping over coffee tables, or just <a href="https://www.youtube.com/watch?v=CPCbnNc-Uiw">falling on your face</a>. So, we’re designing apps with this in mind. We think others should be, too.</p><p>Thanks for listening to this extended thought, and expect more to come in this space.</p><p>Here’s to breaking down walls (but not real ones),</p><p>Max</p><p><em>Originally published at </em><a href="http://www.normalvr.com/blog/your-couch-now-in-vr/"><em>Normal</em></a><em>.</em></p><img src="https://medium.com/_/stat?event=post.clientViewed&referrerSource=full_rss&postId=8b5818c70abc" width="1" height="1" alt="">]]></content:encoded>
        </item>
        <item>
            <title><![CDATA[Throwing Throw Down]]></title>
            <link>https://normalvr.medium.com/throwing-throw-down-82db3a658d4c?source=rss-a4e5b23e6714------2</link>
            <guid isPermaLink="false">https://medium.com/p/82db3a658d4c</guid>
            <dc:creator><![CDATA[Normal VR]]></dc:creator>
            <pubDate>Mon, 02 May 2016 21:17:36 GMT</pubDate>
            <atom:updated>2016-10-09T03:55:35.494Z</atom:updated>
            <content:encoded><![CDATA[<figure><img alt="" src="https://cdn-images-1.medium.com/max/1024/1*H8VnOUW94_LBeB-aNfi2tw.png" /></figure><p>Hi everyone and welcome to our blog! We are NormalVR, your neighborhood virtual reality shop.</p><p>As we delve deeper into VR, we occasionally come across obstacles and opportunities that we think are worth sharing with the VR community. We hope to write about how we navigate the unique set of problems VR presents developers and users. Today, we want to talk about something seemingly basic: throwing.</p><p>Throwing is a skill that many of us learn at an early age. It becomes intuitive and (once we reach adulthood) is something that feels inexplicably natural. We become hyper-aware of the physical sensations associated with it. As we waded into VR we noticed that this basic human ability wasn’t being translated well; throwing wasn’t consistent and didn’t feel intuitive. We won’t outline all of our failed experiments, but rather how we saw the problem and our proposed solution.</p><p>When you are throwing in VR there are many variables that need to be considered. If there is variability that’s introduced from your system, rather than the variability of your users, then throwing is going to feel frustrating and hinder your ability to develop good muscle memory for specific throws in the game. You can spend all day tweaking mass and drag on your RigidBodies in Unity to get something that feels good, but that won’t address the issue of consistency. Addressing this problem is especially crucial if throwing with accuracy is part of your core game mechanic.</p><p>SteamVR provides a velocity and angular velocity value for each controller. In our experience using these values it’s obvious that Steam spent a considerable amount of time to create something that feels natural and consistent. 
However, these values are only for the center of the controller.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/720/1*YenHwsrggwSWiav7QSjdUQ.gif" /><figcaption><a href="http://imgur.com/VloBPE9">http://imgur.com/VloBPE9</a></figcaption></figure><p>If we attempt to use them for an object that is not attached to the center of the controller, we get behavior that is consistent, but does not match the user’s expectations.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/720/1*ess_d-rBSOD7btFIfWk7nQ.gif" /><figcaption><a href="http://imgur.com/X4DR9hq">http://imgur.com/X4DR9hq</a></figcaption></figure><p>As with most solutions, the simplest route proved to be the one that gave us the best results. Given that SteamVR provides high-quality velocity and angular velocity values, ideally we should derive the values we want from them. Luckily, we can trick Unity’s RigidBodies into calculating velocity and angular velocity for any point around the controller. When implemented correctly, objects behave the way we expect them to. (It is possible to calculate your own velocity by measuring the position of points on the controller and watching them move over time, but we found it incredibly hard to get anything that would yield the same consistency as SteamVR’s values.)</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/720/1*Gi1igIzKgEvyj56xyrINOQ.gif" /><figcaption><a href="http://imgur.com/Fntw9Qw">http://imgur.com/Fntw9Qw</a></figcaption></figure><p>More importantly, the throws are consistent and accurate.</p><figure><img alt="" src="https://cdn-images-1.medium.com/max/720/1*NgrfISLFoiWvBUwqP0uJdA.gif" /><figcaption><a href="http://imgur.com/1Ljr8jx">http://imgur.com/1Ljr8jx</a></figcaption></figure><p><a href="http://posts.normalvr.com/wp-content/uploads/2016/05/HowsItGonnaWork.mp4">Brilliant! 
But how’s it gonna work?!</a></p><p>First, create an empty game object with a rigidbody that you can use to calculate the velocity at positions around the controller.</p><p><a href="https://medium.com/media/fca8135382dd879074e52c17a9c579c6/href">View the code snippet</a></p><p>Whenever you get new controller positions from SteamVR, make sure you position this game object at the center of the controller. Some people like to do this in Update(), but we find the values there are usually a frame behind. We generally prefer to register for SteamVR’s “new_poses” event and trigger our own method to update geometry. We won’t outline how to do that here, but we’ve made the code available at the end of this post.</p><p><a href="https://medium.com/media/41ebbedc0f3030c2d1c1a0c43a378b84/href">View the code snippet</a></p><p>Next, update the empty game object’s velocity and angular velocity with the values from SteamVR inside your Update() method.</p><p><a href="https://medium.com/media/a70bb213cc2e4d99800b4347ce5e3bfc/href">View the code snippet</a></p><p>That’s it! Once the trigger is released, you can use GetPointVelocity on the rigidbody to sample the velocity at any point around the controller.</p><p><a href="https://medium.com/media/bd56f01e99846c9f6d7a4bad8d0152da/href">View the code snippet</a></p><p>Grab the source on GitHub <a href="https://github.com/NormalVR/ThrowVelocity">here</a>.</p><p>Hopefully this proved to be helpful! 
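To make the steps above concrete, here is a minimal consolidated sketch of the technique, written against the SteamVR 1.x-era Unity API (SteamVR_TrackedObject, SteamVR_Controller.Input). The class and member names here are ours for illustration and are not taken from the linked repository; see the GitHub source above for the real implementation.

```csharp
using UnityEngine;

// Illustrative sketch: a rigidbody we feed SteamVR's controller velocity
// values, used purely so PhysX can do the point-velocity math for us.
public class VelocitySampler : MonoBehaviour {
    public SteamVR_TrackedObject trackedObject; // the controller to follow
    private Rigidbody sampler;                  // rigidbody used only for math

    void Awake() {
        // Empty game object with a rigidbody but no collider and no gravity,
        // so it never actually participates in the physics simulation.
        GameObject go = new GameObject("Velocity Sampler");
        sampler = go.AddComponent<Rigidbody>();
        sampler.useGravity = false;
    }

    void Update() {
        // Ideally run this from SteamVR's "new_poses" event instead;
        // Update() can be a frame behind (as noted above).
        var device = SteamVR_Controller.Input((int)trackedObject.index);

        // Park the sampler at the center of the controller and hand it
        // SteamVR's filtered velocity and angular velocity.
        sampler.transform.position = trackedObject.transform.position;
        sampler.transform.rotation = trackedObject.transform.rotation;
        sampler.velocity = device.velocity;
        sampler.angularVelocity = device.angularVelocity;
    }

    // Call this when the trigger is released: GetPointVelocity combines the
    // linear and angular velocity to give the velocity of the grip point.
    public Vector3 VelocityAtPoint(Vector3 worldPoint) {
        return sampler.GetPointVelocity(worldPoint);
    }
}
```

On release you would assign VelocityAtPoint(heldObject.transform.position) to the thrown object’s velocity (and the controller’s angular velocity to its angularVelocity, if you want spin), which is what makes off-center objects fly the way the user expects.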
We’d love to hear how this works for you and what you thought of the post; let us know in the comments. Until next time!</p><p>Max</p><p><em>Originally published at </em><a href="http://normalvr.com/throwing-throw-down/"><em>Normal VR</em></a><em>.</em></p>]]></content:encoded>
        </item>
    </channel>
</rss>