Event Recap

Event Recap: Future of Live Experiences In The Age of AR/VR

May 21
Aug 23
Alley Team
Music Performer on Stage
Photo by Austin Neill on Unsplash

What does the future of live experiences look like? In this conversation we spoke to leaders in the mixed reality space about how this technology is going beyond just AR/VR, and what the future holds for truly crafting live immersive experiences. We dove into the challenges with designing for a space that may be slow to adopt, how to build human connection in a headset environment, and what we can look forward to seeing in the near and distant future.

Mentioned:

  • 5G, Music & Emerging Fan Experiences Event - A virtual event exploring 5G, music, technology and new fan experiences now, and in a world with live events.
  • spatial.io - Most tools aren’t designed for how we naturally think, and our coworkers are spread all over the world. Spatial breaks through these limitations, transforming work from how it is, to how it should be. In the process, teams grow closer, think bigger, and accomplish things faster. To push the world forward, we need to spend more time thinking and less time traveling. And to execute the next big idea, we’ll need more space to think than a laptop or phone screen can provide. How does Spatial work? Create a lifelike avatar and work as if next to each other. Transform your room into your monitor and then fill it with information. Use all of your favorite existing tools so your workflow remains intact. And use any device, including VR/AR headsets, your desktop, or your phone to participate.
  • display.land - Displayland generates 3D captures of physical spaces using the everyday smartphone camera, empowering anyone to create shared digital spaces.
  • Wave - formerly TheWaveVR - is an entertainment technology company and creator of the world’s first multi-channel virtual entertainment platform for live concerts. The new Wave harnesses cutting-edge broadcast technology that transforms artists into digital avatars in real-time, casting them onto a virtual stage built with stunning visuals and customized interactions that fit each artist’s unique style. Our virtual concerts are called “Waves” and every Wave is live, interactive, social, and free to attend on the largest social and gaming platforms in the world.
OUR PANELISTS:
Noelle Tassey
Alley
Ben Nunez
Evercoast
Rebecca Barkin
Magic Leap
Amy LaMeyer
WXR Fund

TRANSCRIPT:

Noelle Tassey  0:00  
Welcome to all of our participants. Thank you all so much for joining us this Wednesday afternoon to talk about the future of live experiences in AR and VR. So really, really excited to be bringing this content to you guys today in partnership with Verizon 5G Labs. For those of you who don't know, I'm your moderator Noelle Tassey, I'm the CEO of Alley. At Alley, we are focused on driving innovation through community using our labs, physical communities, and digital communities. And we do this with support from all of our partners, especially Verizon. So today, we're so excited to dive into this topic, and first, we're going to go around and have our panelists all introduce themselves and share with you a little bit about their backgrounds, although I bet you have already heard of at least a few of them. Amy, do you want to take it away?

Amy LaMeyer  0:51  
Sure. So I'm a managing partner of WXR Fund. WXR Fund invests in VR, AR, and AI companies with female founders. And before that I worked for a company called Akamai Technologies; we helped to scale the Internet. And I've been angel investing in the immersive and music space for about four years.

Noelle Tassey  1:16  
Awesome, thank you. Rebecca.

Rebecca Barkin  1:19
Sure. So my name is Rebecca Barkin, and I'm Vice President of Immersive Design at Magic Leap. I have a sort of a background of entertainment and technology and working in startups that landed me in this spot. And here at Magic Leap, I have an interdisciplinary team of devs and engineers and artists, producers, strategists that really figure out how to develop content, mostly with partners in this new space.

Noelle Tassey 1:49  
Awesome. And last but not least, Ben?

Ben Nunez  1:52
Well, my name is Ben Nunez, I'm the CEO of Evercoast. Evercoast is a software platform for live and on-demand volumetric content. We provide a software cloud platform for capturing and streaming 3-D humans, objects, environments, things in full motion to deliver 3-D photorealistic holographic content. And, yeah, happy to be here.

Noelle Tassey  2:22
Awesome, so excited to have all of you here. And so to really dive into this topic, which, even before the current crisis, was incredibly top of mind: how are our live experiences going to shift into VR and AR, and how will that look in the future? And now more than ever, this is obviously top of mind for us, our partners, our community, given the current crisis, and given how hard it is to get together in person. So I'm really excited for today's conversation and for the wide range of perspectives you guys are bringing to it. So as a starting question to the entire panel, you all are obviously approaching this from very different perspectives. But can you just give us a little bit of context around where you currently see, you know, the— I guess the best application of VR and AR in terms of bringing live experiences to people remotely; whether that's the world of music, fashion, gaming, etc.?

Rebecca Barkin  3:24  
Sure, I can start with some of the— just a lot of the use cases that we've seen come through at Magic Leap. I mean, I think there's, the enterprise space, which, you know, manufacturing and training are really hot areas. That's even right now, like, you know, we're looking at like a $26 billion marketing— market for manufacturing. So I think that live training aspect is really important on the enterprise side of things. I think telemedicine with healthcare, specifically, you can definitely tell in this time, how valuable it would be to be able to sit down with a doctor across from you, and have that kind of dialogue and conversation. We've seen a lot— you know, defense— obviously, this kind of technology can allow for real-time, you know, simulations and training. And I think it's kind of the crossover pieces, e-comm is a really interesting area because eventually, it will allow— this sort of shoppable world concept will allow for kind of a seamless exploration to conversion. And then, of course, I think entertainment, I mean entertainment's where, you know, my heart is, but on the B2B side, we have a lot of work to do to really be able to enable great entertainment experiences.

Amy LaMeyer  4:41  
I think a couple of other places where we've seen growth lately is kind of a takeoff of entertainment, almost kind of health, right? Like taking Beat Saber and Supernatural and YUR to another level and kind of incorporating how is it to use virtual reality to work out— exercise, particularly in this pandemic climate. And then remote work, productivity apps, and virtual venues, I think, is another place that's really picking up a lot.

Ben Nunez  5:14
Yeah, we've seen just over the past couple months, we've seen— and Noelle, you kind of kicked this off by saying we've already had— we've had these trends in place, in motion for a long time now, and I think what we're seeing is that the state of affairs in the world is greatly accelerating that. And that dollars are shifting in general from, you know, in-real-world, in-real-life to virtual experiences. And you know, you don't want to be too COVID-chic, if you will— you know, this isn't a temporary thing, this is a long game. And so our mindset is that really, truly across every industry, we're going to see, you know, transitions from, you know, flat, non-immersive 2-D content to, you know, fully-immersive 3-D worlds. And that's going to take a time— take time for us to get from where we are today doing, you know, virtual conferences and virtual events, you know, over Zoom, as great as it might be, to, you know, true telepresence, with far more immersion than what we're going to get today. I think we're seeing, particularly in the near-term, a lot of, you know, replacement of events. You know, we were chatting earlier about fashion week in the fall, and product launches where, you know, a huge large company wants to release a product— how do they do that now? They used to bring, you know, thousands of people or 150 members of the press to, you know, a central location and demo the product and showcase them and let them, you know, talk about it. That's different now; they're not gonna be able to do that. So how do you replace all of that? And I think we're just kind of seeing it. Rebecca mentioned medicine— we're seeing a lot of interest in telemedicine and virtual education. So really, I think it kind of touches a lot of different— pretty much every sector.

Noelle Tassey  7:10
So, you know, that's— obviously, I think for a lot of people joining today, and certainly for me, that's one of the really exciting— you know, the exciting promise of this technology is the way that it'll touch— I mean, I think we just listed probably 15 different distinct applications of this tech, and that's only scratching the surface. What do you guys see as, like, the low-hanging fruit in the short-term for how people will start to see this, you know, in their lives, especially again, as we are all working to bring live experiences digital?

Rebecca Barkin  7:46  
I mean, I think you're starting to see— you need both right? You need like the emotional appeal like the "Wow", and you also need some really practical applications and you know, some really simple kind of ideas about heads-up navigation, e-commerce you know, implementations I think will work really well and be kind of low-hanging fruit. But, the marketing and advertising industry I think is always a bit progressive in this sense because they have a desire really to push. And so you'll start to see like kind of the simpler things execute, you know, using things like image tracking and whatnot where, you know, you can start to layer on virtual elements onto social assets or onto, you know, live video streams and things like that. Those are kind of the simple introductions in education that I think we need in the market to be able to really start to familiarize people, take the intimidation out of it while the tech catches up in the background. That is the infrastructure that actually can support the big "Wow" moments across entertainment and other areas.

Ben Nunez  9:00
Yeah, I think— I'll jump in on that. I think a lot of the answer to that question depends on interfaces— how are you going to consume this content, you know, immersive content? And I think, you know, Magic Leap's working hard to make something that we can all wear that really does put us in places or augments our world around us that doesn't require me to use a mobile phone. And, you know, can, you know, other devices— you know, is it a head-mounted display? Is it a mobile phone? Is it a Looking Glass? Or just the 2-D screens that we all have in front of us now— can they serve as gateways into, you know, more immersive content? I think a lot of the use cases will depend on, you know, who the audience is and what environment they're in. Rebecca was talking about manufacturing— we see a lot of interest in changing the manufacturing line, and logistics, and shipping, and, you know, in those types of environments wearing a headset is completely different than, you know, for consumer— you know, consumer at-home, where we just don't have a lot of that adoption yet. And so, you know, are we going to be able to recreate that feeling of, you know, being at a concert, in the crowd, dancing, you know, when you're at home? To some extent we will; I think there's a timing question around when we can actually, you know, start to achieve some of those things. And I think we sort of walk towards those things and, you know, move as fast as we can, but ultimately, you know, reaching as many people as we can using the devices that we have is, you know, a mindset that many of us need to have.

Rebecca Barkin  10:45
But some of the simple— like, some of the things that seem really simple that I think are going to have really strong practical appeal require, like pretty strong object recognition. And that is something that, you know, I know at Magic Leap we're working really hard on but I think like there's certain pieces that still need to catch up to enable a lot of those ones that seemed like easy low-hanging fruit.

Ben Nunez 11:11  
Yeah.

Noelle Tassey  11:12  
I think Ben, I'm so glad you mentioned concerts because I think this is something that everybody on this panel has like a very uniquely strong point of view on is kind of how this will— this technology will interact with the music industry. And you know, it's obviously an industry that is heavily reliant on live events and getting people together in-person. So, you know, going a little bit further out into the future what does the future of concerts look like? What can we expect we'll still be seeing in the future? How will the definition of a live concert shift? You know, I think— truly every single person here has like recently worked on in some way something that's incredibly relevant to this. It's really one of my favorite topics in the world of live experiences. Whoever wants to jump in first.

Amy LaMeyer  12:02  
I think in the future— yeah, in the future it's going to be— music and concerts and artist interaction will be anywhere, at any time, in an immersive or a digital-on-top-of-reality setting. Like it's going to be— it'll be anything. There'll be so much creativity, it will be things we can't even— we can't even guess at right now. But in between now and then there are some really interesting things that are happening, even just in the past two months since quarantine. Real Estate, for example— the band— put out "Quarantour," which is a fully-generated augmented reality concert that you can put on your countertop, on your table, and watch them. You know, Pharos AR has done some really interesting things with Childish Gambino— again, using a phone, without even, you know, being dependent on any sort of hardware that you don't already have in your home. Using a phone to jump— walk into a portal and interact with cave art that, when you touch it, dances with you. I mean, you know, just anything from your basic just watching a band play to actually interacting with that piece of music. Those are some of the things I think are exciting now, that can happen over the next few years.

Ben Nunez 13:25  
Yeah, I mean, I think—

Amy LaMeyer 13:27  
And there are many more.

Rebecca Barkin 13:28  
Yeah, the obvious— Amy and I definitely— I know we're music nuts, both of us. But I think in general, like, the experience economy on the whole— of which music and entertainment is a part— was kind of caught off-guard by 2020. And they will never be caught in that situation again. And you had sort of these two paths— you had, like, the live event business, which sort of— had its way of running, but was very much focused on the in-venue physical aspect of it. And then you have these virtual worlds that were very much focused on creating sort of virtual environments and all the back-end and infrastructure you need to be able to do that well. But what will emerge when people go back to, you know, whatever the new normal is, is a new space that's right in between those two that makes it easy to do a live, you know, AR-enhanced event experience or concert that also sort of is able to auto-detect different platforms and be able to deploy that synchronously to different types of, you know, to various virtual platforms, whether that's AR or VR. And use things like planar detection to, like, slap the DJ booth down, or whatever it is. So right now a lot of the struggle in doing a really awesome concert— because, as we've talked about before, music is— it's so rich as a subject matter on its own, you have to be really good to add to it— is things like master show control integration, which is being able to respond to the real-time lighting and sound design that makes those live concerts so phenomenal. So there's a lot of work being done. I think it's a super hot space of development. But I think those are some of the ways you'll see it change in the future— there'll be this sort of convergence of those two spaces. Right now they're very separate.

Ben Nunez  15:19  
Yeah, I a hundred percent agree on that. I think— I went to school in New Orleans, and there's nothing that's going to replace me being able to go down to Jazz Fest, which we missed a couple of weeks ago. But you know, I think there are different ways you can look at content now with music and that being able to bring streaming the stuff into the game engine and take advantage of, you know, what may seem like game-like effects, creating virtual worlds, being able to manipulate the artist, being able to shade and light the artists in different ways. Being able to combine, you know, different artists who might not be in the same location— you always have that special guest that appears on stage, well, a special guest can be from anywhere in the world now that can you know merge into a single experience. And so that changes sort of the way we, you know more mind-blowingly consume content. We saw what Fortnite and Epic did with you know with Travis Scott and trying to create those types of you know those types of worlds and those types of interactions that make it feel real-ish but also in a way that's really different and not something like you're gonna have outside at a festival and listening to a DJ.

Noelle Tassey 16:36  
Yeah, I'm so glad you brought up that Travis Scott concert because that— we got a ton of questions about that. Amy, over to you next. But I'd also love just to know everyone's thoughts on, like, what that would have looked like in a world where, you know, VR/AR adoption was more widespread?

Amy LaMeyer  16:53
Yeah, just one thing to add to the last statement is not only can they bring back any— you know, anyone in the world, but it can be anyone that isn't even alive, right? Like that's the area— that's the level of content that we could have in the future. Or that we, you know, we are having— Tupac came back to Coachella a few years ago.

Rebecca Barkin 17:12  
He set the standard for like, what a hologram should look like. And now we're all trying to catch up desperately because the ball cap is like— (inaudible) but like, dammit, that one hologram just like nailed it so fast.

Unknown Speaker 17:26  
(inaudible)

Amy LaMeyer  17:31  
That's great. I mean, Travis Scott was great, too. I actually was in it as a character in Fortnite and experienced it both just watching it on video and as being a character. And also obviously, as a VR person imagine what it would be if I was in an immersive context. But it was beautiful, and it was fun, and it was interactive. And there were a lot of— I mean, I understand why 48 million people watched it.

Ben Nunez  17:55
Yeah, I think where that goes, you know— the effort to create something like that, the amount of money that is required to create something like that— making it something that doesn't require Fortnite money to be— (inaudible) it's something we're trying to do. But also to achieve more photorealism, you know, sort of blend between what might be super CG and augmented to something that's really real— and, you know, the artist in their raw form. And I think, you know, these types of immersive technologies open the doors, you know, Amy, to kind of what you were saying— just all of it— take just someone up on stage, you know, strumming a guitar; you know, they can be anywhere doing anything, we can, you know, add all kinds of effects and do cool things that, you know, you're not going to be able to actually see or get while you're actually at the show, so it can be better.

Noelle Tassey  18:53
Yeah, awesome. And I think the other question— Ben, you mentioned sort of the gaming effects, and then we're obviously— we're talking— Fortnite's a video game. And it feels like in general gaming has been just so much at the vanguard of this entire movement, and has really influenced the way that people think about this technology and the futures that they imagine for it. But Rebecca, I know that at Magic Leap you guys are exploring a lot of alternative ways of eliciting emotional responses in VR/AR, right? Like creating that feeling of connection with people that you're alongside at an event— how do you represent that in different ways? Like what do you— what do you think some of the— and this is for all three of you— the really interesting ways in which we'll take in-person experiences and translate them, in a non-literal sense, into VR and AR?

Rebecca Barkin  19:46
Yeah, I mean, I think some of the early explorations were, you know— where you basically kind of— you were like a head on a stick that was seated in the— and Amy's smiling because we've talked about this before, but like you're kind of a head on a stick that's like looking around you. And you can see an audience and you can, like, you know, transport and be like on the stage side. And those were really necessary to, like, start to bring up the subject and push it. But one of the things we haven't really figured out how to do well yet is to represent human presence— other people's human presence— in an environment like that, where it almost isn't distracting, it sort of adds to the euphoria that comes really naturally with a live performance and an exchange of human energy. So we're doing a lot of explorations around, like, how do you actually reflect— like, how do you invite people, create a personal space? On the back-end we know there's spatial partitioning happening, and you're moving people into, you know, 150 people in this room, which the consumer can't see. But can you invite someone from your contact list and say, you know, join me for a live show right now, and everybody shows up into that space, and maybe they're represented by this kind of particle effect. And maybe there's some new kind of effect that, when you guys interact with each other, you create this original or some composition aspect you can do together. I think interaction, like, still needs so much more exploration in the space— like, it is still a pretty serious barrier that everybody's working on. But once we figure out— we kind of have to let go of the idea of replacing a human just by putting an avatar next to me, and think about what represents human energy and the euphoria of a live experience and an exchange of energy, visually. 
And sound, you know, spatial audio, and try to come at it from that angle where it's new, you're solving a different— a different problem, I guess.

Ben Nunez 21:49
And even ultimately, haptics. This t-shirt that I'm wearing is something that allows me to, you know— someone taps me on my shoulder and they can appear in front of my glasses or on my screen. That there are multiple dimensions to how you recreate that immersion and that feeling, and that sensory— those stimulations. And so it's kind of exciting to see it across everything from visual, to audio, to haptics.

Noelle Tassey  22:21
So I'm so glad Ben, you brought up the idea of the non-visual inputs, right? Because we tend to be very headset-focused in these conversations and very much about the visual experience and then the auditory experience, then we leave out things like haptics or, you know, certain other kinds of spatial augmentation, if you will, or other sensory inputs. What's like the most interesting thing you've all been seeing recently, you know, just something kind of that blew your mind in terms of where this could be going and like other ways to integrate, you know, a truly immersive experience?

Rebecca Barkin 22:57
I mean, I think, personally, spatial audio is under-utilized. It's sort of— kind of— it's a bit of an afterthought. But it can be really, really powerful as sort of a narrative tool to direct people's attention, and also just really mimic the human experience of how sound actually wraps around you. And I agree on haptics 100%. Like, we went to see a company recently to try out their tech. And you just held, like, a pen, and basically they give you an iPad with three different textures on top of it. One of them was like a tree that had been cut in half, the other was sand, the other was like a mesh sort of fence. And you just sort of ran it over those pictures— those 2-D pictures— and the haptic sensation that it created through your whole hand was one of, like, running your hand through sand or running your hand across it. I mean, it was remarkable. And if we can get that down to the— sort of little electrodes that you can sort of just place— there's a lot of research now that if you just place them, you know, on maybe two parts of the hand, you can send sensations all the way through the arm. So there's some really, really exciting development to be done in haptics.

Amy LaMeyer 24:14  
Yeah, I'm definitely looking forward to the move away from the controllers and into hand tracking as well. I just think that's a lot more natural for a lot of people. And so I think that'll help the experience. I mean, I'm not worried about which button to push.

Rebecca Barkin 24:27
Totally. And that in part comes from like a history of being connected I think with gaming, you know, where there's a— they're used to a lot of different buttons and gesture— you can get really exhausted if you try to make people use their arms too much. But I think that there's something really powerful and like cross-modal that we haven't tapped yet.

Ben Nunez 24:51
The— there are a lot of things that are happening that will make all of that easier. From, you know, the calculations and the technology, the vision, the sort of economies of scale we're seeing with depth sensors that are, you know, in our phones now, and, you know, they're small and cheap— and, you know, $150 RealSense cameras that, you know, can enable us— and, you know, vision libraries that let us very easily detect a hand, detect the skeleton, a software skeleton. And enable— you know, it's not just about the user's consumption of the content, it's about how they are perceived, you know, using that content as well, and how their body's reacting to it. So whether it's through gestures or in other modes of sort of reading people's emotions, or, you know— and then layering on other types of sensor data. Thermal sensor data, you know, being able to just detect other sensory inputs and morph all of them together into an experience across a lot of different use cases is, I think, what will— what I'm particularly excited about. It's not just one.

Noelle Tassey  26:06
So we got a lot of questions in on the event form about different upcoming live events, on a very wide timescale, and I'd love to just hear everybody's thoughts on kind of what each one of these events will look like as we zoom forward at different kinds of intervals. So we have some people writing in about fashion week, which Ben, I know, is something that you've been involved with, as well as, I think, everyone else here. The Tokyo Olympics, which will now be next summer, where we're likely to have some variety of live events, but, you know, who knows how many people will really be there. So, you know, we're still expecting augmented reality in some way to be a major part of how people experience that. All the way forward to the 2026 World Cup. So kind of stopping at each of those stations— a few months out, a year out, a few years out— you know, how do we see this more and more becoming integrated into the core of those experiences?

Rebecca Barkin  27:01
Ben, do you want to start with fashion week? I can talk about the Olympics, unfortunately, but—

Amy LaMeyer 27:07
And I've got World Cup, so—

Ben Nunez  27:09
All right, well, I'm not a fashion guy, (inaudible) t-shirt, but— I'll do my best to explain, you know, what I'm seeing. A lot of interesting— you know, fashion week in New York has been, you know, a staple of New York for so long, and in Paris and other parts of the world. And, you know, it's not going to happen the same way anymore. And that may change, you know, for, you know, for a long time, but certainly come this Fall. And so how are big— you know, fashion designers going to— and I mentioned this earlier, this applies to fashion, but it also applies to really any product launch. In a lot of ways fashion is a, you know, is a product launch; it's announcing new lines of clothing. And, you know, it spawns and spurs people's interest in, you know, in that brand. And, you know, how do you— how do you do that? How do you replicate that? And, you know, so we're, you know, figuring out how to— you know, there's obviously the techniques of using a treadmill to replicate, you know, someone walking down a runway, having little, you know, staircases that someone can kind of walk up and down; you've got to figure out timing, how you can loop them, and recreate sort of what a, you know, a model might be doing when he or she walks down, you know, a runway or otherwise, you know, demonstrates that— you know, sort of showcases their clothing. But you've got a lot of dynamics there about how, you know, physics come into play with, you know, their body— and when, you know, someone twirls in a dress, how does that dress actually, you know, appropriately, given the, you know, the gravity in the world we live in, you know, how does that actually flow? You know, and so there's— and then ultimately, how do the people who have just typically gone to Fashion Week and been a part of these events and the parties and everything experience that? 
And, you know, that gets to where you want to be able to reach still lots of people— it's a somewhat exclusive event. So you can do things in headsets. But, you know, a lot of that crowd doesn't have headsets. And so how do you create something that's, you know, immersive enough that it's not just on my mobile phone, but, you know, not so exclusive that everyone has to have a headset? And that's a lot of the things that we're trying to figure out and working with people on. And then, you know, how does commerce come into play there? You know, when people aren't necessarily going to be there, they can be there in front of an electronic device that can enable very quick purchase. And there might be certain things that you see on the runway that no one ever wants to wear in public, but there are a lot of other things that that stuff drives. It drives a lot of purchasing, and how do you connect, you know, e-commerce systems to those types of immersive experiences?

Noelle Tassey  30:00  
This is a great time to be buying clothing you won't wear in public too. So— you know, for now.

Rebecca Barkin  30:05
Yeah, like, so— we actually have, you know, we were working on something for the Olympics for the last couple of years. And, I think, you know— it's not only just, you know, the kind of the heartache of it moving out, especially, you know, if we have any kind of developers in the room or anybody that's worked on a project for a few years, it's like, Whoa. So the heartache of that being moved out. But, it was also the challenge of then even doing build reviews, right? Which is really common to do together, that we had to sort of figure out how to do remote build reviews together, and then, you know, how do we do remote build reviews internationally? And how do you— so it pushed us in a lot of ways in the near-term. And again, like, I think— I hesitate to say that like a year from now, we're going to be full-on, like, you know, a virtual version of the live event. But I do think that you're going to see people start— again like they're just unwilling to be caught in the position they were in this year. So you'll start to see people build into the experiences that they're doing live, which may only be able to hit 30% capacity. Some capability to capture that event in real-time, and broadcast it across several different mediums even if it's not AR or VR, maybe it's just VR. Or maybe it's mobile, some ability to do live, you know, real-time compositing of 3-D elements into some socially broadcastable piece. Which is still really powerful education, it still moves the market forward. But I think that's— what you'll really start to see is taking elements of a live experience that can't quite hit capacity and therefore the economics don't quite work like they used to, being pulled into different types of platforms, whether it's like, you know, web or live streaming or social or VR. They'll be looking for ways to build that infrastructure so this doesn't happen again, this sort of gap where they're completely disconnected.

Amy LaMeyer  30:09
Yeah, and in terms of the World Cup, as you look at arena-based experiences today, you're already seeing companies like Immersive doing augmented reality overlays on top of, you know, the game that you're watching— or no one's watching them now, but, you know, if there were games in an arena. And you're seeing things like augmented reality additions to the Chainsmokers' songs at their concerts— actually, that was a Verizon 5G Labs event last Fall. But over time, I think what you'll see is just the growth of that. And obviously, the switch from a phone to eyewear, smart glasses. But I hope there will also be, in addition, that concept of being able to share that with people that aren't at the event. So I think you'll see two things: one, the augmentation of the event that you're at physically, and two, how do you bring that to people that aren't there? And hopefully, there's merch sales— making it easier to get that money to the artists and the teams and—

Ben Nunez 33:33  
I think you hit a great point, though, that even for venues that maybe want to sell, you know, premium tickets for an enhanced AR experience— I think that's one way you'll see it rolled out, because the truth is, there's still a logistical challenge to putting a bunch of headsets on the heads of people who have never tried them before. And even the device-management aspect of that is kind of a nightmare right now. But as those foundational tools start to mature, I think you'll probably see limited rollouts in venues of enhanced experiences that people get if they wear the headset. So you get the status to share it on social, but you don't quite have the logistical challenge of trying to put, you know, 500 headsets on people's heads and know if they've lost headphones, or know if their controller's not working or— so—

Ben Nunez  34:25  
And Noelle, I— you know, the 5G Lab, the Verizon team just won a Webby— we were part of this as well— for the Thanksgiving Day Parade. Even pre-COVID, you know, there are a lot of these examples where people were just trying to do virtual events and sort of mesh technologies together. And what Amy mentioned with the Chainsmokers was in the Fall. All of this was pre-COVID, and there are so many things we can learn from successes in the past. With the Thanksgiving Day Parade, they live-streamed a 360° video that we took— you know, volumetric capture— where we figured out how to insert volumetric captures into the live stream, into the 360° video. They had 10 million people stream that. And yeah, it was a great success. And so, you know, you just learn from those things, you execute them, you do them, and then you learn and you take that into something that's going to be different down the road.

Noelle Tassey  35:20  
And I— by the way, I loved that experience, it was really, really cool. I did it. I'm a big fan of the Thanksgiving Day Parade. It was awesome. So on the topic of things that have already been done like that— I guess what I'm saying is, you take live experiences, even pre-COVID, and bring them into some sort of AR or VR setting. We still see a lot of challenges to adoption, right? Like, these are expensive, they're difficult to produce, they're difficult to distribute. We've touched on a lot of those challenges just sort of in passing, but what really are the biggest hurdles in the next few years in terms of getting this technology into the hands of more people— getting more people to engage with it beyond just the early adopter community?

Amy LaMeyer  36:12  
So I can talk about it at a high level, and then I'm sure you guys can go into a little bit more detail of how things are made and why it's hard. At a high level, I look at three things that will enable mass usage of this content: cost, comfort, and content. The cost of the headsets has to be at the point where people can afford them— you know, we're finally getting there with something like the Oculus Quest being less than the cost of a phone. But it really has to be comfortable and easy to use— as easy to use as the phone— and we are not there yet. And then there needs to be content that's appealing, to drive people to use it. So that's what I think about when I think about when we'll get to mass usage.

Rebecca Barkin  37:01
Yeah, I mean, that's obviously 100% right— on all fronts, that nails it. I think on the back-end side of things, we're still fairly fragmented on the tech side. But you're starting to see a lot of progress on 3-D and mobile mapping, and some of the infrastructure-level things that will create these digital maps and spatial publishing of content into those maps, where it lives persistently and you can play with it all the time— that sort of metaverse theme that was, you know, coined a long time ago by Neal Stephenson. But remote rendering, and then sort of advanced graphics processing, is going to be really critical. Volumetric— I mean, Ben, I'm sure you could go on about this for ages— but definitely that process of volumetric capture, to compression, to real-time transmission and broadcast, and being able to composite on broadcasts in real-time. So you have that commoditized TV pipeline that we don't have yet for broadcasting any of this stuff. And I think interaction. I'm obsessive about input, but the truth is, when you nail really intuitive, simple interaction, it takes all the intimidation out of technology, and it makes it feel mainstream. And right now, none of this feels mainstream.

Amy LaMeyer  38:35  
It's something— it's taking something digital and making it feel human.

Rebecca Barkin  38:39
Right. Right. And we've had to be— I think it's a necessary phase— we've had to be so focused on the engineering aspect, because there are so many technical hurdles to get through, that the human-centered part of it, the maturity that's required underneath all of that to enable human-centered design, hasn't really been there yet, right? Like you need the engineering to hit a certain level of maturity so that the sophistication is there to make really beautiful and meaningful interaction. So I think that will open things up too.

Ben Nunez  39:12  
Yeah, I think— dovetailing on what you said about remote rendering— I think 5G, edge compute, fiber into the home, these types of core infrastructure things matter, so that I don't need a $10,000 machine in my home— or in a factory, or wherever you might be installing computer vision or machine learning or immersive technologies. You can leverage little things like a Jetson Nano, or any number of micro compute devices, and leverage edge compute on-site. Or, when you have a low-latency, high-bandwidth network like 5G, you can leverage resources at an edge compute center six miles down the road versus having to go to an AWS data center 200 miles away— you can actually render this stuff in real-time. The compression— and really the novel formats and ways in which you capture this data, reconstruct it, and then stream it— being able to compress this and stream it over the low end of 5G, or your home Wi-Fi network, is critical to being able to do that. And, you know, we can do that, but we need more of it. We need more capabilities to not just create the content— you know, game engine developers, Unity developers, Unreal developers, being able to leverage pixel streaming in Unreal, or the real-time VFX Graph in Unity— but to take those worlds and that data and be able to compress it and stream it down to devices, whatever that device is. That's critical to completing the end-to-end chain, where we have costs, we have content, we have the devices— all within reach for everybody in the chain.
Rebecca, you mentioned the pipelines for just broadcast video— being able to do even just a live stream of anything face-to-face, whether it's FaceTime or Anderson Cooper on CNN. Those pipelines are very refined and have been for a long time, but they're advancing too. And how do you layer in augmented reality, XR, volumetric— how do all those things come into play? Those pipelines need to advance as well.
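To put rough numbers on the edge-versus-cloud distinction described above: the distances (six miles, 200 miles) are the ones mentioned in the conversation, but the fiber-speed figure and the Python sketch below are illustrative assumptions, not figures from the panel.

```python
# Rough lower bound on network round-trip time (RTT) from propagation
# distance alone, assuming signals travel through fiber at about 2/3 the
# speed of light (~200 km per millisecond). Real RTTs add routing,
# queuing, and processing delays on top of this.

FIBER_KM_PER_MS = 200.0  # ~2/3 c, a common rule of thumb for fiber

def propagation_rtt_ms(one_way_km: float) -> float:
    """Round-trip propagation delay in milliseconds."""
    return 2 * one_way_km / FIBER_KM_PER_MS

# Edge compute center ~6 miles (~10 km) away vs. a distant
# data center ~200 miles (~320 km) away.
edge_rtt = propagation_rtt_ms(10)
cloud_rtt = propagation_rtt_ms(320)

print(f"edge:  {edge_rtt:.2f} ms")   # 0.10 ms
print(f"cloud: {cloud_rtt:.2f} ms")  # 3.20 ms
```

Propagation alone is small in both cases; in practice, the win from edge compute comes from cutting out extra routing hops and keeping heavy rendering physically close to the device, which is what makes real-time remote rendering plausible.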

Rebecca Barkin  41:43
Yeah, I mean, latency is huge— being able to reduce latency. Because the truth is, these devices, when they're at their best, are mimicking— or extending— a human's ability to see, feel, and hear, you know? So if that's not aligned perfectly— if you've ever tried to touch something in augmented reality or VR, and your hand appears to go through it, and then when your hand's halfway through it, it pops— it's a very disorienting, unsatisfying experience. And ultimately, we need that level of sophistication all the way around in order for the content to be compelling enough to move people.

Ben Nunez  42:30  
Yeah, the device's chip needs to have low latency all the way through to the network, to multi-camera registration and streaming— all of that. That latency, for true real-time, two-way interaction, needs to be around 150 milliseconds. And all of that is really hard to achieve. We're getting there. But I think that first wave of everything we see will be very much a one-to-many kind of broadcast, where the latency— Twitch is, what, a 45-second delay? Those types of delays are way more acceptable when you aren't actually interacting. So I think that first wave, you know— we'll see a lot more in the way of one-to-many broadcast versus true two-way, low-latency sync.
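The ~150 ms two-way threshold mentioned above can be read as an end-to-end budget across the stages listed (chip, network, multi-camera registration, streaming). The stage names and numbers in this sketch are illustrative assumptions, not measured figures from any real system:

```python
# Hypothetical end-to-end latency budget for two-way, real-time
# immersive streaming. Each stage's number is an illustrative guess;
# the point is that all stages must *sum* to under the ~150 ms target.

TARGET_MS = 150

budget_ms = {
    "capture + multi-camera sync": 33,   # roughly one frame at 30 fps
    "reconstruction + compression": 40,
    "network round trip (5G/edge)": 30,
    "decode + render": 25,
    "display scan-out": 11,
}

total = sum(budget_ms.values())
for stage, ms in budget_ms.items():
    print(f"{stage}: {ms} ms")
print(f"total: {total} ms (target: {TARGET_MS} ms)")
```

By contrast, a one-to-many broadcast can spend tens of seconds in buffering and transcoding, which is why that path is so much easier to ship today.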

Noelle Tassey 43:23  
So, we've talked so much about these much more advanced challenges that come with creating experiences in VR— whether you're being held back by network issues, or there's just a really lengthy development process required to, let's say, bring the Olympics to your headset. We've gotten a lot of questions, I think, from people who are, frankly, probably sick of being on Zoom— so sorry, for those of you who are sick of being on Zoom, but you're here— about how to incorporate this now. You know, what can I do now? I'm not a developer, I don't have a big development team. Can I host an event in Altspace? What platforms can I start to use in lieu of teleconferencing in certain cases? Or where can I have my meetup? What can people do now to start exploring this technology, you know, if you're an early adopter?

Amy LaMeyer  44:15  
This is a great time for me to talk about the event I'm doing next week with Verizon 5G Labs. We're going to talk about 5G, music, and emerging fan experiences. So if you're interested in this, you might be interested in that. It's in Mozilla Hubs, and, you know, that's accessible by browser— so it's accessible by any device: headsets, or mobile phones, or laptops. So Google that, or follow me on Twitter, and there'll be a link to it. Meetups like those are, I think, the easier and most successful things right now. Reggie Watts does 5 pm Pacific Time on Saturdays— who knows for how many weeks— where he gets together and interacts with whoever's in the room and plays music and talks. And that's really fun. So I would say, just start experiencing things first and then start to build. If you're just in the beginning stages, start to look at how other people are interacting in spaces like this, and that'll help inform your own creation process.

Rebecca Barkin  45:26
Yeah, there are some apps, too, that we use— like, I have our weekly meetings in them. There are a couple of apps that Magic Leap is working on. But also, Spatial is one that we've used, where, you know, the setup is pretty simple. You're able to create a 3-D avatar of yourself, basically, by taking a picture with the camera on your computer— they put it onto a 3-D avatar, and then you can go into an environment on your device and have other full, you know— not full, that's not right— but you can have other volumetric representations of your friends that look like your friends or your colleagues. And you're in a shared space together— you sort of appear in a shared space no matter what room you're in. And you can even put tools on the wall and work on documents together, and 3-D objects and whatnot. So there are some tools like that out there— I think that one is both on HoloLens and Magic Leap at this point. But, you know, obviously there's the trouble of having a HoloLens or a Magic Leap, so— But there are some tools out there, and I'm sure there are videos and stuff that demonstrate what those are and the value they can bring to holographic meetings.

Ben Nunez 46:46
Yeah, Spatial actually recently made it free, and, you know, they have web— again, a different interface. I echo everything Rebecca just mentioned, but also look on the web. There are browser-based— just browser-based, on your mobile phone or on your computer— interfaces that let you consume 3-D content. And yeah, I was just on the phone with some guy doing a walkthrough of a new building where they want to put a volumetric capture stage. And they just had me— it was just a YouTube video— a data pipeline where they took a bunch of CAD renderings and textures, ingested them into Unreal, created camera paths through all of this stuff, and just completely virtually walked me through it on this computer, on a Mac. And so, whether that's a live event or a newer, very specific business case like I just described, there's a lot of browser-based stuff— you know, if you don't have a Magic Leap, if you don't have a HoloLens, or you don't have a Glass.

Ben Nunez 47:57  
Yeah, play with those too. There are some apps out there— like Display.land and others— where you can do just photogrammetry and have some fun. They'll just start to familiarize you with how easily that tech can be used and how practical it can be. They're not perfect, but that is photogrammetry in its essence, you know. And so you can have some fun playing around with those, and they have a little bit more of a consumer kind of front to them, so they can be fun to play with.

Ben Nunez  48:31
Yeah, if you're into Fortnite, you know, I'm not super into it at all, but you know, they're turning that thing into a party. (inaudible).

Amy LaMeyer 48:41  
Oh, we haven't mentioned Wave. Wave is a great music experience. It's a lot of fun. So definitely if you're a music lover, don't miss that.

Noelle Tassey  48:52  
Awesome. And in terms of the online collaboration tools we were just talking about— what's, you know, the real benefit? Do you feel a real collaborative benefit to meeting in Spatial, for instance, versus over Zoom? And what is the real difference there— what's driving that?

Rebecca Barkin  49:10
Yeah, I mean, I think real co— like, real collaboration— there's sort of a physical aspect and, again, that kind of energy exchange, right? Like when you all sit around a table, and you have a whiteboard, and you get some beer and pizza and you decide you're gonna work out a problem together— that exchange of energy together is really valuable, and there's something about the tactile experience of writing it out together and erasing it, and it gives you the ability to do that. And, you know, I will say personally, as much as working from home can— you can be more productive sometimes— the thing that I miss most about the office is that there was a real energy exchange. Like when you saw people in the hallway, or when you met with them, that conversation was an exchange of energy. Versus sitting in front of a computer, which sort of just pulls at you all day long— it pulls your energy, right? And so when you're able to move around a space together— which, you know, we stand up when we have those meetings— and you're kind of moving around each other, and you're all looking at a common object together, and then it's interactive— you can draw on it together and see it in your space in 3-D— it just feels like collaboration a bit more. And I personally have never been a huge fan of avatars— like, I can't ever imagine getting a bad diagnosis from an avatar version of a doctor. So I think volumetric is definitely going to be the way that has to go. When people see humans— even if it's not perfect, but it looks like the person you know, and you can see their body gestures and their movement a little bit— the intimacy level goes way, way, way up.

Noelle Tassey  51:07
So that touches on— I think we have time for one last question. But something you all just touched on, actually, which is the uncanny valley. So we've had some people write in the chat about, you know, how that's sort of been wider than expected in the world of VR and what tools you know, we see to kind of bridge that, and I think Ben might have an idea or two about that. But—

Ben Nunez  51:33
Yeah, I mean, I think that picks up right where Rebecca left off. There are certain use cases where that uncanny valley feeling is interesting and funny and cool. There are other cases where it's creepy and really makes you want to get out of the environment fast. And then there's— you know, you can take it to an extreme, which makes it maybe even more fun. But from our perspective— whether that's a doctor giving you some sort of diagnosis or evaluating you, to a teacher teaching you how to do something, to military simulation— how do you do those things in a much more photo-real way, where the goal is to make you feel immersed? If something seems really foreign— you know, you can't mess up a face; we spend so much time not messing up a face. You can't have a nose slightly off, or an eyebrow just two millimeters off. There are things like that that break immersion immediately and make you feel really uncomfortable with what you're looking at, particularly with a face. And I think there's a wide set of tools— from what Spatial and other companies can do with just a 2-D photo, and the things Facebook and Google are doing with 2-D captures, to how you use photogrammetry or, you know, Matterport scans to completely reconstruct environments, to what we do with human performance capture and volumetric capture, being able to reconstruct something even more real. There's a massive set of tools emerging to be able to do that as more people gravitate towards this stuff.

Amy LaMeyer  53:31
The only thing I'd add to that is that if you look at the difference between play and work— with play, that sense of needing to look like a human, or look very realistic, becomes less necessary. And there are a lot of ways you can get a sense of presence without having to look like who you are as a person. So I think there's some more flexibility in entertainment, play, and gaming environments than perhaps in enterprise. Which is fun!

Noelle Tassey  54:08  
Definitely. Unfortunately, time's almost up for us today. I could honestly talk about this for another hour, but I just want to thank all of our panelists and our attendees for the thoughtful questions and the amazing answers. Anybody who wants to review this— it will be available on our website, Alley.com, tomorrow; feel free to distribute it to your community. And if you are free next Wednesday at 2 pm, please join us for AI and Human Connections— we've got a pretty awesome panel lined up. Thank you again to everybody. Hopefully, I'll see you all in virtual reality sometime soon.

Rebecca Barkin  54:45
Thank you.

Amy LaMeyer  54:45  
Good to see you, thanks.

Noelle Tassey 54:47  
Take care.

Amy LaMeyer  54:47
Bye, everybody.

Rebecca Barkin 54:48  
Bye-Bye.
