Meta Orion AR Glasses Developer Hands-On: Viewing The Future


On Wednesday, after Meta’s keynote address, UploadVR’s Ian Hamilton was invited by Coulombe to observe his scheduled Orion demo.

While we would have preferred to bring you first-hand impressions directly from UploadVR, and we’ve been told by Meta they may do future demos in New York, until that happens we’ve asked Coulombe to write for us from his expert perspective about his time in the glasses.

September 25 was a good day at Meta Connect.


Harkening back to the idyllic days of Oculus Connect, I had hallway run-ins with countless brilliant XR people, some of whom I’ve known for many years and others I’d only met in the metaverse before today.

Mark Rabkin displayed a slide with Unreal Engine and camera access information and said these would make HorizonOS development easier in the future. Andrew Bosworth announced a new 10-year AR glasses project called Orion.

How did I end up here? A couple of weeks ago I received a last-minute invite to Meta Connect, but I wasn’t sure if I should go. My travel plans were already set for Unreal Fest Seattle, where my company would be giving several talks, and then a Star Wars convention in Orlando where our open source recreation of the Galactic Starcruiser was going to be displayed. Was it really worth the extra travel to attend Meta Connect? Andrew Bosworth, Meta’s Chief Technology Officer, said: “We’ll have cake, I believe.”

I love cake. Cake, as it turns out, was an Orion glasses demo. The demo I experienced was well worth the trip.

I left the Orion demo with the same excitement I felt when trying the Tuscany demo on my Oculus DK1 in 2013. As with DK1, beyond the thrill of the direct experience, I also felt the promise of a new product category that is still in its early days. As a developer, I am eager to build for this platform; as a user, I’m excited to see the products that others create. This felt like a combination of all the XR devices I had tried and also something completely new. The whole is greater than its parts.

When many saw Quest’s current roadmap as a dead end, I imagined a world where a device like this handled most of your computing, similar to Air Link: a headless computer with wireless connectivity that would ‘just work’ on a level even my grandmother could understand. No SteamVR, no Oculus Runtime, no complicated driver updates. Many people said what I imagined was a pipedream. We’re less than a month on, and I have now tested the first prototype of what I imagined. It may even be able to convince those who have never worn a smartwatch.

Over the past decade I have tried AR headsets with hand tracking, eye tracking, and gesture detection, as well as wireless XR streaming, spatially persistent AI, video calling, avatars, and multiplayer games. It was a new experience to see all of that in a form factor that is comfortable, lightweight, and cool. Laser Dance developer Thomas Van Bouwel took a video of my entire experience on my iPhone, but you can’t see it. Meta’s Joshua To guided me through a future “day in the life” of wearing these glasses for both casual and more serious use.

https://www.youtube.com/watch?v=reeBoSK_YLQ

I tried on several pairs of Orion glasses. It’s unclear exactly how it was decided which ones I would wear, but it wasn’t just about how they fit. The eye tracking calibration was easy to recognize: I looked at dots. The wristband I wore was a little tighter than I would wear my watch, but not by much. I was pleasantly surprised to discover that it vibrated whenever a hand gesture was detected. It left an impression.

Ian Hamilton took this photo of Alex’s hand a few moments after the demonstration.

I didn’t touch the wireless puck, which does the majority of the computation and wirelessly streams the results to Orion. It wasn’t in my pocket as I had expected; instead it sat out on the table for the whole demo while maintaining a strong, Air Link-like connection. It was a little hot. Would it have been uncomfortable in my pocket if I had kept it there? The puck’s battery and the wristband could last all day, while the glasses last about two hours.

We’re not looking through cameras here: the real world is seen with no lag, distortion, or stutter. The digital content was bright, with a field of view so wide I only noticed its limit as content approached the edge of the display. And because of the wristband, I was able to interact more discreetly than I can on Apple Vision Pro. That felt like a superpower.

The starting menu reminded me of the Apple Vision Pro home menu, a collection of simply arranged icons, but like a 1980s remaster of it with fewer colors and pixels. It was charming, like most things that remind me of the 1980s. One glitch: a window sometimes flipped to face away from me, and I wondered why there wasn’t a way to force it toward me. A new window slotted seamlessly in between my other windows and moved them off to one side. I dictated a Messenger chat to Chris Bacon, and then he called. I watched the Matrix demo, which looked good, but I realized that you cannot render anything black. For non-developers, this means anything black is effectively alpha = 0, i.e. invisible. It’s not a problem I have seen solved other than by using passthrough cameras, which allow more control over how the environment is shown and overlaid.
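To illustrate why black content disappears on an additive, optical see-through display, here is a minimal sketch in Python. This is my own illustrative example with made-up values, not Meta’s rendering pipeline: the light you perceive is roughly the real world plus whatever the display emits, so a rendered value of zero adds nothing and reads as fully transparent.

```python
# Illustrative sketch of additive (optical see-through) compositing.
# Hypothetical values only; not Meta's actual rendering code.

def perceived(real_world_rgb, display_rgb):
    """On an additive display, emitted light adds to the light from the real scene."""
    return tuple(min(1.0, r + d) for r, d in zip(real_world_rgb, display_rgb))

real_world = (0.6, 0.5, 0.4)   # light passing through the lenses
black_pixel = (0.0, 0.0, 0.0)  # "black" content emits no light
white_pixel = (1.0, 1.0, 1.0)  # bright content emits a lot of light

print(perceived(real_world, black_pixel))  # (0.6, 0.5, 0.4) -> unchanged, i.e. invisible
print(perceived(real_world, white_pixel))  # (1.0, 1.0, 1.0) -> clearly visible

# A video passthrough headset composites the other way around: the camera feed
# can be replaced or dimmed per pixel, so true black is achievable there.
```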

Meta’s Prototype AR Glasses Have Remarkable Field Of View

Meta showed off prototype true AR glasses at Connect, codenamed Orion, with a remarkable 70-degree field of view.

I double-tapped my thumb to have Meta AI create an image of a crowd of people wearing AR glasses in a movie theater, and I was actually able to conjure this image in various formats. As someone a little fatigued by Vision Pro’s reliance on the pointer-finger-to-thumb pinch, I appreciated the greater variety of gestures in use here: a double thumb tap for Meta AI, a middle finger to the palm for bringing up the home menu, and the coin-flip gesture for scrolling.

Meta’s representatives then directed me to use Meta AI by standing up, walking over to an example kitchen scenario, and commanding Orion to “Give me a smoothie recipe.” At first I made the error of asking “Can you tell me a recipe for a smoothie?”, which resulted in a more traditional search. The assistant then reviewed my ingredients, labeled each one with text that was firmly anchored, and produced a page-by-page recipe I could easily follow. The first time, it didn’t recognize a pineapple that was on the table, but the recipe sounded tasty and included all the ingredients it had correctly identified. The second time, we added new ingredients and they were all correctly identified, labeled, and spatially located. The demo didn’t allow me to move the ingredients around; that would have required object tracking, which wasn’t included. I’ve cooked while wearing Vision Pro three times, and Orion was the first of these devices I could imagine wearing comfortably during a cooking session.

I also scrolled Instagram, reading comments and liking photos. The coin-flip gesture could flip through photos, or I could pinch and swipe to scroll with hand tracking.

I then went through another Messenger demonstration with a flat Codec Avatar, this one controlled by someone who identified himself as Josh; Meta clarified that Josh was controlling Jason’s avatar, the kind of switch that is common in Meta’s demos. In 2022 I had the opportunity to talk to a Codec Avatar in VR, and it was a great experience. Vision Pro and Personas are great for “spacetime” calling, and I use them more often. The Orion Codec Avatar demo isn’t spatial, because the rendering would be too demanding, but it looked great in flat format and passed the Uncanny Valley test; Personas still have the edge on this front, for what it’s worth. I asked him to puff out his cheeks and stick out his tongue, which didn’t work. That’s a big deal to me as someone trying to push the limits of human performance fidelity for virtual live theater. I hope to see this space race (arms race? face race?) continue at full speed.

The feeling of familiarity peaked when I played a game called Stargazer, which felt almost exactly like one I made using the Myo armband: essentially the same gestures, except for the absence of the hammer.

Many Apple Vision Pro apps have depth but sit in a frame, and on Orion this game was ‘windowed’ in a similar way, yet I was able to step into the game and felt immersed. The experience froze after this game, when I tried to launch the next app from the home menu, so I received a new pair of glasses. That was fine, and I appreciated the opportunity to explore this other input contingency.

The next game was a multiplayer version of Pong. It was fun to play and I loved its old-school feel. Since it took place in a large cube, I liked that I could see the ball from the paddle’s angle rather than from above, as in the original Pong; this was one of those games that only works in a spatialized environment. I did want more topspin, though, like in ping-pong or tennis. Did I have anything else to complain about?

The Future Of AR Glasses

Sure, the resolution can be improved. It would be great to eliminate all traces of chromatic distortion, even though it was only noticeable at the edges. There could also be more features, like active object tracking and a host of others available on other XR products; for example, I didn’t type or touch virtual objects. For a development kit, I would prefer a lower resolution with a high frame rate and low latency over a higher-resolution version that is less stable. This 30-minute demo was a great way to show what Orion has become and how it will evolve. As a theatremaker who’s worked on many XR demonstrations before, I really appreciated the clarity of this vision. I know all of this will improve in time, and I’ll be patient.

Well, let me be more specific: I will be patient until the consumer release. It would be great to get one as a development kit, or a device we could use with […]. It could even be a simulator; that’s how I got so excited about, and started with, both […] and […]. […] wouldn’t hurt either!

Please Meta, if you truly want to make the lives of developers easier, give us developer kits and simulators early and often. In return, we offer to produce stellar content for your ecosystem, much more of which will be ready for your consumer launch dates so no one feels like an hour after unboxing they’ve seen everything.

Oh, and speaking of Alien vs. Predator, you might have noticed I’m making a lot of comparisons to the Apple Vision Pro, a device that will almost certainly always cost more, across all its generations, than whatever Meta ends up charging for Orion. These two companies have clearly engaged in an epic battle, copying each other’s features and then launching brand new ones that challenge the other to surpass or ape them. It’s good. This level of competition is something Meta has sorely needed for quite some time.

In 2018, […] was a game changer that essentially turned the Lenovo Mirage Solo into the Oculus Quest 1 six months before Quest 1’s release. Not only was it fun to use, but I also enjoyed the anticipation of a battle between Google and Facebook to see who could be first. Then Google cancelled Daydream, and the consumer kit was never made available, just as Facebook, the company that would become Meta, was doubling down on standalone headsets. Meta entered the market for standalone consumer 6DoF headsets largely without competition, grew complacent, and innovated more slowly than those of us trying to make it through the VR Winter would have liked. Can another company take the crown as the go-to consumer XR/spatial computing company? I can’t get enough of that race.

This cake was not a lie, and we will all get to enjoy it in the future. Coulombe would like you to see the movie […], coming this December to Meta Quest and possibly Apple Vision Pro.


