Global Hackathon for Vision Pro Development, Vision Hack, Kicks Off Next Month

Vision Hack is a forthcoming remote hackathon for developers building apps for visionOS. Open to all skill levels, the event is set to take place September 13–15th.

Guest Article by Cosmo Scharf

Cosmo Scharf is an Emmy-nominated product designer and entrepreneur with a decade of experience in XR. He co-founded Vision Hack and visionOS Dev Partners to support visionOS developers. Previously, Scharf created MovieBot, an AI-powered 3D animation app, and Mindshow, a VR platform for animated content creation. He also started VRLA, one of the world’s largest immersive technology expos.

Imagine building the first iPhone apps in 2008. That’s where we are with visionOS today. Vision Hack is your launchpad into the new frontier of spatial computing.

Over the past decade, I’ve had the privilege of witnessing the virtual reality industry evolve from a niche technology to a transformative medium. From the early days of clunky prototypes to the sleek, powerful devices we have today, the journey has been nothing short of extraordinary. Now, with the introduction of Apple Vision Pro, we’re standing at the threshold of a new era.

As one of the organizers of Vision Hack, I’m thrilled to announce the launch of the first global visionOS hackathon. Scheduled for September 13–15th, this event represents a significant milestone in our industry’s progression. It’s an opportunity to explore and shape the future of spatial computing as Apple Vision Pro continues its global rollout.

Vision Hack is designed to be a truly immersive experience. We’re encouraging teams to communicate in Vision Pro itself using spatial Personas, in addition to Discord. This approach not only showcases the device’s capabilities but also provides participants with authentic, hands-on experience in visionOS development.

Our three-day program caters to both seasoned spatial computing developers and newcomers:

  • Day 1: Workshops and team formation
  • Day 2: Intensive development with mentorship from industry experts
  • Day 3: Development, project presentations, and awards

To foster collaboration while ensuring focused development, we’ve capped teams at five people. Understanding the global nature of our community, we’re organizing local meetups in various cities so developers can connect in person.

While we highly recommend access to a Vision Pro for the full experience, it’s not a strict requirement for participation. However, developers will need a Mac with Apple silicon to run the visionOS simulator, which enables meaningful participation even without the physical device.

The organizing team brings extensive experience from major VR expos (VRLA), Metaverse hackathons, and XR startups.

As spatial computing evolves, we believe early developer engagement is crucial in building a robust ecosystem. Vision Hack aims to play a key role in nurturing the visionOS developer community, potentially influencing the trajectory of spatial computing applications.

For developers keen on exploring visionOS, Vision Hack offers a unique opportunity to dive into this emerging platform. There’s a $25 registration fee, which helps us cover some of the event costs and ensures committed participation.

For companies interested in being at the forefront of spatial computing development, we offer various sponsorship opportunities. These partnerships not only support the event but also provide sponsors with direct access to a pool of talented developers working on cutting-edge spatial computing applications.

More details, registration information, and sponsorship opportunities can be found at visionoshackathon.com. We’re excited to see the innovative projects and ideas that will emerge from this event, and we look forward to welcoming you to the next chapter of spatial computing development.

Quest Sleeper Hit ‘Yeeps’ Reveals Player Population, Talks Past & Future of the Studio

Yeeps: Hide and Seek is an App Lab sleeper hit that the developers say reached 360,000 monthly active users, even before its debut on the main Quest store this week. Speaking to Road to VR, the studio shares the impetus behind the project and where it’s headed next.

Created by Trass Games, Yeeps: Hide and Seek was developed over an eight-month period, culminating in a Closed Alpha launch in February 2024. The game officially launched on App Lab in March, and just this week launched on the main Quest store.

The development team, consisting of co-founders Jack Southard, Nathan Jew, and Steve Shockey, created Yeeps through a collaborative process. As the developers explain, “The three of us, Jack, Nathan, and Steve, pitched three different game ideas. We took elements from each, then consolidated them into Yeeps.”

Yeeps prominently features arm-based locomotion, a system popularized by VR titles like Gorilla Tag. The developers acknowledge the inspiration. “We have a lot of admiration for Gorilla Tag and the arm locomotion system they pioneered. We knew it was the future of VR locomotion, at least in the social playground genre.”

However, Yeeps aims to differentiate itself through additional features. “We added more depth to [the locomotion] with gadgets like bombs and gliders,” the team explains. “We saw how games like Fortnite have evolved the battle royale genre with fun playful items and knew we could do the same with Yeeps in VR.”

A key focus of Yeeps is user-generated content (UGC), driven by the past experience of the team. “As kids, we sunk thousands of hours into social UGC games like Minecraft. We knew the power of these expressive building systems and set out to build a VR-native one with Yeeps.”

Since its App Lab launch, Trass Games reports significant user adoption. “Since we launched on App Lab on March 14th, 2024, Yeeps has grown to around 360,000 MAU,” the team says. “With the [launch on the main Quest store], we are confident the player base will continue to grow rapidly.”

The game has also garnered a significant number of user reviews. According to the developers, it was just three months after its App Lab launch that Yeeps surpassed VR classic SUPERHOT to become the 10th most-reviewed VR game of all time, while managing to exceed SUPERHOT’s average review score. The game currently has more than 31,000 ratings to SUPERHOT’s 18,000. Granted, Yeeps is free-to-play while SUPERHOT is priced at $25.

Trass Games has outlined plans for regular content updates to Yeeps. “We are committed to frequent content updates that add new gadgets, blocks, game mechanics, and even biomes. We are especially excited to further develop the social UGC side of Yeeps and empower creators to build and share entire worlds with the entire community,” the team says.

Jack Southard and Nathan Jew | Image courtesy Trass Games

Trass Games was founded in 2022 by Jack Southard and Nathan Jew, who say they met in their freshman year of high school in 2015 and began building games together. They say they built 30 small game projects together before discovering VR in their senior year of high school.

Their first VR project was Overboard, released on Steam in 2019, a “ridiculous VR game about fighting bloodthirsty and increasingly deadly sharks using makeshift weapons.”

When the pandemic hit in 2020, they felt compelled to build a multiplayer VR game. That led them to build and release Gods of Gravity in 2021, which was fairly well received on Quest.

The studio says that Gods of Gravity gained the attention of the venture capital firm A16Z, which invited Trass Games to its Speedrun accelerator, an early-stage investment program focused on game development. The program invests $750,000 in participating companies.

Following the accelerator, Trass Games doubled its team from three to six, bringing on new artists with the goal to “take everything we had learned from making Gods of Gravity and make a new free social game that was bigger and better in every way.”

Despite getting a head-start with the A16Z Speedrun program, the studio says it doesn’t plan to take additional investment. “While we plan to slowly grow our team, we are confident in our ability to scale Yeeps as a small, efficient team. As such, we do not plan on taking any funding,” the studio tells Road to VR.

Quest ‘Augments’ Feature for Concurrent AR Apps Needs More Time to Cook, Says Meta CTO

Last year Meta announced the so-called Augments feature, planned for Quest 3, which would allow persistent mini AR apps to live in the world around you. Now, eight months after the headset hit store shelves, Meta’s CTO explains why the feature has yet to ship.

Augments was announced as a framework for developers to build mini AR apps that could not just live persistently in the space around you, but also run concurrently alongside each other—similar to how most apps work on Vision Pro today.

Image courtesy Meta

And though Meta had shown a glimpse of Augments in action when it was announced last year, it seems the company’s vision (and desire to market that vision) got ahead of its execution.

This week Meta CTO Andrew “Boz” Bosworth responded to a question during an Instagram Q&A about when the Augments feature would ship. He indicated that the feature as initially shown wasn’t meeting the company’s expectations.

We were playing with [Augments] in January and we decided it wasn’t good enough. It was too held back by some system architecture limitations we had; it ended up feeling more like a toy and it didn’t really have the power that we think it needed to deliver on the promise of what it was.

So we made a tough decision there to go back to the drawing board, and basically [it needed] a completely different technical architecture. Starting from scratch basically. Including actually a much deeper set of changes to the system to enable what we wanted to build there. I think we made the right call—we’re not going to ship something we’re not excited about.

But it did restart the clock, and so [Augments is] going to take longer than we had hoped to deliver. I think it’s worthwhile, I think it’s the right call. But that’s what happened.

We’re only two-and-a-half months out from Meta Connect 2024, which will mark the one-year anniversary of the Augments announcement. That’s where we’re likely to hear more about the feature, but at this point it’s unclear if it could ship by then.

VisionOS 2 Enables WebXR by Default, Unlocking a Cross-platform Path to Vision Pro

We’ve known for quite some time that Apple planned to support WebXR, but with VisionOS 2, the company is enabling the feature for all users. WebXR allows developers to deliver cross-platform XR experiences directly from the web, with no gatekeepers to approve or reject content.

WebXR is the widely supported web standard that allows developers to deliver AR and VR content from the web.

Just like anyone can host a webpage online without any explicit approval from another party, WebXR allows the same for AR and VR content. And because it’s delivered through the browser, accessing and sharing WebXR experiences is as easy as clicking or sending a link—like this one.

Vision Pro has included initial WebXR support since its launch, but users had to manually enable the feature by digging into Safari’s settings.

With VisionOS 2—available today as a developer preview, and coming to all users this fall—WebXR will be enabled by default, making it easy for anyone to access WebXR through the headset. Vision Pro thus joins headsets like Quest, HoloLens 2, and Magic Leap 2 in supporting WebXR content.

Though WebXR is “supported” on VisionOS 2, our understanding is that it only supports VR (or ‘fully immersive’) experiences. WebXR is also capable of delivering AR experiences (where virtual content is merged with a view of the real world), but VisionOS 2 doesn’t yet support that portion of the standard.

There are many reasons why developers might want to use WebXR to build experiences instead of native apps distributed through a headset’s proprietary store.

For one, any headset with WebXR support can run any compatible WebXR experience, meaning developers can build one experience that works across many headsets, rather than needing to make multiple builds for multiple headsets, then upload and manage those builds across multiple platform stores.

Like a webpage, WebXR content can also be updated at any time, allowing developers to tweak and enhance the experience on the fly, without needing to upload new builds to multiple stores, or for users to download a new version.

WebXR also has no gatekeepers. So content that wouldn’t be allowed on, say, Apple or Meta’s app stores—either for technical or content-related reasons—can still reach users on those headsets. That could include adult content that’s explicitly forbidden on most platform app stores.

VisionOS 2 Improvement Targets Key Vision Pro Critique Among Developers

For basic Vision Pro interactions like navigating apps and scrolling web pages, the headset’s look-and-pinch input system works like magic. But if you want to go more ‘hands-on’ with virtual content, the headset’s full hand-tracking leaves much to be desired.

Compared to Quest 3, Vision Pro’s full hand-tracking has notably more latency, meaning it takes longer for the headset to register your hand movements. Especially in interactive content where you directly grab virtual objects, this can make objects feel like they lag behind your hand.

Changes coming in VisionOS 2 stand to improve hand-tracking. Apple detailed the changes in a developer session at WWDC 2024 this week.

For one, the headset will now report estimated hand positions at 90Hz instead of the previous 30Hz. That means the system can reflect changes in hand position in one-third of the time, and the more frequent updates also make hand movement appear smoother. This only applies to a small portion of the overall latency pipeline (which we previously estimated at 127.7ms in total), but it could reduce hand-tracking latency by as much as 22ms in the best case.

Here’s a look at that in action:

It’s an improvement, but you can still easily see the latency of the teapot compared to the hand, even with this slow movement.

For a snappier experience, VisionOS 2 will alternatively allow developers to enable hand-tracking prediction, which provides an estimate of the user’s future hand position. While this doesn’t truly reduce latency, it can reduce perceived latency in many cases. Similar prediction techniques are common across XR tracking systems; it’s quite surprising that Vision Pro wasn’t already employing it—or at least making it available to developers.

Here’s a look at predictions in action:

Now we can see the virtual teapot staying much more aligned to the user’s hand. Granted, this isn’t likely to look quite as good with faster motions.
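
For developers curious what this looks like in practice, here’s a minimal sketch using visionOS’s ARKit hand-tracking API. The anchorUpdates stream ships today; the handAnchors(at:) prediction call and the timestamp math below reflect our reading of Apple’s WWDC session, so treat the exact signatures as assumptions rather than verified API.

```swift
import ARKit        // visionOS ARKit (not iOS ARKit)
import QuartzCore   // CACurrentMediaTime

// Sketch: stream hand anchors as they arrive.
func trackHands() async throws {
    let session = ARKitSession()
    let handTracking = HandTrackingProvider()
    try await session.run([handTracking])

    for await update in handTracking.anchorUpdates {
        // On VisionOS 2 these updates arrive at 90Hz instead of 30Hz.
        print(update.anchor.chirality, update.anchor.originFromAnchorTransform)
    }
}

// Inside a per-frame render callback, prediction looks roughly like this.
// Assumption: handAnchors(at:) is the VisionOS 2 prediction call as
// described at WWDC24; the exact signature may differ.
func predictedLeftHand(_ handTracking: HandTrackingProvider) {
    // Assumed: ask for hand poses ~2 frames (at 90Hz) in the future,
    // near the time the frame will actually reach the displays.
    let presentationTime = CACurrentMediaTime() + (2.0 / 90.0)
    let anchors = handTracking.handAnchors(at: presentationTime)
    if let left = anchors.leftHand {
        print("Predicted left palm:", left.originFromAnchorTransform)
    }
}
```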

We’ll be looking forward to putting Vision Pro’s hand-tracking latency to the test with VisionOS 2 soon!

Some of Vision Pro’s Biggest New Development Features Are Restricted to Enterprise

VisionOS 2 is bringing a range of new development features, but some of the most significant are restricted to enterprise applications.

VisionOS 2 will bring some of the top-requested development features to the headset, but Apple says it’s reserving some of them for enterprise applications only.

Developers that want to use the features will need ‘Enterprise’ status, which means having at least 100 employees and being accepted into the Apple Developer Enterprise Program ($300 per year).

Apple says the restriction on the new dev capabilities is to protect privacy and ensure a predictable experience for everyday users.

Here’s a breakdown of the enterprise-only development features coming to VisionOS 2, which Apple detailed in a WWDC session.

Vision Pro Camera Access

Up to this point, developers building visionOS apps couldn’t actually ‘see’ the user’s environment through the headset’s cameras. That limits developers’ ability to create Vision Pro apps that directly detect and interact with the world around the user.

With approval from Apple, developers building Vision Pro enterprise apps can now access the headset’s camera feed. This can be used to detect things in the scene, or to stream the view for use elsewhere. This is popular for ‘see what I see’ use-cases, where a remote person can see the video feed of someone at a work site in order to give them help or instruction.

Developers could also use the headset’s camera feed with a computer vision algorithm to detect things in view. This might be used to automatically identify a part, or verify that something was repaired correctly.

Even with Apple’s blessing to use the feature, enterprise apps will need to explicitly ask the user for camera access each time the feature is used.
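
For a sense of what this enables, here’s a hedged sketch of reading the camera feed via the enterprise ARKit API. The names follow Apple’s WWDC session on the enterprise APIs, but we haven’t verified the final signatures, so consider this a sketch rather than a definitive implementation.

```swift
import ARKit

// Hedged sketch of enterprise camera access on VisionOS 2. Names follow
// Apple's WWDC24 enterprise-API session, but the exact signatures are
// assumptions; the feature also requires the enterprise license file
// and main-camera entitlement.
func streamCameraFrames() async throws {
    let session = ARKitSession()
    let cameraProvider = CameraFrameProvider()

    // Pick a supported format for the main (forward-facing) cameras.
    let formats = CameraVideoFormat.supportedVideoFormats(
        for: .main, cameraPositions: [.left])
    guard let format = formats.first else { return }

    try await session.run([cameraProvider])

    guard let frames = cameraProvider.cameraFrameUpdates(for: format) else { return }
    for await frame in frames {
        // Each frame exposes a pixel buffer that can feed a computer
        // vision model or be streamed for 'see what I see' support calls.
        if let sample = frame.sample(for: .left) {
            _ = sample.pixelBuffer
        }
    }
}
```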

Barcode and QR Code Detection

Image courtesy Apple

Being able to use the headset’s camera feed naturally opens the door for reading QR codes and barcodes, which allow structured data to be transmitted to the headset visually.

Apple is providing a ready-made system for developers to detect, track, and read barcodes using Vision Pro.

The company says this could be useful for workers to retrieve an item in a warehouse and immediately know they’ve found the right thing by looking at a barcode on the box. Or to scan a barcode to easily pull up instructions for assembling something.
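
Based on how ARKit’s other data providers work, the detection loop might look something like the sketch below. BarcodeDetectionProvider is the name Apple used, but the symbology options and payload accessor shown here are our assumptions.

```swift
import ARKit

// Hedged sketch: barcode/QR detection via the enterprise API. The
// provider name comes from Apple's WWDC session; the symbology list and
// the payload accessor are assumptions.
func scanBarcodes() async throws {
    let session = ARKitSession()
    let barcodes = BarcodeDetectionProvider(symbologies: [.qr, .ean13])
    try await session.run([barcodes])

    for await update in barcodes.anchorUpdates where update.event == .added {
        // Each anchor locates the code in space and carries its decoded
        // payload, e.g. a part number used to pull up instructions.
        print("Scanned:", update.anchor.payloadString ?? "<binary payload>")
    }
}
```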

Neural Engine Access

Enterprise developers will have the option to tap into Vision Pro’s neural processor to accelerate machine learning tasks. Previously developers could only access the compute resources of the headset’s CPU and GPU.
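
In practice, this runs through Core ML’s standard compute-unit selection, as in the brief sketch below. The configuration API is standard Core ML; the model class InspectionModel is a hypothetical stand-in for whatever model an app actually ships.

```swift
import CoreML

// Standard Core ML compute-unit selection. With the enterprise
// entitlement on Vision Pro, .all allows eligible layers to run on the
// Neural Engine in addition to the CPU and GPU.
// "InspectionModel" is a hypothetical Xcode-generated model class.
let config = MLModelConfiguration()
config.computeUnits = .all   // or .cpuOnly / .cpuAndGPU to exclude the ANE
let model = try InspectionModel(configuration: config)
```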

Object Tracking

Although the new Object Tracking feature is coming to VisionOS 2 more broadly, there are additional enhancements to this feature that will only be available to enterprise developers.

Object Tracking allows apps to include reference models of real-world objects (for instance, a model of a can of soda), which can be detected and tracked once they’re in view of the headset.

Enterprise developers will have greater control over this feature, including the ability to tweak the maximum number of tracked objects, track only static or only dynamic objects, and change the object detection rate.
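
For context, base Object Tracking looks roughly like the sketch below; the enterprise tier layers its extra configuration (object caps, detection rate, static vs. dynamic tracking) on top of this. Names follow Apple’s WWDC material and should be treated as assumptions.

```swift
import ARKit
import Foundation

// Sketch of base Object Tracking on VisionOS 2. A .referenceobject file
// is authored ahead of time (e.g. from a scan of the soda can); names
// follow Apple's WWDC material and may not match the final API exactly.
func trackObject(referenceURL: URL) async throws {
    let soda = try await ReferenceObject(from: referenceURL)
    let objectTracking = ObjectTrackingProvider(referenceObjects: [soda])
    let session = ARKitSession()
    try await session.run([objectTracking])

    for await update in objectTracking.anchorUpdates {
        // Each ObjectAnchor reports the tracked object's pose in space.
        print("Detected object at:", update.anchor.originFromAnchorTransform)
    }
}
```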

Greater Control Over Vision Pro Performance

Enterprise developers working with VisionOS 2 will have more control over the headset’s performance.

Apple explains that, out of the box, Vision Pro is designed to strike a balance between battery life, performance, and fan noise.

But some specific use-cases might need a different balance of those factors.

Enterprise developers will have the option to increase performance by sacrificing battery life and fan noise. Or perhaps stretch battery life by reducing performance, if that’s best for the given use-case.


There are more new developer features coming to Vision Pro in VisionOS 2, but the features above will be restricted to enterprise developers only.

VisionOS 2 is Available in Developer Preview Starting Today

Today at WWDC 2024, Apple revealed VisionOS 2, the first major update for its Vision Pro headset. The new version of the software will be available for developers to experiment with starting today.

VisionOS 2 is primarily designed to smooth out some rough edges from the headset’s release earlier this year, while adding new features and expanding development capabilities so developers can take greater advantage of the headset.

While it won’t release publicly until the fall, Apple says developers can get their hands on VisionOS 2 starting today. We haven’t spotted the direct page to the developer preview update yet, but the company says it will be available through its official developer website.

VisionOS 2 is bringing a range of new features and improvements like 2D-to-3D photo conversion, an ultrawide Mac Virtual Display, new ways to navigate the headset’s core interface, and much more. We’ll be breaking down the full range of features soon, so stay tuned to the front page!
