Press This: XR and WP – WordPress-Powered Spatial Computing

Welcome to Press This, the WordPress community podcast from WMR. Each episode features guests from around the community and discussions of the largest issues facing WordPress developers. The following is a transcription of the original recording.


Doc Pop: You’re listening to Press This, a WordPress Community Podcast on WMR. Each week we spotlight members of the WordPress community. I’m your host, Doc Pop. I support the WordPress community through my role at WP Engine, and my contributions over on TorqueMag.io, where I get to do podcasts and draw cartoons and tutorial videos. Check that out.

You can subscribe to Press This on Red Circle, iTunes, Spotify, your favorite podcasting app or you can download episodes directly at wmr.fm

Today we have a very special guest who is an expert in virtual reality, augmented reality, XR, metaverse, whatever you wanna call it, and its intersection with WordPress. But before we dive into the fascinating world of VR, let’s start off with the big news. You probably heard Apple made a significant announcement that is about to shake up the virtual reality space.

They unveiled the much anticipated Vision Pro headset, which promises to deliver an immersive and groundbreaking XR experience. This development has sparked a renewed interest and enthusiasm for the Metaverse, VR, whatever you wanna call it. And it holds great potential for WordPress developers and agencies.

So let’s jump into the heart of today’s episode. I’m delighted to introduce our guest, Anthony Burchell. Anthony is a software engineer on ACF and he’s the brilliant mind behind the innovative Three Object Viewer plugin for WordPress. Now, this plugin empowers WordPress website owners to run virtual reality experiences, spatial experiences, directly through their WordPress website, all while managing the content within a space very familiar to WordPress users: the WordPress dashboard. So without further ado, let’s welcome Anthony Burchell to the show. Anthony, thank you so much for joining us. Let’s get started by just telling us how you got into WordPress.

Anthony Burchell: Yeah. Well actually it relates to how I got into the 3D web. I started out when I was very young. I used to make flash games and that naturally progressed to making flash websites, and then I started making 3D flash websites and ironically enough, Apple kind of killed Flash, and I kind of stopped making 3D websites.

So, I started looking into the next thing and at that exact same time WordPress was taking off with Custom Post Types and it was starting to get seen as a formidable way to make websites, not just a blogging platform. Imagine that. Kinda worked out.

Doc Pop: During the intro you probably heard me stumble over exactly what we’re gonna talk about, and I just wanna address that because I didn’t practice this. We’ve got: VR, we’ve got AR, we’ve got the Metaverse. I believe now there’s XR. What is the preferred term that you’d like us to use for this conversation and can you describe what that term kind of means and embodies?

Anthony Burchell: Yeah, so I think XR is probably the one that over the years has stood the test of time. And by over the years, I mean since, like, 2018; I think that’s when people were starting to say that. I think spatial computing is a really good way to say it. And yeah, I think that there’s just so many terms.

I think XR is the nice one because when I think of virtual reality, I think of not only being fully immersed, but having like blends of immersion where you can kind of get some of your world inside of a virtual space, but still feel like you’re somewhere.

Doc Pop: And what does XR stand for?

Anthony Burchell: I think the agreed definition is extended reality or something like that.

Doc Pop: Extended reality. Well, it’s definitely the key thing, I think, to all of these technologies. Whether we’re talking about VR or AR, or the metaverse or XR, I think spatial is a big component to what we’re talking about.

Anthony Burchell: Yeah, actually I wrote an article back in, I think, 2019 on WP Tavern, and the title of it was something like WordPress in the Spatial Computing Future of the Internet, a very buzzwordy title, but that is essentially it. Spatial computing, 3D internet.

Doc Pop: Being able to move around spatially to navigate something is what we’re talking about when we’re talking about spatial computing. And with Apple’s entry into XR, how has the space changed since Apple made their announcement?

Anthony Burchell: I think one of the things is there was a big sigh of relief, at least in the circle that I’m in. I build on the WebXR standard, so that means everything that is happening spatially is happening from a browser and there’s a standard that many browsers, I think all browsers, most browsers have accepted and accommodate.

And this allows you to, from the browser, click a single button and enter. I think there was a big sigh of relief in this community just because Apple waited a day; the second day after the announcement, they announced that WebXR will be supported, and fully immersive WebXR. And I found it really interesting because Apple showed no virtual reality, nothing fully immersed where the entire environment you’re in goes away and you’re somewhere else.

What WebXR allows you to do is exactly that. So I found it really interesting that the best canvas we have is going to allow fully immersive experiences. So yeah, I think the general feeling is not that everyone’s gonna go out and buy a Vision Pro and drop $3,500 on it, but it does show there’s confidence from Apple in web-based standards and in spatial computing in general.

I think that’s the biggest takeaway. And I personally was really excited because my plugin, I’ve already tested it in Vision Pro, using their SDK and everything seems to be working and matching up with the WebXR standards that work with like the Quest Pro, the Quest and the Vive headsets.

Doc Pop: And there’s always been this space, this idea in the XR community that you don’t have to have a fancy headset to be able to experience some of this stuff. There was this idea that some users would maybe just hold up their phone and kind of use it to spatially navigate something by looking at their phone, but pointing it and moving it in different directions.

And so the hope may be that the Apple Vision Pro might be good for the Apple Vision Pro community, but it might also bring in some interest from people who might wanna navigate with their phone, because it’s gonna support the same standards.

Anthony Burchell: Well, that’s one of the interesting things that’s still yet to be announced by Apple: whether they’re going to allow the WebXR standard to work with Safari on the mobile handheld browser. Not to be negative, but they’ve kind of held back a lot of innovation in the WebXR community just because it hasn’t been accepted on the phone, which would allow you to do that sort of navigating by picking up your phone and walking in a space and that sort of thing.

Currently you can only do that with an Android device. In the plugin that I build, I’ve got a way for people to do AR as a target to display a 3D object. The problem is it only works on Android. So I think that’s one of the things that’s yet to be determined.

But the clear good signal is that the Vision Pro will accept this WebXR standard behind a flag that you can turn on in the Safari browser. So it’s a good signal in the right direction, and I don’t find it hard to believe they would allow this on cell phones in the near future.
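As a rough illustration of the feature detection being described here, the WebXR Device API exposes `navigator.xr.isSessionSupported()`, which reports whether a given session mode (handheld AR, fully immersive VR, or inline) is available. The helper below is a hypothetical sketch, not part of the 3OV plugin:

```javascript
// Hypothetical helper: pick the richest WebXR session mode the browser supports.
// isSessionSupported() is part of the WebXR Device API and returns a Promise<boolean>.
async function pickSessionMode(xr) {
  if (!xr) return null; // no WebXR at all (e.g. handheld Safari today)
  // Prefer handheld/headset AR, then fully immersive VR, then an inline canvas.
  for (const mode of ['immersive-ar', 'immersive-vr', 'inline']) {
    if (await xr.isSessionSupported(mode)) return mode;
  }
  return null;
}

// In a browser you would call it roughly like this:
// pickSessionMode(navigator.xr).then((mode) => {
//   if (mode) enterButton.textContent = `Enter ${mode}`;
// });
```

On a WebXR-capable Android phone this would typically resolve to `immersive-ar`; in a browser with no WebXR implementation, `navigator.xr` is undefined and the helper returns `null`.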

Doc Pop: When we’re talking about the XR standards, the WebXR standards, I’m just kinda curious, is Apple kind of signaling they’re gonna be creating their own standards for some things? And are there bridges between these standards? 

Anthony Burchell: Yeah. Well, WebXR allows you to sort of have a unified way of handling controller input, and a way to enter an experience. So like that Enter VR button that you click: just having a standard way to know when you’re entering an experience and when it should take over the page.

That’s what WebXR is specifying as a standard, and Apple hasn’t really put out anything as far as like they wanna be a standard. What they’ve instead signaled is that they want to adopt what the current 3D apps are doing, which a majority of them are being built in Unity.

And I found it really interesting when they announced the Vision Pro that they announced it with a Unity logo and said, we are going to support all of the developers that want to build in Unity to bring their creations into the Vision Pro. So they’re kind of just going with what’s working for the industry right now. So I found that really awesome. 

But they didn’t sort of heavy handedly say, this is your standards going forward.

One thing that they have been putting a lot of effort into is the 3D file type that they natively support, which is the USDZ standard. And this is the Pixar file format standard. Think of it as like the Photoshop file, but for 3D assets, and they have different versions of these files that can be like a compressed version, or a zipped version that has all of the assets bundled inside of it. So they’ve been really, really focused on that file standard.

The file standard that I personally have put all of my effort into is the GLTF standard, cuz it’s more of an open standard. It’s easier to participate in the working groups that define these standards. So, yeah, there’s all kinds of different directions and people are working in these different focuses, but they’re all coming together with this idea of we need to figure out these interfaces.

Apple kind of was opinionated in saying this is kind of a stationary device. It seemed like they positioned it as a stationary device, because the rooms that they were showing were mostly just a person sitting down. So I think what they’re trying to focus on is the 2D interface interactions and just having a stationary person, while companies like Meta are trying to do full body tracking and deeper ways to express yourself, with more points of tracking so that you can do that.

Doc Pop: This is a good spot for us to take a quick break, and when we come back, we’re gonna keep talking to Anthony Burchell about what Apple’s announcement could mean for the XR community and what WordPress developers need to know about and tools they can use to get XR running on a WordPress environment. So stay tuned for more after the break.

Doc Pop: Welcome back to Press This, a WordPress Community podcast. Today we’re talking to Anthony Burchell, a software engineer at ACF and the creator of 3OV, the Three Object Viewer for WordPress. Anthony, we were talking before about what XR is, and we basically said XR is the standard term; VR and AR and some of the other terms, like metaverse, have kind of come and gone.

XR seems to be the long-lasting one. So let’s talk about what WordPressers need to know about XR. Is there existing XR support in WordPress?

Anthony Burchell: Not natively. You’ll have to go with a plugin. I actually set out to explore this a year ago; I think March of 2022 is when I released the Three Object Viewer plugin. And the plugin was sort of the answer to that question. It was like, what is missing in WordPress right now to get a bare minimum, just a 3D object displaying on my site?

And maybe you can click the Enter VR button and have it wrap around you. And that was in March of last year. It’s since progressed into an actual world builder with 3D blocks that you can compose a scene with. To get started now, you could just install the Three Object Viewer plugin. Another alternative is to build with the rendering engine that I use in the plugin, which is called Three.js.

You can very easily embed Three.js inside of a front-end web page, just right in the header of the file, and quickly compose some scenes with JavaScript to allow people to enter in VR and have a fully XR-enabled website.
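As a minimal sketch of what this looks like, the snippet below sets up a Three.js scene with WebXR enabled and an Enter VR button. It assumes Three.js and its `VRButton` addon are already loaded on the page (for example from a CDN); the function name and structure are illustrative, not the plugin’s actual code.

```javascript
// Illustrative sketch: a tiny XR-enabled Three.js scene.
// THREE, VRButton, and container are assumed to be provided by the page.
function createXRScene(THREE, VRButton, container) {
  const scene = new THREE.Scene();
  const camera = new THREE.PerspectiveCamera(70, 2, 0.1, 100);
  camera.position.set(0, 1.6, 3); // roughly standing eye height, in meters

  // A single spinning cube stands in for real content.
  const cube = new THREE.Mesh(
    new THREE.BoxGeometry(1, 1, 1),
    new THREE.MeshNormalMaterial()
  );
  scene.add(cube);

  const renderer = new THREE.WebGLRenderer({ antialias: true });
  renderer.xr.enabled = true; // required before WebXR sessions can start
  container.appendChild(renderer.domElement);
  container.appendChild(VRButton.createButton(renderer)); // the "Enter VR" button

  // setAnimationLoop (not requestAnimationFrame) keeps rendering inside an XR session.
  renderer.setAnimationLoop(() => {
    cube.rotation.y += 0.01;
    renderer.render(scene, camera);
  });
  return { scene, camera, renderer };
}
```

The two XR-specific lines are `renderer.xr.enabled = true` and `VRButton.createButton(renderer)`, which produces the standard enter-VR button described above.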

So there’s nothing really stopping WordPress today, aside from file support: the file format for the web has kind of been agreed to be glTF files. So you’ll need to add support for GLB files, which are the binary files of a 3D object. And that’s something the Three Object Viewer plugin does: in the media library, you’re allowed to upload GLB files, and then also an avatar format called VRM, which is very similar to GLB files, but it has more avatar-focused metadata and extensions inside of these characters.

So yeah, that’s sort of how you could compose a scene. In my plugin, you can just open up the 3D Environment Block, which has a bunch of inner blocks where you can select things like models, NPCs, videos, images, and then position them in a 3D scene, all within your editor. And it’s all done very natively, in exactly the same way that you would build block-based templates.

So that’s one way. The other way is you could just straight up build it inside of a program called Blender. And there are many artists out there looking for work building Blender scenes. And you could very easily create a simple one-object scene and put that on a page using something like Three.js or the Three Object Viewer and allow people to go into it.
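For the Blender route, a scene is typically exported as a GLB and pulled in with Three.js’s `GLTFLoader`. The wrapper below is an illustrative sketch (the function name and URL are made up), assuming a loader and a scene already exist:

```javascript
// Sketch: load a GLB (binary glTF) exported from Blender into an existing scene.
// loader is assumed to be a three.js GLTFLoader instance; the URL is illustrative.
function loadGlbIntoScene(loader, scene, url, onDone) {
  loader.load(
    url,
    (gltf) => {
      // gltf.scene is the root Object3D of the loaded model
      scene.add(gltf.scene);
      if (onDone) onDone(gltf.scene);
    },
    undefined, // progress callback unused here
    (err) => console.error('Failed to load model:', err)
  );
}

// Hypothetical usage:
// loadGlbIntoScene(new GLTFLoader(), scene, 'https://example.com/model.glb');
```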

Now, that’s the part where it gets a little more difficult, cuz you need player controllers and all kinds of logic; that’s what I’ve been focusing on for the last year. So it kind of comes down to what you wanna build, like, what problem are you trying to solve?

And right now the spatial web doesn’t solve a ton of problems cuz we’re still trying to figure out what it is. But I see a future where we’ll have things like shops and permanent AI agents hanging out inside of our shop selling for us, which I’ve already got. If you go to 3OV.XYZ, at the bottom of the page, you can load in my AI that knows everything about the Three Object Viewer Plugin, and you can chat with her and ask her questions about it.

So that’s kind of how you can think about building is I like starting with a problem and going from there. 

Doc Pop: I feel like we could talk about front end and back end, and maybe I’m misusing those terms, but on the front end we’re talking about the website that visitors come to, and it’s got Three.js, so that’s what powers the 3D rendering. And the user might have the Apple Vision Pro, or they might have a phone or some sort of VR headset.

What I wanna talk about is the backend, cuz this is one of the things that always blew me away about Three Object Viewer. When you have a spatial scene, you have content: you have a wall, you have another wall, you have ground, you have a character, you have audio, you have objects on the wall or objects floating. And the best way to manage content would be in WordPress, it seems, right? A very easy way to just create a bunch of content and manage it, a content management system. I really like that that’s basically what you’re leaning on WordPress for: managing all the content that makes up a scene. And then you’ve got this thing that does some magic and enables that to be shown on the front end.

Is that kind of a good description?

Anthony Burchell: Yeah. You’re using the Block Editor to compose 3D scenes. It’s exactly that. And the catchphrase of the plugin that I’ve been using and something I’ve been saying for a very long time is a post is a place, and up until now, a post has only been an asynchronous place where you can go and leave comments.

But now it’s becoming a thing where it is a place where you go visit a website. And somebody else might be there with you. There might be an NPC, and it might feel like that NPC is real and in the room with you, interacting with you and making personalized recommendations based on you being logged in and what it knows about you.

So we’re entering this new blank slate really, of the web. It’s an evolution of the web where we can finally break out of these screens and kind of go to the outside and kind of put something in. One of the things I’m gonna be working on soon is AI agents where you can put ’em in mixed reality.

So if you are wearing the Apple Vision Pro (they’re claiming it’s something you can wear all day, and they seem to be aiming for a monitor replacement), the way that I’m envisioning this is that you could be working on your Gutenberg template, your block template, and then right next to you on the side of your desk, there’s your AI agent making recommendations, or maybe holding up signs with a bunch of content recommendations, or ways that you can reword things, or meta descriptions, or just any information. Maybe you got a meeting coming up.

So that’s kind of the way I see the future of WordPress and XR and then with collaborative editing coming soon to WordPress, that’s something that I’m gonna fully utilize on the front end so that I could allow people to collaboratively 3D edit.

I’ve got a working prototype of that today, but I wanna see what WordPress does first so that I can maybe just use that.

Doc Pop: If a client heard the news about Apple’s Vision Pro, got really excited, and wants to talk to their agency or a web developer about how to integrate XR into their website, what are some tips you have for that seamless experience, and for people building an XR website for the first time?

Anthony Burchell: Yeah, well, one thing I do is don’t put it at the very top of your page. It’s one of those things that can become a trap. So I like to put it at the very bottom, because cursors can click into it and get locked inside of the space, and it’s just not a great experience right now. People are still figuring out the UX of how to properly get visitors into experiences.

And the way that I compare it is the mobile web transition, when desktop websites became mobile responsive. We’re now going to have this new future web where sites are going to be 3D responsive. And for a long time, mobile responsive websites were not very mobile responsive.

People were making separate websites to even handle the mobile traffic and things like that. So we’re gonna do it wrong for a while. But right now, the way that you would do it is like a frame. The way the Three Object Viewer does it is you set a preview image, and that’s the image that shows people what room they’re about to enter. And then it has a button in the middle that says Load World. You click Load World, and it’ll render the world inside of that container. And then at the very top of the screen, it’ll say something like Enter in VR or Enter in AR.

And then you could click that. If you were in a device, like a headset, in the Apple Vision Pro you would just use your hands or your eyes: look at the button, then do the click motion with your hands, and it would click the button and enter you into fully immersive virtual reality.

And then when you’re in there, hovering above you at all times is a little arrow. No matter where you turn your head, the arrow follows you. And you just look at that arrow and click with your hand, and it’ll open a menu to exit the immersive experience and go back to the flat 2D webpage, like it would look on a monitor on your laptop or your desktop.
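The flow described here (a preview image, a Load World button, then an enter-VR prompt) boils down to deferring the heavy 3D boot until the visitor opts in. Below is a hypothetical sketch of that gating logic, not taken from the plugin; `initWorld` stands in for whatever actually starts the renderer:

```javascript
// Sketch of the lazy "Load World" pattern: keep only a preview image on the
// page, and boot the heavy 3D world only when the visitor asks for it.
function attachLoadWorldButton(container, button, initWorld) {
  let loaded = false;
  button.addEventListener('click', () => {
    if (loaded) return;   // only boot the world once
    loaded = true;
    button.remove();      // swap the button out of the preview frame
    initWorld(container); // render the world inside the container
  });
  return () => loaded;    // lets the page ask whether the world is up
}
```

This keeps the page light for visitors who never enter the world, and avoids the cursor-trap problem of auto-loading a 3D canvas at the top of the page.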

Doc Pop: That’s really cool that you have this experience with Apple’s headset, which we can talk about after this next break. Before we do, Anthony, what is your favorite VR or AR experience you’ve had so far? Just real quick, what’s your favorite thing you’ve done in this world?

Anthony Burchell: I like world hopping in VRChat, just kind of bouncing room to room with friends. That’s probably my favorite thing to do. There’s a conference called Vket, the Virtual Market, and every, I think, summer and winter, they gather a bunch of creators in VRChat and give them a booth where they can kind of express themselves and show their work.

And I love going there with friends, just kind of bouncing booth to booth, trying on the little hats and things that they’re making. I even have a booth for an upcoming Vket that I’m working on. I think VRChat’s just the best experience right now for the Metaverse. It’s truly the full package. It’s got full body tracking, all of the features you want.

Doc Pop: Well, we’re gonna take one last break and when we come back we’ll wrap up our conversation with Anthony Burchell about XR experiences and his experience with the Apple Vision Pro. So stay tuned for that.

Doc Pop: Welcome back to Press This, a WordPress Community podcast. I’m your host, Doc Pop. Today I’m talking to Anthony Burchell, and right before this break, Anthony was telling us about his experience using the Apple Vision Pro SDK. Anthony, you have not actually used the headset yet, but you have used the SDK, and you have a pretty good feeling of how it’s going to feel.

Anthony Burchell: Yeah. So Apple just, I think maybe five days ago, released Xcode 15 Beta 2, and this version of Xcode introduces the Vision Pro SDK. And inside of that, it also adds a simulator. And the simulator can essentially do everything that the Vision Pro can do.

It essentially is the device. You can get a feel for the interfaces today. The very first thing I did was open up Xcode, open up the simulator, and go straight to the 3ov.xyz website. I went to the bottom, clicked the Load World button, and that’s how I was able to find out if it’s going to work there. And so far, the only thing that I haven’t been able to test, and the only thing that they haven’t really implemented an emulator for, is input in WebXR, like the clicking of your finger; that doesn’t exist yet. So teleportation isn’t working, but it will work once it comes out.

The thing I’m lacking right now is just the headset to get the input controls. But from my understanding, if they are fitting to the WebXR standard, then they’re just going to use the eyes as the laser for where the controllers would be pointing a laser, and then the tap of your fingers as the click.

So it’ll probably do exactly the same thing as the current experiences on other headsets, where you hold the controller, point the laser where you want to go, and then click the trigger to go there. It’ll probably be that, but they’ll separate the laser and the trigger, so you’ll have two inputs, your eyeballs and your fingers.

Doc Pop: We just have to address how crazy that is. So there are existing headsets out there, but what’s really being done with the Apple headset that may make it stand out, that may make it the thing that crosses over (it’s definitely the thing people are most excited about), is the combination of gestures and inputs.

The inputs are these cameras it has around your eyes, around your face, and it’s actually tracking what you’re looking at. And I’ve heard it’s anticipating, that it actually knows before you click, so the click kind of confirms it. From what I’ve heard, the research that they did at Apple shows that they can actually tell when you’re about to click before you even move your hands.

But the input is gonna feel like your hands are doing something and your eyes are just kind of looking in a direction, and that’s a thing you haven’t been able to test out yet. But everything else you’re saying is kind of working. You actually experienced the WordPress website through the SDK.

Anthony Burchell: Yeah. Yeah, it totally works. And it’s worth noting that the Quest Pro does have hand tracking, and it’s very similar, where you close your fingers to click to go somewhere. I was actually recently at the Meta campus doing a hackathon, because they wanted people to be building with the hand awareness and also table awareness and things like that.

So I built a hackathon project where we had butterflies that were flying from table to table, because it knew that there were tables. So the only difference there’s gonna be between the Quest Pro and the Vision Pro is that in the Quest Pro, the hands are what you use to point the laser, and you just click your finger to go there.

And that currently works with 3OV. So if you want to get a kind of a feel of the interface, you could do that in a Quest Pro today. But yeah, Apple’s really taken it to the next level by separating those and doing kind of I guess behavior analysis on what you’re doing while you’re browsing.

From what they say, it all stays on device, so that’s a good sign. It’s just a really exciting device; I hope I can get my hands on one. They have the developer kits opening up in July. So if any agencies are wanting to start planning things out, in July, on Apple’s website, I think you can just Google “dev kit Vision Pro”.

They will be opening up an application so that you could get a dev kit. And they’re famously a company that will retrieve their dev kits once you’re done. So you won’t be getting a free Vision Pro, but you will have some comfort knowing that you can start activating on it.

Doc Pop: Well, Anthony, I really appreciate your time today. It’s really fun talking to you about this new space. If people wanna learn more about what you’re working on, what’s a good way to do that, and where can they find out more about Three Object Viewer as well?

Anthony Burchell: Yeah, if they want to find out more about what I’m working on, I blog on 3OV.XYZ; I blog there a lot. And if you want to get started with the plugin today, it’s a free plugin in the WordPress Plugin Repository. Just search for Three Object Viewer.

I think you can even search Metaverse and it’ll come up. Install that, then make a new post and add the 3D Environment Block, and you’re already started. So, yeah.

Doc Pop: Thanks for listening to Press This, a WordPress community podcast on WMR. Once again, my name’s Doc and you can follow my adventures with Torque magazine over on Twitter @thetorquemag or you can go to torquemag.io where we contribute tutorials and videos and interviews like this every day. So check out torquemag.io or follow us on Twitter. You can subscribe to Press This on Red Circle, iTunes, Spotify, or you can download it directly at wmr.fm each week. I’m your host Doctor Popular I support the WordPress community through my role at WP Engine. And I love to spotlight members of the community each and every week on Press This.

The post Press This: XR and WP– WordPress-Powered Spatial Computing appeared first on Torque.
