I’m seeing a lot of people being excited, today, about putting in their orders for the first visionOS devices. It would be easy to dismiss this as simple Apple fan drivel, but the excitement on display goes well past that into the hyperbolic. “This is a once in a lifetime opportunity,” says one person, talking about how now is the time to develop programs for this platform before all the existing tech giants come along and repeat their horrible “attention slot-machine” applications for this “new way of computing.”
Other people are going to go off on the bit where this developer claims that Apple has invented augmented reality; I don’t need to do that. I’m just going to spend a little time thinking about what this “new way of computing” means.
Okay, so this is a VR headset that not only tracks your eye and head position, but also your hands. So you don’t need separate controllers for your hands and you don’t need external cameras in your VR game room. Cool, cool. Let’s go ahead and think about this. What does this mean for interacting with software? Well, it means you can ditch the mouse/trackpad, because the headset can see your hands and you just reach out and fiddle with the stuff you’re doing. Web pages, I guess, or image processing documents. Maybe there’ll be some kind of Graffiti for writing, or some bullshit floaty keyboard that’s got an annoying lag and shitty hit detection (think first generation iOS screen keyboard and how satisfying that was for typing emails). Okay. And games, of course: immersive flashy stuff where you are really in the middle of things. That could be cool. But yeah, you’re going to want to do multiplayer and collaborative stuff, and there will be some cool and tricky tech there.
But does this make interacting with your bank better? Does it make reading your emails more exciting? Even after my spam filters, I still get over 100 junk messages every morning. How does letting my computer know where I’m looking, what I’m interested in, where my hands are, and how fast I’m eating my toast while I read the newspaper and go through my inbox make my life better? Oh, and since the cameras need to see my hands, they’re really looking at the whole room. I already refuse to let Alexa into my house, on the grounds that I’m not interested in letting Amazon eavesdrop on me; why would I want my computer not only to be listening to everything in the room, but also looking at everything and everyone in the room? And why should I trust Apple any more than I trust Amazon?
Guys, this is not a fundamentally new way of interacting with a computer. It’s a progression. And, you know, it’s not unalloyed goodness. Take a device that does something cool, that’s specifically tailored for you. It knows everything about you and it anticipates your needs and desires. It knows all your private stuff, but that’s okay because it’s only for you. Now stick a network interface on it and let it talk to the rest of the world. Do you honestly think all that personal information is going to stay private, on that device? Of course it won’t.
Today, Amazon and Google (probably) aren’t tracking your eyes as you look at your screen, because the technology that could do it isn’t reliably deployed, and the information isn’t easily accessible to the application layer: your smartphone has the camera for it, your laptop might, but your web browser can’t easily get at it. When your eyes *are* your mouse, and your hands *are* the keyboard, well, you’ve just wired yourself up to a polygraph so you can look at e-commerce sites, and that’s not even the scariest part — it’s just the most obvious.
“But wait,” I imagine you cry, “you aren’t getting it at all! This is transformative, it allows you to do physical stuff with data, in ways that have been really hard up to now! Think of remote operation of robotics! Actually fly with your drone! Really immersive and expansive astronomy! The world is your mollusk!” Oh, yeah. And that will all totally be worth it for somebody.
Yeah. AR has been around for years. When the iPhone came out I said “finally, someone is going to do this well.” And they really did. But for AR? What is it they’re going to do well? AR stuff, I guess, but nobody cares (where “nobody” means some thousands of people).
Good luck with that.
Exactly. And I’m reminded of Stephen Wolfram’s “A New Kind of Science,” which wasn’t, really. I have seen *two* crowdfunding appeals, today, from developers who want everyone to chip in so they can buy this new device. The only way this could be more redonkulous would be for them to involve a blockchain.