Before it launched, there was concern that Apple’s Vision Pro headset could struggle to find a strong ‘killer app.’ Well, Apple might have found one, as a fresh patent from the company explains that future versions of the device could let you see the invisible energy that floats around you.

No, we’re not talking about some kind of New Age mysticism. Instead, Apple is apparently working on letting the Vision Pro visualize things like electrical currents, radio signals, Wi-Fi output, and more. This ability could help engineers diagnose problems in your home, for example - or let you fix them yourself.

Aside from repair work, Apple’s patent details how headset wearers could get extra experiences with their device thanks to its ability to detect invisible signals. For example, Apple says that “billboards, posters, or other print or screen media in the physical environment may emit non-visible light such as light that can be detected and visualized.” Think 3D movies (or commercials), only with a Vision Pro.

Once a sensor has picked up an invisible signal, the device then obtains a depth map of the environment around you and looks for contextual objects near where the signal was detected. That might mean it looks for a Wi-Fi router if internet signals are picked up, for instance. The headset would then display a visualization of the signal in the appropriate place, overlaid in augmented reality.

Interestingly, Apple explains that these sensors don’t necessarily need to be on the headset itself, but could be found on a connected device, such as an Apple Watch or an iPhone. That would presumably offload some of the processing from the headset to another device, helping to keep the processing power and temperature at reasonable levels.

This technology might not be limited to invisible signals, but could also include physical objects that are simply unseen by the headset wearer. Apple says this could include “hidden objects such as objects with known locations that are obscured from view by other physical objects.” That could be useful for navigation systems - for example in Apple’s long-rumored self-driving car.

Since this is just a patent, there’s no way of knowing if Apple will ever put this idea into practice - but if it does, it could be a great new use case for the Vision Pro.
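To make the patent’s detect-then-anchor idea concrete, here’s a minimal sketch of that pipeline in Python. Everything here is illustrative - the names (`Detection`, `nearest_emitter`, `overlay_anchor`), the toy object list, and the distance threshold are assumptions, not anything from Apple’s actual patent or APIs:

```python
# Hypothetical sketch: a sensor reports an invisible signal at a 3D point;
# we search nearby known objects (from a depth map) for a plausible emitter,
# and anchor the AR overlay there - or at the raw detection point otherwise.
from dataclasses import dataclass

@dataclass
class Detection:
    kind: str                        # e.g. "wifi", "ir"
    position: tuple                  # (x, y, z) where the signal was sensed

# Toy stand-in for objects recovered from the environment's depth map.
CONTEXT_OBJECTS = [
    {"label": "router", "position": (1.0, 0.5, 2.0), "emits": "wifi"},
    {"label": "poster", "position": (3.0, 1.5, 0.0), "emits": "ir"},
]

def _dist(a, b):
    """Euclidean distance between two 3D points."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def nearest_emitter(detection, objects, max_dist=1.5):
    """Return the closest object of the right type within max_dist, or None."""
    candidates = [o for o in objects
                  if o["emits"] == detection.kind
                  and _dist(o["position"], detection.position) <= max_dist]
    return min(candidates,
               key=lambda o: _dist(o["position"], detection.position),
               default=None)

def overlay_anchor(detection, objects=CONTEXT_OBJECTS):
    """Pick where the AR visualization should be drawn."""
    obj = nearest_emitter(detection, objects)
    return obj["position"] if obj else detection.position
```

So a Wi-Fi reading sensed near the router snaps its overlay onto the router, while a reading with no nearby candidate is just drawn where it was detected.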