Technology today is developing at a rapid pace, and tools that support our work and daily lives keep becoming commonplace; before long, they are right in front of us.
Is Apple quietly assembling the technologies it needs to build smart glasses?
For the past few years there have been constant rumors that Apple will release smart glasses, but once again the product did not appear at this year's WWDC. Watching the keynote, though, I began to suspect that Apple is putting various technologies in place to build smart glasses in the future.
© Provided by ASCII
So what technologies might form the basis of Apple smart glasses? Let's look back over the WWDC announcements.
■ "Spatial audio" for AR content
The first thing that drew my attention was the AirPods Pro wireless earphones. Announcing an AirPods Pro update in the keynote itself was an unexpected development.
With a software update, AirPods Pro will support "spatial audio."
This lets you enjoy three-dimensional sound when watching content mixed in 5.1ch or 7.1ch surround or Dolby Atmos.
What makes the update really interesting, though, is that the built-in accelerometer tracks the movement of your head, so the sound stays anchored to its source rather than shifting with you.
If you are simply watching a movie or drama on an iPhone, iPad, or TV, stereo sound is enough; there is no need to track head movement with an accelerometer. You sit facing the screen, hearing the left and right channels, and your head barely moves.
With AR content, however, the story changes. When you listen through AirPods Pro while viewing AR content on an iPad, you are expected to walk freely around the room holding the device, so your head may turn through a full 360 degrees. That is where the accelerometer comes in: no matter which way you face, the sound keeps coming from its fixed position in the scene.
At first this will be used for enjoying AR games on the iPad. Eventually, wearing smart glasses together with AirPods Pro, you should get the same three-dimensional effect: even as you move around, the sound will keep coming from its fixed direction.
AirPods Pro is also equipped with powerful noise canceling. Apply spatial audio while canceling outside noise, and you should be completely immersed in the world of AR.
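How might an app tap that head tracking? In iOS 14, Core Motion exposes AirPods Pro head orientation through CMHeadphoneMotionManager. The sketch below only prints the attitude; feeding it into an audio engine and handling errors are left out, and this is my illustration of the mechanism, not Apple's own implementation.

```swift
import CoreMotion

// Sketch: reading head orientation from AirPods Pro via Core Motion's
// CMHeadphoneMotionManager (iOS 14). A spatial-audio renderer could
// rotate the sound field by the opposite of these angles so that
// sources stay fixed in the room as the listener turns.
let headTracker = CMHeadphoneMotionManager()

func startHeadTracking() {
    guard headTracker.isDeviceMotionAvailable else { return }
    headTracker.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let attitude = motion?.attitude else { return }
        // Yaw tells us which way the listener is facing.
        print("yaw: \(attitude.yaw), pitch: \(attitude.pitch)")
    }
}
```

In practice the compensation happens inside Apple's audio stack; the point is that the earphones themselves report the motion data, so the phone knows where your head is pointing.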
■ "App Clips" launch apps on the spot
The other keynote announcement that made me wonder whether it is meant for smart glasses was App Clips.
With App Clips, QR codes and NFC tags are placed around town. When you scan one with your iPhone, a small slice of the app is downloaded and is ready to use immediately.
Over the last few years, all kinds of stores have built their own apps, but getting users to actually download them is difficult. App Clips looks like Apple's way of lowering that barrier so apps get into users' hands, revitalizing the app market.
App Clips supports "Sign in with Apple," which lets you register as a member smoothly with your Apple ID, and Apple Pay works as well. In other words, when you find an App Clip in town, you can become a member and complete payment on the spot.
For example, if a rental bicycle in town carries an App Clip code, you can scan it with your iPhone, finish membership registration and payment on the spot, and ride off immediately.
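In code, that flow could look something like the sketch below: an App Clip receives the URL encoded in the scanned code through an NSUserActivity, delivered to SwiftUI via the onContinueUserActivity modifier (iOS 14). The bike-rental names and the URL format here are my assumptions for illustration, not a real service.

```swift
import SwiftUI

// Sketch of a hypothetical bike-rental App Clip entry point.
// Scanning the code in town hands the clip an invocation URL,
// e.g. https://example.com/bike?id=42 (assumed format).
@main
struct BikeRentalClip: App {
    @State private var bikeID: String?

    var body: some Scene {
        WindowGroup {
            Text(bikeID.map { "Renting bike \($0)" } ?? "Scan a bike code")
                .onContinueUserActivity(NSUserActivityTypeBrowsingWeb) { activity in
                    guard let url = activity.webpageURL,
                          let components = URLComponents(url: url,
                                                         resolvingAgainstBaseURL: false)
                    else { return }
                    // Pull the bike ID out of the scanned URL's query.
                    bikeID = components.queryItems?
                        .first(where: { $0.name == "id" })?.value
                }
        }
    }
}
```

Sign in with Apple and Apple Pay would then slot into this same flow, which is why registration and payment can finish on the spot.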
Apple is also planning its own scannable code, the "App Clip Code," alongside ordinary QR codes.
That made me wonder: why does Apple need to prepare a code format of its own?
■ Completing a payment with a gesture?
App Clips targets both NFC tags and QR codes. If only the iPhone were assumed, NFC alone would seem sufficient: recent iPhones can read NFC tags, and a tag that responds the moment you hold your phone over it is all you would need.
Certainly, NFC tags cost money, so supporting QR codes, which only need to be printed, makes sense. But why would Apple go as far as creating its own code?
What came to my mind was preparation for the smart glasses to be introduced in the future.
Smart glasses will naturally be equipped with a camera. It would be very convenient if the glasses' camera could read an App Clip Code around town and launch the app.
The app launches the moment the wearer looks at an App Clip Code. If user authentication is handled by "Sign in with Apple" and payment by Apple Pay, the user never needs to operate anything by hand. Starting a transaction just by looking at a code would be hard to resist.
This year, Apple has also added the ability to read hand movements from camera frames for use alongside ARKit. If, for example, you raise your thumb and the smart glasses read that "like" gesture as your OK, confirming a payment with a simple gesture would make for a very natural user interface.
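The hand-pose detection Apple introduced this year lives in the Vision framework (iOS 14) and can run on camera frames, including those from an AR session. Below is a minimal sketch of spotting a thumbs-up; the heuristic (thumb tip above the wrist in Vision's bottom-left-origin coordinates) is my own simplification, not Apple's gesture logic.

```swift
import Vision
import CoreVideo

// Sketch: detect a thumbs-up in a single camera frame using
// VNDetectHumanHandPoseRequest (iOS 14).
func isThumbsUp(in pixelBuffer: CVPixelBuffer) -> Bool {
    let request = VNDetectHumanHandPoseRequest()
    request.maximumHandCount = 1

    let handler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer,
                                        orientation: .up)
    do {
        try handler.perform([request])
    } catch {
        return false
    }

    guard let hand = request.results?.first,
          let thumbTip = try? hand.recognizedPoint(.thumbTip),
          let wrist = try? hand.recognizedPoint(.wrist),
          thumbTip.confidence > 0.5, wrist.confidence > 0.5
    else { return false }

    // Vision's normalized coordinates put the origin at the bottom-left,
    // so a raised thumb sits at a larger y value than the wrist.
    return thumbTip.location.y > wrist.location.y
}
```

A real payment flow would of course demand far more than a single-frame heuristic, but it shows that the building block for "look, gesture, pay" already exists.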
■ Unprecedentedly convenient and interesting smart glasses may be born
The iPad Pro released this spring is equipped with a LiDAR sensor, which measures distance by timing how long a laser pulse takes to reflect off an object and return. As a result, recognition of objects and spaces in AR has improved dramatically.
The LiDAR sensor is highly likely to appear not only in the iPad Pro but also in the new iPhone models due this fall.
Space is recognized by the LiDAR sensor; spatial audio is realized by AirPods Pro; and the App Clip Code gives the camera a mechanism for launching apps.
Simply combining these could give birth to smart glasses more convenient and interesting than anything that has existed before.