2020/11/06 07:13:43

Google 3D AR



Among the technologies that countries around the world are racing to develop, one currently drawing the most attention is 3D AR display. It has also been quietly built into Google Search. This time, I would like to introduce an article about Google's AR.

Google Search's animal 3D display now supports "AR that hides behind objects". ARCore depth recognition makes it more realistic.


Google Search's "3D display" has evolved so that lifelike 3D animals appear as if they were right in front of you.

In Google Search on Android smartphones, the new 3D display feature "object blending" makes AR look more natural than before: a virtual cat can peek out from behind an object with only its head showing, and an AR panda's lower body can be hidden behind a desk. The display now blends into the actual 3D environment.

Google Search's 3D display is a feature that shows realistically animated 3D models when you search for animals such as Pomeranians, tigers, sharks, and penguins.

If you select "View in your space" ("AR" on iPhone), it recognizes a flat surface such as the floor and shows the animal in AR as if it were really standing there.
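For developers, this "recognize a flat surface and put the animal there" behavior corresponds to plane detection and hit testing in ARCore. The following is a minimal Kotlin sketch of that idea, not Google's actual Scene Viewer code; the function name placeOnFloor is hypothetical.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.Frame
import com.google.ar.core.Plane
import com.google.ar.core.TrackingState

// Place a virtual model where the user tapped, but only if the tap hits a
// plane (such as the floor) that ARCore is currently tracking.
fun placeOnFloor(frame: Frame, tapX: Float, tapY: Float): Anchor? {
    for (hit in frame.hitTest(tapX, tapY)) {
        val plane = hit.trackable as? Plane ?: continue
        if (plane.trackingState == TrackingState.TRACKING &&
            plane.isPoseInPolygon(hit.hitPose)
        ) {
            // An anchor keeps the model fixed in place as the camera moves.
            return hit.createAnchor()
        }
    }
    return null // no tracked surface under the tap yet
}
```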

It became available in Japanese environments this autumn and quickly became a hot topic, with people summoning a Google penguin into the room for a commemorative photo, or marveling, "The great white shark is so big! Scary!"

With this update, the AR display supports "object blending". The surroundings are recognized not just as flat surfaces but as a 3D scene with depth, and the parts of the model that should be hidden are actually hidden, producing a more natural display.

For example, a tiger can show half its body from behind a bush, or a wolf can poke only its head out from beyond the edge of the bed.

Conventional AR display could not recognize which things were in front of which; the 3D model was simply pasted onto the screen. As a result, a large animal that should be at the back of the room could appear to be standing on the desk in front of you, looking like a miniature, and its position and size could be confusing.

With object blending enabled, the scene is much easier to read realistically, although the results depend on the performance of the smartphone and its camera, the surrounding brightness, and the complexity of the scene.

If the scene doesn't render well, or if you want to see the whole animal without any part hidden, you can toggle object blending on and off by tapping the circle-shaped icon at the top right of the screen.


In AR (augmented reality) technology this processing is called "occlusion" (hiding), and in Google Search's 3D display it is realized by the Depth API (depth detection) of ARCore, Google's AR platform technology.
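Google Search's own implementation is not public, but the same capability is exposed to developers through the ARCore session configuration. Below is a minimal Kotlin sketch, assuming an already-created ARCore Session; enableDepthIfSupported and virtualPixelVisible are hypothetical names, and in practice the per-pixel comparison runs in a fragment shader rather than on the CPU.

```kotlin
import com.google.ar.core.Config
import com.google.ar.core.Session

// Enable the Depth API on an existing ARCore session, if the device supports it.
fun enableDepthIfSupported(session: Session) {
    val config = session.config
    if (session.isDepthModeSupported(Config.DepthMode.AUTOMATIC)) {
        config.depthMode = Config.DepthMode.AUTOMATIC
    }
    session.configure(config)
}

// Occlusion then reduces to one comparison per pixel: draw a virtual
// fragment only if it is closer to the camera than the real surface
// at that pixel.
fun virtualPixelVisible(virtualDepthMm: Int, realDepthMm: Int): Boolean =
    virtualDepthMm <= realDepthMm
```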

Google Developers Blog: Blending Realities with the ARCore Depth API

A major feature of ARCore's depth recognition is that it works on smartphones with only an ordinary single camera. Just as humans use the parallax between the left and right eyes as a cue for depth, it captures and compares many slightly offset images produced by the smartphone's movement, and estimates how near or far each pixel is.
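The result of this depth-from-motion estimation is exposed as a DEPTH16 image, one 16-bit value per pixel. The Kotlin sketch below, adapted from the sample pattern in the ARCore developer documentation, reads the estimated distance in millimeters at a single pixel; depthAtPixelMm is a hypothetical helper name.

```kotlin
import com.google.ar.core.Frame
import com.google.ar.core.exceptions.NotYetAvailableException
import java.nio.ByteOrder

// Read the estimated distance, in millimeters, at pixel (x, y) of the
// depth image that ARCore builds from camera motion.
fun depthAtPixelMm(frame: Frame, x: Int, y: Int): Int? =
    try {
        frame.acquireDepthImage().use { depthImage ->
            val plane = depthImage.planes[0]
            val byteIndex = x * plane.pixelStride + y * plane.rowStride
            val buffer = plane.buffer.order(ByteOrder.nativeOrder())
            buffer.getShort(byteIndex).toInt() and 0xFFFF
        }
    } catch (e: NotYetAvailableException) {
        null // depth needs a few frames of camera movement before it is ready
    }
```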

According to the Google Developers Blog, more than 200 million devices worldwide can use ARCore depth recognition. On devices with dual cameras or dedicated depth sensors, accuracy is even higher (where supported).

In Google Search's 3D display it is used for occlusion processing, but depth recognition, that is, understanding the scene in three dimensions, has many other uses.

For example, an AR character can climb stairs or step over obstacles, virtual snow can pile up on real objects, and virtual objects can react realistically when they hit real furniture or trees.
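One concrete form this takes in ARCore releases newer than this article is depth-based hit testing: Frame.hitTest can return DepthPoint results on arbitrary geometry, not just on flat detected planes. A hedged Kotlin sketch follows; anchorOnAnySurface is a hypothetical name.

```kotlin
import com.google.ar.core.Anchor
import com.google.ar.core.DepthPoint
import com.google.ar.core.Frame
import com.google.ar.core.Plane

// With depth enabled, a hit test can land on arbitrary geometry (a stair
// step, a chair, a bush) as a DepthPoint, not only on detected planes,
// so a character can be anchored partway up a staircase.
fun anchorOnAnySurface(frame: Frame, tapX: Float, tapY: Float): Anchor? {
    val hit = frame.hitTest(tapX, tapY)
        .firstOrNull { it.trackable is DepthPoint || it.trackable is Plane }
    return hit?.createAnchor()
}
```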

Google is looking for developers who want to take advantage of the ARCore Depth API and is providing early access.

ARCore Depth API Call for Collaborators – Google

Speaking of AR occlusion processing, Apple has already supported "People Occlusion" since iOS 13, which recognizes human bodies and limbs and draws AR content in front of or behind people as appropriate.

This is similar to the way the camera app's portrait mode estimates the outline of the subject and blurs the background. AR characters can appear between the fingers of your hand, and in the game Minecraft Earth, people can be drawn standing inside the block world.

On the other hand, iOS does not yet support depth recognition and occlusion for anything other than the human body, so if you use Google Search's AR display on an iOS device, animals are not hidden behind objects, only behind hands and people.

AR Minecraft "Minecraft Earth" gameplay premiere: the ARKit 3 occlusion demonstration at #WWDC19


Niantic, the developer of Pokémon GO, is also developing occlusion processing with its own technology. In a demo video released about a year and a half ago, Pikachu and Eevee could be seen running around, hiding behind planters and behind the legs of a passerby.

It is not clear when this will be introduced to Pokémon GO, but considering that the new "Buddy Adventure" feature has strengthened multi-user AR photos and the like, we may soon see Pokémon depicted blending more realistically into their surroundings.

Pokémon GO developer Niantic announces "Real World Platform", an OS that connects the virtual and the real (2018)
