WO2021036520A1 - An image processing method and apparatus, device, and storage medium - Google Patents
An image processing method and apparatus, device, and storage medium
- Publication number
- WO2021036520A1 (PCT/CN2020/100713, CN2020100713W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- virtual object
- reference surface
- image
- display size
- determining
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/20—Editing of 3D images, e.g. changing shapes or colours, aligning objects or positioning parts
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/24—Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30244—Camera pose
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2210/00—Indexing scheme for image generation or computer graphics
- G06T2210/21—Collision detection, intersection
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2012—Colour editing, changing, or manipulating; Use of colour codes
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2219/00—Indexing scheme for manipulating 3D models or images for computer graphics
- G06T2219/20—Indexing scheme for editing of 3D models
- G06T2219/2016—Rotation, translation, scaling
Definitions
- This application relates to the field of computer vision technology, and relates to but is not limited to an image processing method and its device, equipment and storage medium.
- Augmented Reality (AR) is a technology that calculates the position and angle of the camera image in real time and overlays corresponding virtual images.
- AR technology enhances the communication and interaction between users and the surrounding world with an immersive and interactive experience.
- AR technology is widely used in various fields such as education, shopping, art and games.
- The method used in related technologies generally requires the user to manually zoom the object in or out and then select a placement position, which not only increases user operations but also prevents an interactive AR experience.
- the embodiments of the present application expect to provide an image processing method, apparatus, equipment, and storage medium.
- An embodiment of the present application provides an image processing method, the method includes: determining a first reference surface on which a virtual object is placed based on an image collected by an image collecting device; acquiring the pose information of the image collecting device relative to the first reference surface and the virtual object to be placed; determining the display size of the virtual object based on the pose information; and rendering the virtual object on the image according to the display size.
- the determining the display size of the virtual object based on the pose information includes:
- the display size of the virtual object is determined based on the included angle.
- In this way, the display size can be determined from the angle between the second reference plane, where the image acquisition device is located, and the first reference plane, thereby realizing automatic adjustment of the display size of the virtual object and improving the intelligence of the terminal.
- the determining the angle between the second reference surface where the image acquisition device is located and the first reference surface based on the pose information includes:
- determining first coordinate information of the image acquisition device in a first coordinate system based on the pose information, where the first coordinate system is a coordinate system established based on the first reference plane;
- an angle between the first reference surface and the second reference surface is determined.
- the angle between the first reference surface and the second reference surface can be determined through the above solution, thereby providing a necessary data basis for the subsequent determination of the display size based on the angle.
- the determining the display size of the virtual object based on the included angle includes:
- the display size of the virtual object is determined.
- the display size corresponding to the angle between the first reference surface and the second reference surface can be determined, so that the image processing device can render the virtual object based on the display size.
- the method further includes:
- the rendering the virtual object onto the image according to the display size includes:
- the placement area of the virtual object will also move accordingly, thereby ensuring that the virtual object is always in the display screen of the first device and improving user experience.
- the method further includes:
- the virtual object is rendered on the image according to preset first color information, to indicate that the virtual object exceeds the boundary.
- In the embodiments of the present application, when the placement area of the virtual object exceeds the boundary of the first reference surface, the virtual object is rendered with first color information that differs from its original color information, so that the user can intuitively see that the virtual object has exceeded the boundary and adjust its placement area accordingly.
- An embodiment of the present application provides an image processing device, and the device includes:
- the first determining module is configured to determine the first reference plane on which the virtual object is placed based on the image collected by the image collecting device;
- a first acquisition module configured to acquire the pose information of the image acquisition device relative to the first reference surface and the virtual object to be placed
- the second determining module is configured to determine the display size of the virtual object based on the pose information
- the first rendering module is configured to render the virtual object on the first reference surface in the image according to the display size.
- An embodiment of the present application provides an image processing device. The image processing device at least includes a memory, a communication bus, and a processor, wherein: the memory is configured to store an image processing program; the communication bus is configured to implement connection and communication between the processor and the memory; and the processor is configured to execute the image processing program stored in the memory to implement the steps of the image processing method described in the above solution.
- An embodiment of the present application provides a storage medium on which an image processing program is stored, and when the image processing program is executed by a processor, the steps of the image processing method described in the above solution are implemented.
- An embodiment of the present application provides a computer program, including computer readable code. When the computer readable code runs on an electronic device, the processor in the electronic device executes the steps of the image processing method described in the above solution.
- The embodiments of the present application provide an image processing method and apparatus, device, and storage medium. The method includes: first determining a first reference surface on which a virtual object is placed based on an image captured by an image capture device; then acquiring the pose information of the image capture device relative to the first reference surface and the virtual object to be placed; and determining the display size of the virtual object based on the pose information. When the pose information indicates that the image capture device is far from the first reference surface, the display size of the virtual object is reduced; when the pose information indicates that the image capture device is close to the first reference surface, the display size of the virtual object is increased. Finally, the virtual object is rendered onto the image according to the display size. In this way, the display size of the virtual object can be adjusted automatically by adjusting the pose between the image acquisition device and the plane on which the virtual object is placed, without manual adjustment, which not only simplifies the operation but also improves the interactivity and entertainment value of the augmented reality application.
- FIG. 1A is a schematic diagram of a network architecture of an image processing method according to an embodiment of this application.
- FIG. 1B is a schematic diagram of yet another network architecture of the image processing method according to an embodiment of this application.
- FIG. 2A is a schematic diagram of an implementation flow of an image processing method provided by an embodiment of this application.
- 2B is a schematic diagram of an application scenario of an image processing method according to an embodiment of the application.
- FIG. 3 is a schematic diagram of another implementation process of the image processing method provided by an embodiment of the application.
- FIG. 4 is a schematic diagram of still another implementation process of the image processing method provided by an embodiment of the application.
- FIG. 5 is a schematic diagram of an application scenario of an image processing method provided by an embodiment of the application.
- FIG. 6 is a schematic diagram of the composition structure of an image processing device provided by an embodiment of the application.
- FIG. 7 is a schematic diagram of the composition structure of an image processing device provided by an embodiment of the application.
- The terms "first/second/third" are used only to distinguish similar objects and do not denote a specific order of objects. Understandably, where permitted, the specific order or sequence of "first/second/third" may be interchanged, so that the embodiments of the present application described herein can be implemented in a sequence other than those illustrated or described herein.
- FIG. 1A is a schematic diagram of a network architecture of an image processing method according to an embodiment of the application.
- the network architecture includes a first device 101, a server 102, and a network 103.
- The first device 101 is connected to the server 102 through the network 103.
- the network 103 may be a wide area network or a local area network, or a combination of the two, and uses wireless links to implement data transmission.
- the mobile terminal of the first device 101 may be, for example, a smart phone or AR glasses.
- the first device 101 is exemplarily shown in the form of a smart phone.
- the first device 101 may include an image capture device 1011.
- the server 102 may refer to one server, or may be a server cluster composed of multiple servers, a cloud computing center, etc., which is not limited herein.
- The image acquisition device in the first device 101 can collect real image data. The first device 101 then sends the real image data collected by the image acquisition device, the virtual object to be added, and the internal parameters of the image acquisition device to the server 102 via the network 103. The server 102 determines the plane in the real image data and the pose information of the image acquisition device relative to that plane, determines the display size of the virtual object according to the pose information, renders the virtual object on the determined plane according to the display size, and sends the rendered augmented reality image data to the first device 101, which outputs it.
- FIG. 1B is a schematic diagram of another network architecture of the image processing method according to the embodiment of the application.
- the network architecture includes only the first device 101, and the first device 101 may be a smart phone or AR glasses, etc.
- the first device 101 is also exemplarily shown in the form of a smart phone.
- the first device 101 includes at least an image acquisition device and a processor.
- The image acquisition device can collect real image data. The processor in the first device 101 then determines, based on the real image data collected by the image acquisition device and the internal parameters of the image acquisition device, the plane in the real image data and the pose information of the image acquisition device relative to that plane. The display size of the virtual object is then determined according to the pose information, and the virtual object is rendered on the determined plane according to the display size to obtain augmented reality image data, which the first device 101 outputs.
- With reference to the schematic diagrams of the network architectures shown in FIG. 1A and FIG. 1B, the following describes various embodiments of the image processing method, image processing apparatus, and equipment.
- FIG. 2A is a schematic diagram of an implementation process of the image processing method provided by an embodiment of the application
- FIG. 2B is a schematic diagram of an application scenario of the image processing method according to an embodiment of the application, which will be described in conjunction with the steps shown in FIG. 2A.
- the image processing device may be the server as shown in FIG. 1A, or the first device as shown in FIG. 1B.
- Step S201 The image processing device determines a first reference surface on which the virtual object is placed based on the image collected by the image collecting device.
- the first reference surface may be a flat surface, and in some embodiments, the first reference surface may also be an inclined surface.
- In a case where the first reference surface is a plane, the first reference plane may be a horizontal plane.
- the image acquisition device is a photographing or imaging device used to acquire images of the current real scene for the purpose of augmented reality. Therefore, the images collected by the image acquisition device can be regarded as real world images.
- the image processing device must first identify each reference surface in the image, and determine the first reference surface from each reference surface.
- each plane in the image can be detected.
- Plane visualization can also be used to help the user determine the first reference plane on which to place the virtual object. For example, a detected plane on which the virtual object can be placed may be highlighted; a visual difference may be created between different planes to avoid confusion when placing the virtual object; or the plane the user is viewing or pointing to may be highlighted, rather than highlighting multiple planes at once.
- the first reference surface on which the virtual object is placed is determined among the determined reference surfaces.
- the image acquisition device may be a device in the first device.
- In some embodiments, the method further includes: the first device sends the image collected by the image acquisition device and the internal parameters of the image acquisition device to the server.
- Step S202 The image processing device acquires the pose information of the image acquisition device relative to the first reference surface and the virtual object to be placed.
- When step S202 is implemented, the coordinates of each vertex on the first reference plane are first acquired; then the pose information of the image acquisition device relative to the first reference plane is calculated according to the coordinates of each vertex.
- In some embodiments, the Perspective-n-Point (PnP) problem can be solved using the relative coordinates of each vertex, so as to obtain the pose information of the image acquisition device relative to the first reference surface, that is, the translation and rotation of the image acquisition device relative to the world coordinate system in which the first reference plane is located.
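The patent does not spell out how the PnP solution is computed. For coplanar vertices, the pose can equivalently be recovered by estimating a plane-to-image homography and decomposing it with the camera intrinsics; the following pure-NumPy sketch illustrates that route under assumed intrinsics and synthetic correspondences (all names are illustrative, and a production system would normally call a library PnP solver instead):

```python
import numpy as np

def homography_dlt(obj_xy, img_xy):
    # Direct Linear Transform: homography mapping plane points (X, Y) to pixels (u, v).
    A = []
    for (X, Y), (u, v) in zip(obj_xy, img_xy):
        A.append([-X, -Y, -1, 0, 0, 0, u * X, u * Y, u])
        A.append([0, 0, 0, -X, -Y, -1, v * X, v * Y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)  # nullspace vector = homography up to scale

def plane_pose(obj_xy, img_xy, K):
    """Rotation and translation of points on the z=0 reference plane
    into the camera frame, from >= 4 point correspondences."""
    H = homography_dlt(obj_xy, img_xy)
    B = np.linalg.inv(K) @ H          # proportional to [r1 r2 t]
    lam = 1.0 / np.linalg.norm(B[:, 0])
    if B[2, 2] < 0:                   # enforce the plane lying in front of the camera
        lam = -lam
    r1, r2, t = lam * B[:, 0], lam * B[:, 1], lam * B[:, 2]
    R = np.column_stack([r1, r2, np.cross(r1, r2)])
    return R, t
```

The translation and rotation relative to the plane's world frame then follow by inverting this camera-from-plane transform.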
- In some embodiments, techniques such as SLAM (Simultaneous Localization And Mapping) and an IMU (Inertial Measurement Unit) may also be involved in obtaining the pose information.
- Step S203 The image processing device determines the display size of the virtual object based on the pose information.
- In the embodiments of the present application, the included angle between the second reference surface where the image capture device is located and the first reference surface can be determined according to the rotation amount or rotation matrix in the pose information, and then the display size of the virtual object is determined based on the included angle.
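As one possible illustration of this step: if the pose rotation is available as a camera-to-world rotation matrix and the first reference plane is the xoy plane of the world frame, the included angle can be taken as the angle between the two planes' normals (the world z axis and the camera's optical axis). These conventions are assumptions for the sketch, not the patent's exact formulation:

```python
import numpy as np

def plane_angle_from_rotation(R_cam_to_world):
    """Angle (degrees) between the first reference plane (world xoy)
    and the second reference surface (camera/screen plane)."""
    n1 = np.array([0.0, 0.0, 1.0])   # normal of the first reference plane
    n2 = R_cam_to_world[:, 2]        # optical axis = normal of the screen plane
    cos_ang = abs(float(n1 @ n2)) / np.linalg.norm(n2)
    return float(np.degrees(np.arccos(np.clip(cos_ang, -1.0, 1.0))))
```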
- When the angle between the second reference surface and the first reference surface becomes smaller, it can be considered that the first device is approaching the first reference surface, and the display size of the virtual object is enlarged.
- When the angle between the second reference surface and the first reference surface becomes larger, it can be considered that the first device is moving away from the first reference surface, and the display size of the virtual object is reduced. There is thus no need to manually adjust the display size of the virtual object, which simplifies the operation and increases the interactivity of AR.
- Step S204 The image processing device renders the virtual object onto the image according to the display size.
- In some embodiments, the image processing device may superimpose the virtual object on the image according to the determined display size and the pose information, so as to obtain an augmented reality image.
- Step S204 can be regarded as rendering the virtual object at the display size on the display interface of the first device that shows the real image. The virtual object may be, for example, a virtual item or a virtual character, and the user may control the activity of the virtual object through the touch screen or a handle to achieve augmented reality effects.
- In the embodiments of the present application, the first reference surface on which the virtual object is placed is determined based on the image captured by the image capture device; then the pose information of the image capture device relative to the first reference surface and the virtual object to be placed are acquired, and the display size of the virtual object is determined based on the pose information. When the pose information indicates that the image capture device is far from the first reference surface, the display size of the virtual object is reduced; when the pose information indicates that the image capture device is close to the first reference surface, the display size is increased. Finally, the virtual object is rendered on the image according to the display size. In this way, the display size of the virtual object is adjusted automatically by adjusting the pose between the image capture device and the plane on which the virtual object is placed, without manual adjustment, which not only simplifies the operation but also improves the interactivity and entertainment value of the augmented reality application.
- step S203 can be implemented through the following steps:
- Step S2031 Determine the angle between the second reference surface where the image acquisition device is located and the first reference surface based on the pose information.
- the image processing device is a smart phone
- the second reference surface where the image acquisition device is located can be considered as the surface where the display screen of the smart phone is located.
- step S2031 can be implemented through the following steps:
- Step 11 Determine first coordinate information of the image acquisition device in a first coordinate system based on the pose information.
- the first coordinate system is a coordinate system established based on the first reference plane.
- In some embodiments, the first coordinate system may adopt the world coordinate system. In this case, the angle between the second reference surface and the first reference surface is calculated based on the spatial position relationship of the two reference surfaces in the world coordinate system.
- the first coordinate system may also be a coordinate system formed by using a preset point in the first reference plane as the origin and three coordinate axes perpendicular to each other.
- the pose information includes the amount of translation and rotation of the image acquisition device relative to the coordinate system where the first reference plane is located
- the amount of translation and rotation of the image acquisition device in the first coordinate system can be determined according to the amount of translation and rotation.
- the first coordinate information is the amount of translation and rotation of the image acquisition device relative to the coordinate system where the first reference plane is located
- Step 12 Obtain second coordinate information of the intersection, with the first reference surface, of a straight line that passes through the optical center of the lens in the image acquisition device and is perpendicular to the second reference surface.
- the straight line passing through the optical center of the lens in the image acquisition device is perpendicular to the second reference plane where the image acquisition device is located.
- The second coordinate information can be determined once the first coordinate information of the image acquisition device and the angle between the straight line and the first reference plane are known.
- Step 13 Determine an included angle between the first reference surface and the second reference surface based on the first coordinate information and the second coordinate information.
- As shown in FIG. 2B, the first reference surface is 211 and the second reference surface is 212. The straight line 213 passes through the optical center O of the lens in the image capture device and is perpendicular to the second reference surface 212.
- The intersection of the straight line 213 and the first reference plane 211 is the point A in FIG. 2B.
- the first coordinate information may be the coordinates of the optical center O in the first coordinate system.
- The distance l_OA between O and A can be determined according to formula (1):

  l_OA = sqrt((x1 - x2)^2 + (y1 - y2)^2 + (z1 - z2)^2)   (1)

  where (x1, y1, z1) is the first coordinate information of the optical center O and (x2, y2, z2) is the second coordinate information of the point A in the first coordinate system.
- The first reference plane 211 is a plane formed by two coordinate axes of the first coordinate system. Assuming that in the embodiment of the present application the first reference plane is the plane formed by xoy, then in FIG. 2B the distance |OP| from O to the plane is z1. The included angle between the first reference surface and the second reference surface can then be determined from l_OA and z1.
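Under the geometry of FIG. 2B (O the optical center, A the intersection of line 213 with plane 211, and |OP| = z1), and assuming that OA is perpendicular to the second reference surface, the included angle θ between the two surfaces satisfies cos θ = z1 / l_OA. A minimal sketch of this calculation, with illustrative variable names:

```python
import math

def included_angle(optical_center, intersection):
    """Included angle (radians) between the first reference plane (xoy)
    and the second reference surface, assuming the line OA through the
    optical center O is perpendicular to the second reference surface."""
    x1, y1, z1 = optical_center
    x2, y2, z2 = intersection
    # Formula (1): Euclidean distance between O and A.
    l_oa = math.sqrt((x1 - x2) ** 2 + (y1 - y2) ** 2 + (z1 - z2) ** 2)
    # OA is normal to the camera plane, so cos(theta) = z1 / l_OA.
    return math.acos(max(-1.0, min(1.0, z1 / l_oa)))
```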
- Step S2032 Determine the display size of the virtual object based on the included angle.
- step S2032 can be implemented through the following steps:
- Step 21 Obtain the corresponding relationship between the display size and the angle.
- the corresponding relationship may include a reference angle and a reference size corresponding to the reference angle, for example, the reference angle may be 45 degrees, corresponding to a reference size.
- the corresponding relationship may also include multiple reference angles, and each reference angle corresponds to a reference size.
- Step 22 Determine the display size of the virtual object based on the corresponding relationship and the included angle.
- Step 22 may be implemented by determining the proportional relationship between the included angle and the reference angle, and then determining the display size of the virtual object according to that proportional relationship and the reference size corresponding to the reference angle.
- the reference angle can be 45 degrees
- the reference size is 20*20
- the included angle between the first reference plane and the second reference plane is 30 degrees
- Since the included angle is smaller than the reference angle, the image acquisition device is considered to be focused on a close area, and the size is enlarged at this time.
- The display size is adjusted to 3/2 of the reference size (45/30 = 3/2), that is, the display size is 30*30. If the included angle between the first reference plane and the second reference plane is 60 degrees, the included angle is larger than the reference angle; it is then considered that the image acquisition device is focused on a distant area, and the size should be reduced.
- In some embodiments, step 22 may be implemented by determining the reference angle closest to the included angle and taking the reference size corresponding to that reference angle as the display size of the virtual object; alternatively, after the closest reference angle is determined, the display size of the virtual object may be determined according to the proportional relationship between the included angle and the closest reference angle and the reference size corresponding to that closest reference angle.
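The nearest-reference-angle lookup with proportional scaling described above can be sketched as follows; the single 45-degree → 20*20 entry mirrors the example in this section, and the table itself is hypothetical:

```python
# Hypothetical lookup table: reference angle (degrees) -> reference size.
REFERENCE_SIZES = {45.0: (20.0, 20.0)}

def display_size(angle_deg, refs=REFERENCE_SIZES):
    # Pick the reference angle closest to the measured included angle, then
    # scale its reference size in inverse proportion to the angle: a smaller
    # angle (device focused nearby) yields a larger display size.
    ref_angle = min(refs, key=lambda a: abs(a - angle_deg))
    scale = ref_angle / angle_deg
    w, h = refs[ref_angle]
    return (w * scale, h * scale)
```

With the values from the text, a 30-degree angle scales the 20*20 reference size by 45/30 = 3/2 to 30*30, while a 60-degree angle shrinks it to 15*15.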
- In the embodiments of the present application, the angle between the second reference surface where the image acquisition device is located and the first reference surface can be used to determine whether the image acquisition device is focused near or far; the display size is enlarged when focusing near and reduced when focusing far, so as to automatically adjust the display size of the virtual object and increase the interactivity and entertainment value of AR.
- the method further includes:
- Step 14 Determine a placement area of the virtual object on the first reference plane based on the second coordinate information and the display size of the virtual object.
- a circular area may be determined based on the coordinate point corresponding to the second coordinate information.
- The radius of the circular area may be determined according to the display size; for example, the display size may determine the radius of a circumscribed circle on the first reference plane.
- the placement area may also be determined according to the actual shape of the virtual object, for example, it may also be a square, a rectangle, a triangle, and so on.
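One plausible reading of the circumscribed-circle rule above, assumed here purely for illustration, is half the diagonal of the virtual object's rectangular footprint at the display size:

```python
import math

def placement_radius(width, height):
    # Radius of the circumscribed circle of a width x height footprint
    # on the first reference plane: half the footprint's diagonal.
    return math.hypot(width, height) / 2.0
```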
- In some embodiments, step S204 may be implemented as follows: based on the placement area of the virtual object on the first reference surface, the virtual object is rendered onto the image according to the display size. In this way, when the image acquisition device focuses on different positions, the placement area of the virtual object moves accordingly, ensuring that the virtual object always remains in the display screen of the first device and improving the user experience.
- FIG. 3 is a schematic diagram of another implementation process of the image processing method provided by the embodiment of the application. As shown in FIG. 3, after step S203, the method also includes:
- step S204' the image processing device determines the boundary of the first reference surface and the placement area of the virtual object.
- the boundary of the first reference surface can be determined according to the coordinates of each vertex of the first reference surface.
- the placement area of the virtual object can be determined in step 14.
- Step S205 The image processing device determines whether the placement area exceeds the boundary of the first reference surface.
- When the placement area exceeds the boundary of the first reference plane, go to step S206; when the placement area does not exceed the boundary of the first reference plane, go to step S207.
- Step S206 The image processing device renders the virtual object onto the image according to the preset first color information and display size, so as to prompt the virtual object to exceed the boundary.
- the first color information is different from the original color information of the virtual object, for example, it may be gray or red.
- Step S207 The image processing device renders the virtual object onto the image according to the preset second color information and display size.
- the second color information may be the original color information of the virtual object.
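Steps S205 to S207 can be sketched as follows, assuming the placement area is a circle in plane coordinates and the boundary of the first reference surface is approximated by an axis-aligned rectangle; both are simplifications of the patent's description, and the color values are illustrative:

```python
def exceeds_boundary(center, radius, bounds):
    # bounds = (xmin, ymin, xmax, ymax) of the first reference surface.
    x, y = center
    xmin, ymin, xmax, ymax = bounds
    return (x - radius < xmin or x + radius > xmax or
            y - radius < ymin or y + radius > ymax)

def render_color(center, radius, bounds,
                 original_color=(255, 255, 255), warn_color=(128, 128, 128)):
    # Second color = the object's original color; first color (e.g. gray)
    # signals that the placement area crosses the plane boundary.
    return warn_color if exceeds_boundary(center, radius, bounds) else original_color
```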
- FIG. 4 is a schematic diagram of still another implementation flow of the image processing method provided by an embodiment of the application. As shown in FIG. 4, the method includes:
- Step S401 The first device collects an image of a real scene through its own image acquisition device.
- The image acquisition device may be a camera. What the first device collects through its own image acquisition device is a real scene image, that is, an image of the real world captured by the image acquisition device.
- Step S402 The first device sends the real scene image to the server.
- When the first device is recording, it sends each collected frame of image to the server.
- Step S403 After receiving the real scene image, the server recognizes each reference surface in the real scene image.
- The server performs feature extraction on the received real scene image and fits planes to the extracted feature points, so as to determine each reference surface in the real scene image.
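The plane-fitting step above can be sketched with a simple RANSAC loop over 3-D feature points. The patent does not specify the fitting algorithm; RANSAC, the threshold, and the iteration count here are illustrative assumptions.

```python
import random
import numpy as np

def fit_plane_ransac(points, iters=200, threshold=0.02, seed=0):
    """Fit a dominant plane (n.x + d = 0, |n| = 1) to 3-D feature points
    with a basic RANSAC loop. Returns (normal, d, inlier_mask)."""
    rng = random.Random(seed)
    pts = np.asarray(points, dtype=float)
    best_mask, best_model = None, None
    for _ in range(iters):
        i, j, k = rng.sample(range(len(pts)), 3)
        v1, v2 = pts[j] - pts[i], pts[k] - pts[i]
        n = np.cross(v1, v2)
        norm = np.linalg.norm(n)
        if norm < 1e-9:
            continue  # degenerate (collinear) sample, try again
        n = n / norm
        d = -np.dot(n, pts[i])
        dist = np.abs(pts @ n + d)      # point-to-plane distances
        mask = dist < threshold
        if best_mask is None or mask.sum() > best_mask.sum():
            best_mask, best_model = mask, (n, d)
    return best_model[0], best_model[1], best_mask
```

In practice a server would run such a fit repeatedly, removing inliers each time, to recover multiple reference surfaces (ground, desktop, and so on) from one feature cloud.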
- Step S404: The server sends the information of each identified reference surface to the first device.
- the reference surface information may be the coordinates of each vertex of the reference surface.
- After the first device receives the reference surface information sent by the server, it visually distinguishes the different reference surfaces on the display interface. For example, when the recognized reference surface is the ground, it is shown as a red square; when the reference surface is a desktop, it is shown as a yellow square.
- Step S405: The first device determines the first reference surface on which the virtual object is placed and the virtual object to be placed.
- The first device displays the recognized reference surfaces, and the user can select the first reference surface on which to place the virtual object according to his or her own needs. At this time, the first device determines the first reference surface based on the user's selection operation.
- The first device may also determine the first reference surface from the identified reference surfaces according to a preset strategy.
- The preset strategy may be to use the reference surface with the largest area as the first reference surface.
- The preset strategy may also be to use the reference surface closest to the center of the image as the first reference surface.
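The two preset strategies above can be sketched as follows, assuming each reference surface is represented by its vertex coordinates in normalized image coordinates. The representation and function names are illustrative assumptions.

```python
def polygon_area(verts):
    """Shoelace area of a 2-D polygon given as [(x, y), ...]."""
    a = 0.0
    n = len(verts)
    for i in range(n):
        x0, y0 = verts[i]
        x1, y1 = verts[(i + 1) % n]
        a += x0 * y1 - x1 * y0
    return abs(a) / 2.0

def centroid(verts):
    """Mean of the vertices, used as a cheap stand-in for the surface center."""
    xs = [v[0] for v in verts]
    ys = [v[1] for v in verts]
    return (sum(xs) / len(xs), sum(ys) / len(ys))

def pick_first_reference(planes, strategy="largest", image_center=(0.5, 0.5)):
    """Choose the first reference surface: either the one with the largest
    area, or the one whose centroid is closest to the image center."""
    if strategy == "largest":
        return max(planes, key=polygon_area)
    cx, cy = image_center
    return min(planes, key=lambda p: (centroid(p)[0] - cx) ** 2 + (centroid(p)[1] - cy) ** 2)
```

Either strategy gives the device a sensible default when the user makes no explicit selection.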
- Step S406 The first device sends the determined first reference surface information and the virtual object to be placed to the server.
- the virtual objects to be placed may be characters, animals, buildings, objects, and so on.
- Step S407 The server determines the pose information of the image acquisition device relative to the first reference surface.
- The server determines the pose information of the image acquisition device relative to the first reference surface, which can also be regarded as the pose information of the first device relative to the first reference surface.
- Step S408: The server determines, based on the pose information, the included angle between the second reference surface where the image acquisition device is located and the first reference surface, as well as the placement area of the virtual object.
- Step S409 The server determines the display size of the virtual object based on the included angle.
- The first reference surface may be a plane parallel to the horizontal plane.
- The included angle between the second reference surface and the first reference surface may be an angle between 0 and 90 degrees.
- When the included angle gradually decreases from 90 degrees, it can be considered that the focus point of the image acquisition device is gradually approaching the user, and the display size of the virtual object can be gradually increased.
- When the included angle gradually increases from 0 degrees, it can be considered that the focus point of the image acquisition device is gradually moving away from the user, and the display size of the virtual object can be gradually reduced.
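The angle-to-size correspondence described above can be sketched with a simple monotone mapping. The patent only requires a stored correspondence between display sizes and angles; the linear form and the size bounds below are illustrative choices, not the patent's.

```python
def display_size_for_angle(angle_deg, min_size=0.2, max_size=1.0):
    """Map the included angle between the camera plane and the reference
    plane to a display size: 0 degrees (focus near the user) gives
    max_size, 90 degrees (focus far away) gives min_size, with a
    linear interpolation in between. Angles are clamped to [0, 90]."""
    angle = max(0.0, min(90.0, angle_deg))
    return max_size - (angle / 90.0) * (max_size - min_size)
```

A table-based lookup with interpolation would serve equally well; the key property is that the size decreases monotonically as the angle grows.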
- Step S410: The server renders the virtual object onto the real scene image according to the display size, based on the placement area, to obtain augmented reality image data.
- The server performs virtual-real superposition processing on the virtual object and the real scene image, so as to render the virtual object into the real scene image and thereby obtain the augmented reality image data.
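One way to read the "virtual-real superposition" of step S410 is plain alpha compositing. The sketch below assumes the virtual object has already been rasterized, at its display size, into an RGB sprite with a per-pixel alpha mask; the function name and array layout are illustrative.

```python
import numpy as np

def composite_virtual_object(scene, sprite, alpha, top_left):
    """Alpha-blend a rendered sprite of the virtual object onto the real
    scene image at the placement area's top-left corner. Assumes the
    sprite fits entirely inside the scene bounds."""
    out = scene.astype(float).copy()
    y, x = top_left
    h, w = sprite.shape[:2]
    region = out[y:y + h, x:x + w]
    a = alpha[..., None]  # per-pixel alpha in [0, 1], broadcast over RGB
    out[y:y + h, x:x + w] = a * sprite + (1 - a) * region
    return out.astype(scene.dtype)
```

A production renderer would instead project a 3-D model through the camera pose, but the compositing of the final pixels follows the same blend.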
- Step S411 The server sends the augmented reality image data to the first device.
- Step S412 Based on the augmented reality image data, the first device outputs and displays a corresponding augmented reality image.
- In the image processing method provided by this embodiment, the server identifies the reference surfaces in the image and sends the reference surface information to the first device.
- The first device visually displays the identified reference surfaces so that the user can determine the first reference surface on which the virtual object is placed.
- The server then determines the included angle between the first device and the first reference surface, and determines the display size of the virtual object according to the included angle, so that the virtual object is rendered into the real scene image according to the determined display size to obtain the augmented reality image data, which is sent to the first device for display.
- Since the server determines the display size of the virtual object and performs the virtual-real superposition processing, the calculation load of the first device is reduced and the display size is adjusted automatically, which can improve the user's immersive experience.
- In practical applications, the angle between the plane where the mobile phone camera is located and the plane where the object is placed is generally less than 90 degrees.
- When the included angle decreases, the camera looks closer; according to the principle that near objects appear large and far objects appear small, the placed object becomes larger and is placed nearby. When the included angle increases, the mobile phone camera looks farther away.
- Along the corresponding line of sight, the object becomes smaller and is placed farther away.
- The placed object moves with the camera's line of sight, so the placement position of the virtual object moves accordingly.
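The geometry in these paragraphs — the line of sight through the lens optical center perpendicular to the camera plane, its intersection with the placement plane, and the included angle between the two planes — can be sketched as follows. The sketch assumes the first reference surface is the plane z = 0 of the first coordinate system; the function name is illustrative.

```python
import math

def gaze_point_and_angle(cam_pos, view_dir):
    """Intersect the camera's line of sight (through the optical center,
    perpendicular to the camera plane) with the reference plane z = 0.
    Returns (intersection point, angle between the two planes in degrees).
    Because the view direction is the camera plane's normal and (0, 0, 1)
    is the reference plane's normal, the dihedral angle is the angle
    between those two normals (folded into [0, 90])."""
    cx, cy, cz = cam_pos
    vx, vy, vz = view_dir
    if abs(vz) < 1e-9:
        return None, 90.0  # line of sight parallel to the plane: no gaze point
    t = -cz / vz
    point = (cx + t * vx, cy + t * vy, 0.0)
    norm = math.sqrt(vx * vx + vy * vy + vz * vz)
    angle = math.degrees(math.acos(abs(vz) / norm))
    return point, angle
```

Looking straight down yields an angle of 0 with a gaze point at the user's feet; tilting toward the horizon grows the angle toward 90 degrees and pushes the gaze point (and hence the placement position) farther away, matching the behavior described above.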
- FIG. 5 is a schematic diagram of an application scenario of the image processing method provided by an embodiment of the application.
- In general, the angle between the plane of the mobile phone camera 503 and the object placement plane 504 is less than 90 degrees.
- When the angle between the plane of the mobile phone camera 503 and the object placement plane 504 increases, as shown in FIG. 5, the line of sight faces farther away, and the placed object 501 is shrunk and placed at a distant position.
- When the angle between the plane of the mobile phone camera 503 and the object placement plane 504 decreases, the line of sight faces closer, and the placed object 502 is enlarged and placed at a nearby position.
- When the mobile phone camera rotates, the line of sight also rotates, and the placed virtual object rotates with the camera.
- The line of sight is always at a right angle to the plane where the camera of the mobile phone is located.
- When the placed virtual object is too close or too far, exceeding the placement plane, the color of the virtual object changes (for example, becomes gray) to remind the user that the object cannot be successfully placed.
- In this way, the distance and size of the placed object can be determined by the line-of-sight direction of the mobile phone camera and the angle between the plane where the camera is located and the object placement plane, so as to automatically adjust the position and size of objects in the three-dimensional (3 Dimensions, 3D) real environment, which makes the use of AR more interactive and entertaining.
- a game scene is taken as an example for description.
- In AR game scenes such as "Little Black Sheep", after completing the recognition of the object placement platform, users need to place game objects, such as forests, barracks, ponds, and hills, on the recognized platform to complete the personalized, customized design of the game scene.
- The user can adjust the angle between the camera and the object plane and change the camera's line-of-sight angle, so that the position and size of objects on the AR platform are automatically adjusted with the camera.
- The method provided in this embodiment, which determines the distance and size of the placed object in the AR scene through the line-of-sight direction of the mobile phone camera and the angle between the plane where the camera is located and the object placement plane, can be applied not only to game scenes but also to shopping scenes, educational scenes, and the like, which is not limited in this embodiment.
- FIG. 6 is a schematic diagram of the composition structure of the image processing device provided by an embodiment of the application. As shown in FIG. 6, the device 600 includes: a first determining module 601, a first acquiring module 602, a second determining module 603, and a first rendering module 604.
- The second determining module 603 includes: a first determining part configured to determine, based on the pose information, the included angle between the second reference surface where the image capture device is located and the first reference surface; and a second determining part configured to determine the display size of the virtual object based on the included angle.
- The first determining part includes: a first determining sub-part configured to determine, based on the pose information, first coordinate information of the image acquisition device in a first coordinate system, the first coordinate system being a coordinate system established based on the first reference surface; a first acquisition sub-part configured to acquire second coordinate information of the intersection of the first reference surface with a line that passes through the optical center of the lens in the image acquisition device and is perpendicular to the second reference surface; and a second determining sub-part configured to determine the included angle between the first reference surface and the second reference surface based on the first coordinate information and the second coordinate information.
- The second determining part includes: a second acquisition sub-part configured to acquire the correspondence between display sizes and angles; and a third determining sub-part configured to determine the display size of the virtual object based on the correspondence and the included angle.
- The device further includes: a third determining module configured to determine the placement area of the virtual object on the first reference surface based on the second coordinate information and the display size of the virtual object.
- Correspondingly, the first rendering module 604 includes: a rendering part configured to render the virtual object onto the image according to the display size, based on the placement area of the virtual object on the first reference surface.
- The device further includes: a fourth determining module configured to determine the boundary of the first reference surface; and a second rendering module configured to, in a case where it is determined that the placement position of the virtual object on the first reference surface exceeds the boundary of the first reference surface, render the virtual object onto the image according to preset first color information, so as to indicate that the virtual object exceeds the boundary.
- In the embodiments of the present application, a "part" may be a part of a circuit, a part of a processor, a part of a program or software, and so on; it may also be a unit, a module, or non-modularized.
- If the above-mentioned image processing method is implemented in the form of a software function module and sold or used as an independent product, it can also be stored in a computer-readable storage medium.
- The computer software product is stored in a storage medium and includes several instructions for enabling a computer device (which may be a personal computer, a server, a network device, or the like) to execute all or part of the methods described in the various embodiments of the present application.
- The aforementioned storage media include: a USB flash drive, a mobile hard disk, a read-only memory (Read Only Memory, ROM), a magnetic disk, an optical disk, or other media that can store program codes. In this way, the embodiments of the present application are not limited to any specific combination of hardware and software.
- An embodiment of the present application further provides a computer storage medium having computer-executable instructions stored thereon.
- When the computer-executable instructions are executed by a processor, the steps of the image processing method provided in the foregoing embodiments are implemented.
- An embodiment of the present application further provides a computer program, including computer-readable code. When the computer-readable code is executed in an electronic device, the processor in the electronic device executes the code to implement the steps of the image processing method provided in the embodiments of the present application.
- FIG. 7 is a schematic diagram of the composition and structure of the image processing device provided by an embodiment of the application.
- The device 700 includes: a processor 701, at least one communication bus 702, a user interface 703, at least one external communication interface 704, and a memory 705.
- the communication bus 702 is configured to realize connection and communication between these components.
- the user interface 703 may include a display screen, and the external communication interface 704 may include a standard wired interface and a wireless interface.
- the processor 701 is configured to execute an image processing program stored in the memory, so as to implement the steps in the image processing method provided in the foregoing embodiment.
- the disclosed device and method may be implemented in other ways.
- the device embodiments described above are merely illustrative.
- The division of the units is only a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
- The coupling, direct coupling, or communication connection between the components shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
- The units described above as separate components may or may not be physically separate, and the components displayed as units may or may not be physical units; they may be located in one place or distributed over multiple network units. Some or all of the units may be selected according to actual needs to implement the solution of this embodiment.
- The functional units in the embodiments of the present application may all be integrated into one processing unit, or each unit may individually serve as a unit, or two or more units may be integrated into one unit.
- The integrated unit can be implemented in the form of hardware, or in the form of hardware plus software functional units.
- This application can be provided as a method, a system, or a computer program product. Therefore, this application may take the form of a hardware embodiment, a software embodiment, or an embodiment combining software and hardware. Moreover, this application may take the form of a computer program product implemented on one or more computer-usable storage media (including but not limited to disk storage, optical storage, and the like) containing computer-usable program codes.
- These computer program instructions can also be stored in a computer-readable memory that can direct a computer or other programmable signal processing equipment to work in a specific manner, so that the instructions stored in the computer-readable memory produce an article of manufacture including an instruction device.
- The instruction device implements the functions specified in one or more processes in the flowchart and/or one or more blocks in the block diagram.
- These computer program instructions can also be loaded onto a computer or other programmable signal processing equipment, so that a series of operation steps are executed on the computer or other programmable equipment to produce computer-implemented processing.
- The instructions executed on the computer or other programmable equipment thus provide steps for implementing the functions specified in one or more processes in the flowchart and/or one or more blocks in the block diagram.
- The embodiments of the present application provide an image processing method, apparatus, device, and storage medium, wherein the method includes: determining, based on an image collected by an image acquisition device, a first reference surface on which a virtual object is placed; acquiring pose information of the image acquisition device relative to the first reference surface and the virtual object to be placed; determining the display size of the virtual object based on the pose information; and rendering the virtual object onto the image according to the display size.
Abstract
Description
Claims (15)
- An image processing method, the method comprising: determining, based on an image collected by an image acquisition device, a first reference surface on which a virtual object is placed; acquiring pose information of the image acquisition device relative to the first reference surface and the virtual object to be placed; determining a display size of the virtual object based on the pose information; and rendering the virtual object onto the image according to the display size.
- The method according to claim 1, wherein determining the display size of the virtual object based on the pose information comprises: determining, based on the pose information, an included angle between a second reference surface where the image acquisition device is located and the first reference surface; and determining the display size of the virtual object based on the included angle.
- The method according to claim 2, wherein determining, based on the pose information, the included angle between the second reference surface where the image acquisition device is located and the first reference surface comprises: determining, based on the pose information, first coordinate information of the image acquisition device in a first coordinate system, the first coordinate system being a coordinate system established based on the first reference surface; acquiring second coordinate information of an intersection of the first reference surface with a line that passes through the optical center of a lens in the image acquisition device and is perpendicular to the second reference surface; and determining the included angle between the first reference surface and the second reference surface based on the first coordinate information and the second coordinate information.
- The method according to claim 2, wherein determining the display size of the virtual object based on the included angle comprises: acquiring a correspondence between display sizes and angles; and determining the display size of the virtual object based on the correspondence and the included angle.
- The method according to claim 3, wherein the method further comprises: determining a placement area of the virtual object on the first reference surface based on the second coordinate information and the display size of the virtual object; and correspondingly, rendering the virtual object onto the image according to the display size comprises: rendering the virtual object onto the image according to the display size, based on the placement area of the virtual object on the first reference surface.
- The method according to claim 5, wherein the method further comprises: determining a boundary of the first reference surface; and in a case where it is determined that the placement area of the virtual object on the first reference surface exceeds the boundary of the first reference surface, rendering the virtual object onto the image according to preset first color information, so as to indicate that the virtual object exceeds the boundary.
- An image processing apparatus, comprising: a first determining module configured to determine, based on an image collected by an image acquisition device, a first reference surface on which a virtual object is placed; a first acquiring module configured to acquire pose information of the image acquisition device relative to the first reference surface and the virtual object to be placed; a second determining module configured to determine a display size of the virtual object based on the pose information; and a first rendering module configured to render the virtual object, according to the display size, onto the first reference surface in the image.
- The apparatus according to claim 7, wherein the second determining module comprises: a first determining part configured to determine, based on the pose information, an included angle between a second reference surface where the image acquisition device is located and the first reference surface; and a second determining part configured to determine the display size of the virtual object based on the included angle.
- The apparatus according to claim 8, wherein the first determining part comprises: a first determining sub-part configured to determine, based on the pose information, first coordinate information of the image acquisition device in a first coordinate system, the first coordinate system being a coordinate system established based on the first reference surface; a first acquisition sub-part configured to acquire second coordinate information of an intersection of the first reference surface with a line that passes through the optical center of a lens in the image acquisition device and is perpendicular to the second reference surface; and a second determining sub-part configured to determine the included angle between the first reference surface and the second reference surface based on the first coordinate information and the second coordinate information.
- The apparatus according to claim 8, wherein the second determining part comprises: a second acquisition sub-part configured to acquire a correspondence between display sizes and angles; and a third determining sub-part configured to determine the display size of the virtual object based on the correspondence and the included angle.
- The apparatus according to claim 9, wherein the apparatus further comprises: a third determining module configured to determine a placement area of the virtual object on the first reference surface based on the second coordinate information and the display size of the virtual object; and correspondingly, the first rendering module comprises: a rendering part configured to render the virtual object onto the image according to the display size, based on the placement area of the virtual object on the first reference surface.
- The apparatus according to any one of claims 7 to 11, wherein the apparatus further comprises: a fourth determining module configured to determine a boundary of the first reference surface; and a second rendering module configured to, in a case where it is determined that a placement position of the virtual object on the first reference surface exceeds the boundary of the first reference surface, render the virtual object onto the image according to preset first color information, so as to indicate that the virtual object exceeds the boundary.
- An image processing device, at least comprising: a memory, a communication bus, and a processor, wherein: the memory is configured to store an image processing program; the communication bus is configured to realize connection and communication between the processor and the memory; and the processor is configured to execute the image processing program stored in the memory to implement the steps of the image processing method according to any one of claims 1 to 6.
- A storage medium having an image processing program stored thereon, wherein when the image processing program is executed by a processor, the steps of the image processing method according to any one of claims 1 to 6 are implemented.
- A computer program, comprising computer-readable code, wherein when the computer-readable code runs in an electronic device, a processor in the electronic device executes the code to implement the steps of the image processing method according to any one of claims 1 to 6.
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
SG11202112379VA SG11202112379VA (en) | 2019-08-28 | 2020-07-07 | Image processing method, device thereof, equipment, and storage medium |
JP2021564316A JP7261904B2 (ja) | 2019-08-28 | 2020-07-07 | 画像処理方法及びその装置、機器並びに記憶媒体 |
US17/518,692 US11880956B2 (en) | 2019-08-28 | 2021-11-04 | Image processing method and apparatus, and computer storage medium |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN201910803981.6 | 2019-08-28 | ||
CN201910803981.6A CN110533780B (zh) | 2019-08-28 | 2019-08-28 | 一种图像处理方法及其装置、设备和存储介质 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/518,692 Continuation US11880956B2 (en) | 2019-08-28 | 2021-11-04 | Image processing method and apparatus, and computer storage medium |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2021036520A1 true WO2021036520A1 (zh) | 2021-03-04 |
Family
ID=68664903
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2020/100713 WO2021036520A1 (zh) | 2019-08-28 | 2020-07-07 | 一种图像处理方法及其装置、设备和存储介质 |
Country Status (6)
Country | Link |
---|---|
US (1) | US11880956B2 (zh) |
JP (1) | JP7261904B2 (zh) |
CN (1) | CN110533780B (zh) |
SG (1) | SG11202112379VA (zh) |
TW (1) | TW202109450A (zh) |
WO (1) | WO2021036520A1 (zh) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113724397A (zh) * | 2021-08-27 | 2021-11-30 | 浙江商汤科技开发有限公司 | 虚拟对象的定位方法、装置、电子设备及存储介质 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110533780B (zh) | 2019-08-28 | 2023-02-24 | 深圳市商汤科技有限公司 | 一种图像处理方法及其装置、设备和存储介质 |
CN113703161B (zh) * | 2020-05-22 | 2023-07-25 | 宏碁股份有限公司 | 扩增实境系统与其锚定显示方法 |
CN111651051B (zh) * | 2020-06-10 | 2023-08-22 | 浙江商汤科技开发有限公司 | 一种虚拟沙盘展示方法及装置 |
CN111651057A (zh) * | 2020-06-11 | 2020-09-11 | 浙江商汤科技开发有限公司 | 一种数据展示方法、装置、电子设备及存储介质 |
CN112684885B (zh) * | 2020-12-25 | 2023-05-02 | 联想(北京)有限公司 | 一种显示控制方法及装置 |
CN113838164A (zh) * | 2021-08-18 | 2021-12-24 | 北京商询科技有限公司 | 一种网格绘制方法、装置以及存储介质 |
CN114419298A (zh) * | 2022-01-21 | 2022-04-29 | 北京字跳网络技术有限公司 | 虚拟物体的生成方法、装置、设备及存储介质 |
WO2023219612A1 (en) * | 2022-05-11 | 2023-11-16 | Innopeak Technology, Inc. | Adaptive resizing of manipulatable and readable objects |
CN115810100B (zh) * | 2023-02-06 | 2023-05-05 | 阿里巴巴(中国)有限公司 | 确定物体放置平面的方法、设备、存储介质 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20080130985A1 (en) * | 2006-12-02 | 2008-06-05 | Electronic And Telecommunications Research Institute | Correlation extract method for generating 3d motion data, and motion capture system and method for easy composition of humanoid character on real background image using the same |
US20140292645A1 (en) * | 2013-03-28 | 2014-10-02 | Sony Corporation | Display control device, display control method, and recording medium |
CN107463261A (zh) * | 2017-08-11 | 2017-12-12 | 北京铂石空间科技有限公司 | 立体交互系统及方法 |
CN109240682A (zh) * | 2018-09-30 | 2019-01-18 | 上海葡萄纬度科技有限公司 | 基于ar的交互编程系统、方法、介质及智能设备 |
CN110533780A (zh) * | 2019-08-28 | 2019-12-03 | 深圳市商汤科技有限公司 | 一种图像处理方法及其装置、设备和存储介质 |
Family Cites Families (21)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI422999B (zh) | 2006-10-26 | 2014-01-11 | Seereal Technologies Sa | 全像顯示裝置、其製造方法及產生全像重建的方法 |
JP4640508B2 (ja) | 2009-01-09 | 2011-03-02 | ソニー株式会社 | 画像処理装置、画像処理方法、プログラム、及び撮像装置 |
EP2691935A1 (en) * | 2011-03-29 | 2014-02-05 | Qualcomm Incorporated | System for the rendering of shared digital interfaces relative to each user's point of view |
US9117281B2 (en) | 2011-11-02 | 2015-08-25 | Microsoft Corporation | Surface segmentation from RGB and depth images |
CN104102678B (zh) * | 2013-04-15 | 2018-06-05 | 腾讯科技(深圳)有限公司 | 增强现实的实现方法以及实现装置 |
JP2015038696A (ja) * | 2013-08-19 | 2015-02-26 | 国立大学法人佐賀大学 | 拡張現実装置、拡張現実方法及び拡張現実プログラム |
CN104423855B (zh) | 2013-08-26 | 2019-01-15 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
CN108572449B (zh) | 2014-03-31 | 2021-09-14 | 联想(北京)有限公司 | 显示装置和电子设备 |
US9551873B2 (en) | 2014-05-30 | 2017-01-24 | Sony Interactive Entertainment America Llc | Head mounted device (HMD) system having interface with mobile computing device for rendering virtual reality content |
JP6476657B2 (ja) * | 2014-08-27 | 2019-03-06 | 株式会社リコー | 画像処理装置、画像処理方法、およびプログラム |
CN105718232A (zh) | 2016-01-20 | 2016-06-29 | 昆山龙腾光电有限公司 | 一种任意角平面旋转显示方法及显示装置 |
US10115236B2 (en) * | 2016-09-21 | 2018-10-30 | Verizon Patent And Licensing Inc. | Placing and presenting virtual objects in an augmented reality environment |
CN108273265A (zh) * | 2017-01-25 | 2018-07-13 | 网易(杭州)网络有限公司 | 虚拟对象的显示方法及装置 |
CN108021241B (zh) | 2017-12-01 | 2020-08-25 | 西安维度视界科技有限公司 | 一种实现ar眼镜虚实融合的方法 |
CN108520552A (zh) * | 2018-03-26 | 2018-09-11 | 广东欧珀移动通信有限公司 | 图像处理方法、装置、存储介质及电子设备 |
CN108924538B (zh) | 2018-05-30 | 2021-02-26 | 太若科技(北京)有限公司 | Ar设备的屏幕拓展方法 |
CN108961423B (zh) * | 2018-07-03 | 2023-04-18 | 百度在线网络技术(北京)有限公司 | 虚拟信息处理方法、装置、设备及存储介质 |
CN109195020B (zh) | 2018-10-11 | 2021-07-02 | 三星电子(中国)研发中心 | 一种ar增强的游戏直播方法和系统 |
CN109903129A (zh) * | 2019-02-18 | 2019-06-18 | 北京三快在线科技有限公司 | 增强现实显示方法与装置、电子设备、存储介质 |
CN109961522B (zh) * | 2019-04-02 | 2023-05-05 | 阿波罗智联(北京)科技有限公司 | 图像投射方法、装置、设备和存储介质 |
CN110110647A (zh) | 2019-04-30 | 2019-08-09 | 北京小米移动软件有限公司 | 基于ar设备进行信息显示的方法、装置及存储介质 |
-
2019
- 2019-08-28 CN CN201910803981.6A patent/CN110533780B/zh active Active
-
2020
- 2020-07-07 SG SG11202112379VA patent/SG11202112379VA/en unknown
- 2020-07-07 WO PCT/CN2020/100713 patent/WO2021036520A1/zh active Application Filing
- 2020-07-07 JP JP2021564316A patent/JP7261904B2/ja active Active
- 2020-08-07 TW TW109126943A patent/TW202109450A/zh unknown
-
2021
- 2021-11-04 US US17/518,692 patent/US11880956B2/en active Active
Also Published As
Publication number | Publication date |
---|---|
US20220058888A1 (en) | 2022-02-24 |
US11880956B2 (en) | 2024-01-23 |
CN110533780B (zh) | 2023-02-24 |
JP7261904B2 (ja) | 2023-04-20 |
CN110533780A (zh) | 2019-12-03 |
JP2022531190A (ja) | 2022-07-06 |
SG11202112379VA (en) | 2021-12-30 |
TW202109450A (zh) | 2021-03-01 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 20855926 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2021564316 Country of ref document: JP Kind code of ref document: A |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
32PN | Ep: public notification in the ep bulletin as address of the adressee cannot be established |
Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205A DATED 01.07.2022) |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 20855926 Country of ref document: EP Kind code of ref document: A1 |