WO2017206451A1 - Image information processing method and augmented reality device - Google Patents

Image information processing method and augmented reality device

Info

Publication number
WO2017206451A1
Authority
WO
WIPO (PCT)
Prior art keywords
augmented reality
target object
reality device
image
coordinate system
Prior art date
Application number
PCT/CN2016/107708
Other languages
English (en)
Chinese (zh)
Inventor
刘均
刘新
宋朝忠
欧阳张鹏
Original Assignee
深圳市元征科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 深圳市元征科技股份有限公司 filed Critical 深圳市元征科技股份有限公司
Publication of WO2017206451A1 publication Critical patent/WO2017206451A1/fr

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T5/00Image enhancement or restoration
    • G06T5/73Deblurring; Sharpening

Definitions

  • Image information processing method and augmented reality device
  • The present invention relates to the field of image processing technologies, and in particular to an image information processing method and an augmented reality device.
  • Smart wearable devices are gradually entering people's lives; smart glasses are one type of smart wearable device.
  • Smart glasses have the functions of ordinary glasses. They also have an independent operating system on which users can install software, games, and other programs; they can complete functions such as taking photos and making video calls through voice or motion control, and can access wireless networks through mobile communication networks.
  • Embodiments of the present invention provide an image information processing method and an augmented reality device, which help a user clearly acquire image information of all objects within the line of sight.
  • a first aspect of the embodiments of the present invention provides a method for processing image information, including:
  • the augmented reality device acquires relative position information of the target object relative to the augmented reality device
  • the augmented reality device determines location information of the target object based on the acquired relative position information of the target object and position information of the augmented reality device;
  • the augmented reality device sends a read request for reading an augmented reality image of the target object to a server, where the read request carries location information of the target object;
  • the augmented reality device receives and displays an augmented reality image of the target object generated by the server in response to the read request.
  • the acquiring, by the augmented reality device, relative position information of the target object relative to the augmented reality device includes:
  • the augmented reality device establishes a three-dimensional coordinate system with the augmented reality device as an origin;
  • the augmented reality device acquires the distance from the augmented reality device to the target object, and acquires the angles between a directed line segment, starting at the augmented reality device and ending at the target object, and the positive directions of the three coordinate axes of the three-dimensional coordinate system;
  • the augmented reality device determines the location information of the target object based on the obtained relative location information of the target object and the location information of the augmented reality device, including:
  • the augmented reality device determines the coordinates of the target object in the established three-dimensional coordinate system based on the acquired distance from the augmented reality device to the target object and the angles between the directed line segment and the positive directions of the three coordinate axes of the three-dimensional coordinate system;
  • the augmented reality device acquires location information of the augmented reality device
  • the augmented reality device determines the location information of the target object based on the coordinates of the target object in the three-dimensional coordinate system established by the augmented reality device and the position information of the augmented reality device.
  • the method further includes:
  • the augmented reality device acquires an image transformation instruction for the augmented reality image
  • the augmented reality device displays the transformed augmented reality image of the target object based on the image transformation instruction.
  • the image transformation instruction includes one or more of the following: changing the display scale of the augmented reality image of the target object, translating the augmented reality image of the target object, rotating the augmented reality image of the target object, and adjusting the transparency of the augmented reality image of the target object.
  • a second aspect of the embodiments of the present invention provides an augmented reality device, including:
  • an obtaining unit configured to acquire relative position information of the target object relative to the augmented reality device
  • a determining unit configured to determine location information of the target object based on the acquired relative position information of the target object and location information of the augmented reality device
  • a sending unit configured to send, to the server, a read request for reading an augmented reality image of the target object, where the read request carries location information of the target object
  • a display unit configured to receive and display an augmented reality image of the target object generated by the server in response to the read request.
  • the acquiring unit includes:
  • a first acquiring subunit configured to acquire the distance from the augmented reality device to the target object, and to acquire the angles between a directed line segment, starting at the augmented reality device and ending at the target object, and the positive directions of the three coordinate axes of the three-dimensional coordinate system;
  • a first determining subunit configured to determine the relative position information of the target object relative to the augmented reality device based on the acquired distance from the augmented reality device to the target object and the angles between the directed line segment and the positive directions of the three coordinate axes of the three-dimensional coordinate system.
  • the determining unit includes:
  • a second determining subunit configured to determine the coordinates of the target object in the established three-dimensional coordinate system according to the acquired distance from the augmented reality device to the target object and the angles between the directed line segment and the positive directions of the three coordinate axes of the three-dimensional coordinate system;
  • a second obtaining subunit configured to acquire location information of the augmented reality device
  • a third determining subunit configured to determine the location information of the target object based on the coordinates of the target object in the three-dimensional coordinate system established by the augmented reality device and the location information of the augmented reality device.
  • the augmented reality device further includes:
  • a receiving unit configured to acquire an image transformation instruction for the augmented reality image after the display unit receives and displays the augmented reality image of the target object generated by the server in response to the read request;
  • a transforming unit configured to display the augmented reality image of the transformed target object based on the image transformation instruction.
  • the image transformation instruction includes one or more of the following: changing the display scale of the augmented reality image of the target object, translating the augmented reality image of the target object, rotating the augmented reality image of the target object, and adjusting the transparency of the augmented reality image of the target object.
  • A third aspect of the embodiments of the present invention provides an augmented reality device, including a processor, a receiver, a memory, a transmitter, and a communication bus, where the communication bus is configured to implement connection and communication among the processor, the receiver, the memory, and the transmitter; program code is stored in the memory;
  • the processor may, through the communication bus, invoke the program code stored in the memory to perform some or all of the steps of any method of the first aspect of the embodiments of the present invention.
  • The augmented reality device first obtains the location information of the local end and the relative position information of the target object with respect to the local end; secondly, it determines the location information of the target object based on the location information of the local end and the relative position information of the target object; then it sends to the server a read request for reading an augmented reality image of the target object, where the read request carries the location information of the target object; finally, the augmented reality device receives and displays the augmented reality image of the target object generated by the server in response to the read request.
  • In this way, the augmented reality device can generate an augmented reality image of the target object and display the generated augmented reality image, which in turn helps the user clearly obtain image information of all objects within the line of sight.
  • FIG. 1 is a schematic flow chart of a method for processing image information according to a first embodiment of the present invention.
  • FIG. 2 is a schematic flow chart of a method for processing image information according to a second embodiment of the present invention.
  • FIG. 2-1 is a schematic diagram of a three-dimensional coordinate system in the second embodiment of the present invention.
  • FIG. 3 is a schematic structural diagram of an augmented reality device according to a third embodiment of the present invention.
  • FIG. 4 is a schematic structural diagram of an augmented reality device according to a fourth embodiment of the present invention.
  • FIG. 5 is a schematic structural diagram of an augmented reality device according to a fifth embodiment of the present invention.
  • References to "embodiments" herein mean that specific features, structures, or characteristics described in connection with an embodiment can be included in at least one embodiment of the present invention.
  • The appearances of this phrase in various places in the specification do not necessarily refer to the same embodiment, nor to separate or alternative embodiments that are mutually exclusive with other embodiments. Those skilled in the art will explicitly and implicitly understand that the embodiments described herein can be combined with other embodiments.
  • FIG. 1 is a schematic flow chart of a method for processing image information according to a first embodiment of the present invention.
  • the image information processing method in the embodiment of the present invention includes the following steps:
  • S10. The augmented reality device acquires relative position information of the target object with respect to the augmented reality device.
  • Specifically, the augmented reality device establishes a three-dimensional coordinate system with the augmented reality device as the origin, acquires the distance from the augmented reality device to the target object, and acquires the angles between a directed line segment, starting at the augmented reality device and ending at the target object, and the positive directions of the three coordinate axes of the three-dimensional coordinate system; based on the acquired distance from the augmented reality device to the target object and the angles between the directed line segment and the positive directions of the three coordinate axes of the three-dimensional coordinate system, the augmented reality device determines the relative position information of the target object with respect to the augmented reality device.
  • The augmented reality device determines the location information of the target object based on the acquired relative position information of the target object and the location information of the augmented reality device.
  • Specifically, the augmented reality device determines the coordinates of the target object in the established three-dimensional coordinate system based on the acquired distance from the augmented reality device to the target object and the angles between the directed line segment and the positive directions of the three coordinate axes of the three-dimensional coordinate system; the augmented reality device then acquires the location information of the augmented reality device, and determines the location information of the target object based on the coordinates of the target object in the three-dimensional coordinate system established by the augmented reality device and the location information of the augmented reality device.
  • the augmented reality device sends a read request for reading an augmented reality image of the target object to a server, where the read request carries location information of the target object.
  • the server may be any service device that establishes a communication connection with the augmented reality device, such as a server, a service host, a service system, a service platform, and the like.
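  • As a purely illustrative sketch (the original does not specify any request format, endpoint, or field names; all of those below are assumptions), the read request carrying the target object's location information could be sent to the server as a small JSON payload, for example in Python:

```python
import json
import urllib.request


def send_read_request(server_url, latitude, longitude, elevation):
    """Send a read request that carries the target object's location information
    and return the server's raw response (expected to contain the augmented
    reality image of the target object)."""
    payload = json.dumps({
        "request": "read_augmented_reality_image",  # hypothetical field names
        "target_location": {
            "latitude": latitude,
            "longitude": longitude,
            "elevation": elevation,
        },
    }).encode("utf-8")
    request = urllib.request.Request(
        server_url,  # an assumed endpoint, e.g. "http://ar-server.example/read"
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        return response.read()
```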
  • the augmented reality device receives and displays an augmented reality image of the target object generated by the server in response to the read request.
  • the augmented reality device receives an augmented reality image of the target object generated by the server in response to the read request, and displays an augmented reality image of the target object on the display device.
  • Taking smart glasses as an example of the augmented reality device, the smart glasses display the augmented reality image of the target object on the lens.
  • An image transformation instruction for the augmented reality image may also be acquired.
  • The image transformation instruction includes one or more of the following: changing the display scale of the augmented reality image of the target object, translating the augmented reality image of the target object, rotating the augmented reality image of the target object, and adjusting the transparency of the augmented reality image of the target object. The transformed augmented reality image of the target object is then displayed based on the image transformation instruction.
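  • As an illustration only (the original does not prescribe how these instructions are applied; the function and parameter names below are assumptions), the scaling, rotation, and translation instructions can be expressed as a single 2D affine transform applied to the displayed image, while the transparency instruction simply adjusts an alpha value used when compositing the augmented reality image over the lens view:

```python
import math

import numpy as np


def build_transform(scale=1.0, angle_deg=0.0, dx=0.0, dy=0.0):
    """Return a 3x3 homogeneous matrix that scales and rotates the augmented
    reality image about its origin and then translates it in screen space."""
    a = math.radians(angle_deg)
    rotate_scale = np.array([
        [scale * math.cos(a), -scale * math.sin(a), 0.0],
        [scale * math.sin(a),  scale * math.cos(a), 0.0],
        [0.0,                  0.0,                 1.0],
    ])
    translate = np.array([
        [1.0, 0.0, dx],
        [0.0, 1.0, dy],
        [0.0, 0.0, 1.0],
    ])
    return translate @ rotate_scale


def apply_transform(matrix, x, y):
    """Map one image-space point through the transform."""
    tx, ty, _ = matrix @ np.array([x, y, 1.0])
    return tx, ty


# Example: enlarge to 150%, rotate by 30 degrees, shift 20 px to the right,
# and composite at 60% opacity.
m = build_transform(scale=1.5, angle_deg=30.0, dx=20.0)
alpha = 0.6  # transparency adjustment, applied when compositing the image
```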
  • The augmented reality device first obtains the location information of the local end and the relative position information of the target object with respect to the local end; secondly, it determines the location information of the target object based on the location information of the local end and the relative position information of the target object; then it sends to the server a read request for reading an augmented reality image of the target object, where the read request carries the location information of the target object; finally, the augmented reality device receives and displays the augmented reality image of the target object generated by the server in response to the read request.
  • In this way, the augmented reality device can generate and display an augmented reality image of the target object.
  • FIG. 2 is a schematic flowchart of a method for processing image information according to a second embodiment of the present invention. As shown in FIG. 2, the image information processing method in the embodiment of the present invention includes the following steps:
  • the augmented reality device establishes a three-dimensional coordinate system with the augmented reality device as an origin.
  • the augmented reality device establishes a three-dimensional coordinate system with the reference feature point of the augmented reality device as an origin, wherein the reference feature point may be any point located on the augmented reality device.
  • The three coordinate axes of the three-dimensional coordinate system may be determined based on a three-axis acceleration sensor, or may be determined based on a gravity sensor.
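  • As a hedged illustration of one way this could be done (the original only names the sensors; the axis convention and the function below are assumptions), a gravity reading can fix the vertical axis and an arbitrary horizontal heading reference can fix the remaining axes by orthonormalisation:

```python
import numpy as np


def device_axes(gravity_xyz, heading_xyz):
    """Build three orthonormal coordinate axes for the device-centred frame from
    a gravity-sensor reading and a horizontal heading reference.  The Z axis
    points opposite to gravity; X and Y are obtained by Gram-Schmidt
    orthonormalisation so that the frame is right-handed."""
    g = np.asarray(gravity_xyz, dtype=float)
    z_axis = -g / np.linalg.norm(g)              # "up", opposite to gravity
    h = np.asarray(heading_xyz, dtype=float)
    x_axis = h - np.dot(h, z_axis) * z_axis      # remove the vertical component
    x_axis = x_axis / np.linalg.norm(x_axis)
    y_axis = np.cross(z_axis, x_axis)            # completes a right-handed frame
    return x_axis, y_axis, z_axis
```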
  • The augmented reality device acquires the distance from the augmented reality device to the target object, and acquires the angles between a directed line segment, starting at the augmented reality device and ending at the target object, and the positive directions of the three coordinate axes of the three-dimensional coordinate system.
  • S203. The augmented reality device determines the relative position information of the target object with respect to the augmented reality device according to the obtained distance between the augmented reality device and the target object and the angles between the directed line segment and the positive directions of the three coordinate axes of the three-dimensional coordinate system.
  • FIG. 2-1 is a schematic diagram of the three-dimensional coordinate system in the second embodiment of the present invention, where the origin O of the three-dimensional coordinate system is the augmented reality device and point A is the target object.
  • The augmented reality device measures the distance from the reference feature point of the augmented reality device to the ranging feature point of the target object by means of a distance measuring device, such as an infrared range finder or a laser range finder, that is, the length of the directed line segment OA, and obtains the angles α, β, and γ between the directed line segment OA and the positive directions of the X-axis, the Y-axis, and the Z-axis of the three-dimensional coordinate system.
  • The augmented reality device determines the coordinates of the target object in the established three-dimensional coordinate system according to the obtained distance between the augmented reality device and the target object and the angles between the directed line segment and the positive directions of the three coordinate axes of the three-dimensional coordinate system.
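  • As a minimal sketch of this computation (not part of the original disclosure; it assumes the measured angles are available in radians), the coordinates of the target object A in the device-centred coordinate system follow directly from the direction cosines of the segment OA:

```python
import math


def relative_coordinates(distance_oa, alpha, beta, gamma):
    """Coordinates of the target object A in the three-dimensional coordinate
    system whose origin O is the augmented reality device, given the length of
    the directed line segment OA and its angles to the positive X, Y, Z axes."""
    return (distance_oa * math.cos(alpha),
            distance_oa * math.cos(beta),
            distance_oa * math.cos(gamma))
```

  • For a consistent set of measurements, the direction cosines satisfy cos²α + cos²β + cos²γ = 1, which can serve as a sanity check on the sensor readings.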
  • the augmented reality device acquires location information of the augmented reality device.
  • the augmented reality device may acquire the ground coordinates of the local end, that is, the latitude, longitude, and elevation of the local end, by using the position sensor.
  • The augmented reality device determines the location information of the target object based on the coordinates of the target object in the three-dimensional coordinate system established by the augmented reality device and the location information of the augmented reality device.
  • That is, the position information of the target object, namely the latitude, longitude, and elevation of the target object, is determined.
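  • One possible way to carry out this step (an assumption for illustration; the original does not specify the conversion, and this sketch further assumes the device's coordinate axes are aligned east-north-up and that the offset is small compared with the Earth's radius) is a first-order geodetic approximation:

```python
import math

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius used by the approximation


def target_geodetic(dev_lat_deg, dev_lon_deg, dev_elev_m, east_m, north_m, up_m):
    """Approximate the latitude, longitude, and elevation of the target object
    from the device's geodetic position and the target's offset expressed in a
    local east-north-up frame centred on the device."""
    dlat_deg = math.degrees(north_m / EARTH_RADIUS_M)
    dlon_deg = math.degrees(
        east_m / (EARTH_RADIUS_M * math.cos(math.radians(dev_lat_deg)))
    )
    return dev_lat_deg + dlat_deg, dev_lon_deg + dlon_deg, dev_elev_m + up_m
```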
  • the augmented reality device sends a read request for reading an augmented reality image of the target object to a server, where the read request carries location information of the target object.
  • the augmented reality device receives and displays an augmented reality image of the target object generated by the server in response to the read request.
  • the augmented reality device acquires an image transformation instruction for the augmented reality image
  • the augmented reality device displays the transformed augmented reality image of the target object based on the image transformation instruction.
  • The image transformation instruction includes one or more of the following: changing the display scale of the augmented reality image of the target object, translating the augmented reality image of the target object, rotating the augmented reality image of the target object, and adjusting the transparency of the augmented reality image of the target object.
  • The augmented reality device first obtains the location information of the local end and the relative position information of the target object with respect to the local end; secondly, it determines the location information of the target object based on the location information of the local end and the relative position information of the target object; then it sends to the server a read request for reading an augmented reality image of the target object, where the read request carries the location information of the target object; finally, the augmented reality device receives and displays the augmented reality image of the target object generated by the server in response to the read request.
  • In this way, the augmented reality device can generate an augmented reality image of the target object and display the generated augmented reality image, which in turn helps the user clearly obtain image information of all objects within the line of sight.
  • The following device embodiments of the present invention are used to perform the methods of the first and second embodiments of the present invention.
  • For ease of description, only the parts related to the embodiments of the present invention are shown.
  • FIG. 3 is a schematic structural diagram of an augmented reality device according to a third embodiment of the present invention.
  • the image information processing device in the embodiment of the present invention includes the following units:
  • an obtaining unit 301 configured to acquire relative position information of the target object relative to the augmented reality device
  • a determining unit 302 configured to determine the location information of the target object based on the acquired relative position information of the target object and the location information of the augmented reality device;
  • a sending unit 303 configured to send, to the server, a read request for reading an augmented reality image of the target object, where the read request carries location information of the target object;
  • the display unit 304 is configured to receive and display an augmented reality image of the target object generated by the server in response to the read request.
  • The augmented reality device first obtains the location information of the local end and the relative position information of the target object with respect to the local end; secondly, it determines the location information of the target object based on the location information of the local end and the relative position information of the target object; then it sends to the server a read request for reading an augmented reality image of the target object, where the read request carries the location information of the target object; finally, the augmented reality device receives and displays the augmented reality image of the target object generated by the server in response to the read request.
  • In this way, the augmented reality device can generate an augmented reality image of the target object and display the generated augmented reality image, which in turn helps the user clearly obtain image information of all objects within the line of sight.
  • FIG. 4 is a schematic structural diagram of an augmented reality device according to a fourth embodiment of the present invention.
  • The image information processing device in this embodiment of the present invention includes the obtaining unit 301, the determining unit 302, the sending unit 303, and the display unit 304 of the third embodiment of the present invention, and further includes the following units.
  • the obtaining unit 301 may further include:
  • the establishing subunit 3011 is configured to establish a three-dimensional coordinate system with the augmented reality device as an origin;
  • a first obtaining subunit 3012 configured to acquire the distance from the augmented reality device to the target object, and to acquire the angles between a directed line segment, starting at the augmented reality device and ending at the target object, and the positive directions of the three coordinate axes of the three-dimensional coordinate system;
  • a first determining subunit 3013 configured to determine the relative position information of the target object with respect to the augmented reality device based on the acquired distance from the augmented reality device to the target object and the angles between the directed line segment and the positive directions of the three coordinate axes of the three-dimensional coordinate system.
  • the determining unit 302 includes:
  • a second determining subunit 3021 configured to determine the coordinates of the target object in the established three-dimensional coordinate system according to the acquired distance from the augmented reality device to the target object and the angles between the directed line segment and the positive directions of the three coordinate axes of the three-dimensional coordinate system;
  • a second obtaining subunit 3022 configured to acquire location information of the augmented reality device
  • a third determining subunit 3023 configured to determine the location information of the target object based on the coordinates of the target object in the three-dimensional coordinate system established by the augmented reality device and the location information of the augmented reality device.
  • the augmented reality device may further include:
  • a receiving unit 305 configured to acquire an image transformation instruction for the augmented reality image after the display unit receives and displays the augmented reality image of the target object generated by the server in response to the read request;
  • the transform unit 306 is configured to display the transformed augmented reality image of the target object based on the image transformation instruction.
  • The image transformation instruction includes one or more of the following: changing the display scale of the augmented reality image of the target object, translating the augmented reality image of the target object, rotating the augmented reality image of the target object, and adjusting the transparency of the augmented reality image of the target object.
  • The augmented reality device first obtains the location information of the local end and the relative position information of the target object with respect to the local end; secondly, it determines the location information of the target object based on the location information of the local end and the relative position information of the target object; then it sends to the server a read request for reading an augmented reality image of the target object, where the read request carries the location information of the target object; finally, it receives and displays the augmented reality image of the target object generated by the server in response to the read request.
  • In this way, the augmented reality device can generate an augmented reality image of the target object and display the generated augmented reality image, thereby facilitating the wearer in clearly acquiring the image information of all objects in the line of sight.
  • FIG. 5 is a schematic structural diagram of an augmented reality device according to a fifth embodiment of the present invention.
  • the augmented reality device in the embodiment of the present invention includes: at least one processor 501, such as a CPU, at least one receiver 503, at least one memory 504, at least one transmitter 505, and at least one communication bus 502.
  • the communication bus 502 is used to implement connection communication between these components.
  • the receiver 503 and the transmitter 505 of the device in the embodiment of the present invention may be wired transmission ports, or may be wireless devices, for example, including antenna devices, for performing signaling or data communication with other node devices.
  • the memory 504 may be a high speed RAM memory or a non-volatile memory such as at least one disk memory.
  • the memory 504 can optionally also be at least one storage device located remotely from the aforementioned processor 501.
  • a set of program codes is stored in memory 504, and said processor 501 can call the code stored in memory 504 via communication bus 502 to perform the associated functions.
  • The processor 501 is configured to: acquire relative position information of the target object with respect to the augmented reality device; determine the location information of the target object based on the acquired relative position information of the target object and the location information of the augmented reality device; send to the server a read request for reading an augmented reality image of the target object, the read request carrying the location information of the target object; and receive and display the augmented reality image of the target object generated by the server in response to the read request.
  • When acquiring the relative position information of the target object relative to the augmented reality device, the processor 501 is specifically configured to: establish a three-dimensional coordinate system with the augmented reality device as the origin; acquire the distance from the augmented reality device to the target object, and acquire the angles between a directed line segment, starting at the augmented reality device and ending at the target object, and the positive directions of the three coordinate axes of the three-dimensional coordinate system; and determine the relative position information of the target object with respect to the augmented reality device based on the acquired distance from the augmented reality device to the target object and the angles between the directed line segment and the positive directions of the three coordinate axes of the three-dimensional coordinate system.
  • When determining the location information of the target object based on the acquired relative position information of the target object and the location information of the augmented reality device, the processor 501 is specifically configured to: determine the coordinates of the target object in the established three-dimensional coordinate system based on the acquired distance from the augmented reality device to the target object and the angles between the directed line segment and the positive directions of the three coordinate axes of the three-dimensional coordinate system; acquire the location information of the augmented reality device; and determine the location information of the target object based on the coordinates of the target object in the three-dimensional coordinate system established by the augmented reality device and the location information of the augmented reality device.
  • The processor 501 is further configured to: after receiving and displaying the augmented reality image of the target object generated by the server in response to the read request, acquire an image transformation instruction for the augmented reality image; and display the transformed augmented reality image of the target object based on the image transformation instruction.
  • The image transformation instruction includes one or more of the following: changing the display scale of the augmented reality image of the target object, translating the augmented reality image of the target object, rotating the augmented reality image of the target object, and adjusting the transparency of the augmented reality image of the target object.
  • The augmented reality device first obtains the location information of the local end and the relative position information of the target object with respect to the local end; secondly, it determines the location information of the target object based on the location information of the local end and the relative position information of the target object; then it sends to the server a read request for reading an augmented reality image of the target object, where the read request carries the location information of the target object; finally, the augmented reality device receives and displays the augmented reality image of the target object generated by the server in response to the read request.
  • In this way, the augmented reality device can generate an augmented reality image of the target object and display the generated augmented reality image, which in turn helps the user clearly obtain image information of all objects within the line of sight.
  • An embodiment of the present invention further provides a computer storage medium, where the computer storage medium may store a program, and when the program is executed, it performs some or all of the steps of any image information processing method described in the foregoing method embodiments.
  • The sequence of the steps of the methods in the embodiments of the present invention may be adjusted, merged, or deleted according to actual needs.
  • The units of the terminal in the embodiments of the present invention may be integrated, further divided, or deleted according to actual needs.
  • the disclosed apparatus may be implemented in other manners.
  • The device embodiments described above are merely schematic; for example, the division of the units is only a logical function division, and in actual implementation there may be other division manners; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not executed.
  • The coupling or direct coupling or communication connection shown or discussed may be an indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical or other forms.
  • The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place, or may be distributed over multiple network units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically separately, or two or more units may be integrated into one unit.
  • the above integrated unit can be implemented in the form of hardware or in the form of a software functional unit.
  • the integrated unit if implemented in the form of a software functional unit and sold or used as a standalone product, may be stored in a computer readable storage medium.
  • The technical solutions of the present invention, essentially, or the part contributing to the prior art, or all or part of the technical solutions, may be embodied in the form of a software product, which is stored in a storage medium.
  • The software product includes a number of instructions for causing a computer device (which may be a personal computer, a server, a network device, or the like) to perform all or part of the steps of the methods of the various embodiments of the present invention.
  • The foregoing storage medium includes any medium that can store program code, such as a USB flash drive, a read-only memory (ROM), a random access memory (RAM), a removable hard disk, a magnetic disk, or an optical disc.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Processing Or Creating Images (AREA)

Abstract

The invention relates to an image information processing method and an augmented reality device. The method comprises the following operations: an augmented reality device obtains relative position information of a target object with respect to the augmented reality device; the augmented reality device determines position information of the target object on the basis of the obtained relative position information of the target object and position information of the augmented reality device; the augmented reality device sends, to a server, a read request used to read an augmented reality image of the target object, the read request carrying the position information of the target object; and the augmented reality device receives and displays the augmented reality image of the target object generated by the server in response to the read request. A corresponding augmented reality device is also provided. The technical solution provided by the embodiments of the present invention can generate and display the augmented reality image of the target object, thereby helping a user clearly obtain image information of all objects within the visual range.
PCT/CN2016/107708 2016-05-31 2016-11-29 Image information processing method and augmented reality device WO2017206451A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN201610379925.0A CN106097258A (zh) 2016-05-31 2016-05-31 一种影像处理方法及增强现实设备
CN201610379925.0 2016-05-31

Publications (1)

Publication Number Publication Date
WO2017206451A1 true WO2017206451A1 (fr) 2017-12-07

Family

ID=57230698

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2016/107708 WO2017206451A1 (fr) 2016-05-31 2016-11-29 Image information processing method and augmented reality device

Country Status (2)

Country Link
CN (1) CN106097258A (fr)
WO (1) WO2017206451A1 (fr)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019118599A3 (fr) * 2017-12-13 2019-07-25 Lowe's Companies, Inc. Virtualisation d'objets à l'aide de modèles d'objets et de données de position d'objets
US11875396B2 (en) 2016-05-10 2024-01-16 Lowe's Companies, Inc. Systems and methods for displaying a simulated room and portions thereof

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106097258A (zh) * 2016-05-31 2016-11-09 深圳市元征科技股份有限公司 一种影像处理方法及增强现实设备
CN106095075B (zh) * 2016-05-31 2019-12-10 深圳市元征科技股份有限公司 一种信息处理方法及增强现实设备
CN106959108B (zh) * 2017-03-23 2020-02-21 联想(北京)有限公司 位置确定方法、系统及电子设备
CN107291266B (zh) 2017-06-21 2018-08-31 腾讯科技(深圳)有限公司 图像显示的方法和装置
CN109587188B (zh) * 2017-09-28 2021-10-22 阿里巴巴集团控股有限公司 确定终端设备之间相对位置关系的方法、装置及电子设备
CN108573293B (zh) * 2018-04-11 2021-07-06 广东工业大学 一种基于增强现实技术的无人超市购物协助方法及系统
CN109669541B (zh) * 2018-09-04 2022-02-25 亮风台(上海)信息科技有限公司 一种用于配置增强现实内容的方法与设备
CN110992859B (zh) * 2019-11-22 2022-03-29 北京新势界科技有限公司 一种基于ar导视的广告牌展示方法及装置

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8730156B2 (en) * 2010-03-05 2014-05-20 Sony Computer Entertainment America Llc Maintaining multiple views on a shared stable virtual space
JP5689180B2 (ja) * 2010-11-18 2015-03-25 チャフー カンパニーリミテッドChahoo Co.,Ltd. 情報認識手段を利用した管渠総合管理システム及び方法
CN104407700A (zh) * 2014-11-27 2015-03-11 曦煌科技(北京)有限公司 一种移动头戴式虚拟现实和增强现实的设备
CN104484523B (zh) * 2014-12-12 2017-12-08 西安交通大学 一种增强现实诱导维修系统的实现设备与方法
CN104574005B (zh) * 2015-02-15 2018-03-16 蔡耿新 集增强现实、体感、抠绿技术的广告展示管理系统和方法
CN105427209A (zh) * 2015-11-24 2016-03-23 余元辉 一种全景智慧旅游系统

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102142005A (zh) * 2010-01-29 2011-08-03 株式会社泛泰 提供增强现实的系统、终端、服务器及方法
US20120081393A1 (en) * 2010-09-30 2012-04-05 Pantech Co., Ltd. Apparatus and method for providing augmented reality using virtual objects
CN102568012A (zh) * 2010-10-13 2012-07-11 株式会社泛泰 提供增强现实服务的用户设备和方法
CN102508363A (zh) * 2011-12-28 2012-06-20 王鹏勃 一种基于增强现实技术的无线显示眼镜及其实现方法
CN106097258A (zh) * 2016-05-31 2016-11-09 深圳市元征科技股份有限公司 一种影像处理方法及增强现实设备
CN106095075A (zh) * 2016-05-31 2016-11-09 深圳市元征科技股份有限公司 一种信息处理方法及增强现实设备

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11875396B2 (en) 2016-05-10 2024-01-16 Lowe's Companies, Inc. Systems and methods for displaying a simulated room and portions thereof
WO2019118599A3 (fr) * 2017-12-13 2019-07-25 Lowe's Companies, Inc. Virtualisation d'objets à l'aide de modèles d'objets et de données de position d'objets
US11062139B2 (en) 2017-12-13 2021-07-13 Lowe's Conpanies, Inc. Virtualizing objects using object models and object position data
US11615619B2 (en) 2017-12-13 2023-03-28 Lowe's Companies, Inc. Virtualizing objects using object models and object position data
US12087054B2 (en) 2017-12-13 2024-09-10 Lowe's Companies, Inc. Virtualizing objects using object models and object position data

Also Published As

Publication number Publication date
CN106097258A (zh) 2016-11-09

Similar Documents

Publication Publication Date Title
WO2017206451A1 (fr) Procédé de traitement d'informations d'image et dispositif de réalité augmentée
US10389938B2 (en) Device and method for panoramic image processing
CN111445583B (zh) 增强现实处理方法及装置、存储介质和电子设备
WO2019223468A1 (fr) Procédé et appareil de suivi d'orientation de caméra, dispositif et système
KR102078427B1 (ko) 사운드 및 기하학적 분석을 갖는 증강 현실
WO2018018703A1 (fr) Procédé et dispositif de commutation de carte émulée d'un terminal mobile nfc
US10871800B2 (en) Apparatuses and methods for linking mobile computing devices for use in a dual-screen extended configuration
US11893702B2 (en) Virtual object processing method and apparatus, and storage medium and electronic device
US20170192734A1 (en) Multi-interface unified displaying system and method based on virtual reality
WO2017206452A1 (fr) Procédé de traitement d'informations et dispositif de réalité augmentée
JP2016219020A (ja) 拡張現実において、一定のレベルの情報を提供する方法・装置・コンピュータプログラム
WO2019179237A1 (fr) Procédé et dispositif d'acquisition de carte électronique de vue satellite, appareil et support de mémoire
CN103105926A (zh) 多传感器姿势识别
JP2021520540A (ja) カメラの位置決め方法および装置、端末並びにコンピュータプログラム
CN111768454A (zh) 位姿确定方法、装置、设备及存储介质
CN109992111B (zh) 增强现实扩展方法和电子设备
US20240144617A1 (en) Methods and systems for anchoring objects in augmented or virtual reality
CN111862349A (zh) 虚拟画笔实现方法、装置和计算机可读存储介质
CN115039015A (zh) 位姿跟踪方法、可穿戴设备、移动设备以及存储介质
CN110070617B (zh) 数据同步方法、装置、硬件装置
US20150371449A1 (en) Method for the representation of geographically located virtual environments and mobile device
US10782858B2 (en) Extended reality information for identified objects
KR102534449B1 (ko) 이미지 처리 방법, 장치, 전자 장치 및 컴퓨터 판독 가능 저장 매체
WO2022055419A2 (fr) Procédé et appareil d'affichage de personnage, dispositif électronique et support de stockage
KR101939530B1 (ko) 지형정보 인식을 기반으로 증강현실 오브젝트를 표시하는 방법 및 그 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16903831

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16903831

Country of ref document: EP

Kind code of ref document: A1