WO2022088989A1 - Augmented reality image display method and related apparatus
- Publication number
- WO2022088989A1 (PCT/CN2021/116504)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- distance
- mobile terminal
- tag device
- orientation
- augmented reality
- Prior art date
Classifications
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/14—Digital output to display device; cooperation and interconnection of the display device with other functional units
- G06T19/00—Manipulating 3D models or images for computer graphics
- G06T19/006—Mixed reality
- H04W64/00—Locating users or terminals or network equipment for network management purposes, e.g. mobility management
Description
- the present application relates to the field of image display, and in particular, to an augmented reality image display method and related devices.
- Ultra Wide Band (UWB) technology is a wireless carrier communication technology. Rather than a sinusoidal carrier, it uses nanosecond-scale non-sinusoidal narrow pulses to transmit data. It therefore occupies a wide spectrum range and can be used for high-precision positioning.
- Although UWB technology can locate tags, it does not support integration with real-time scenes. Even if the located position is to be merged into a real-time scene, three-dimensional coordinates or a spatial model must be established in advance. As a result, the position of the positioning tag cannot be displayed intuitively in the real-time scene, and the user experience is poor.
- Embodiments of the present application provide an augmented reality image display method and a related device, so as to visually display the position of a positioning tag in a real-time scene.
- In a first aspect, an embodiment of the present application provides an augmented reality image display method, which is applied to a mobile terminal, and the method includes:
- the mobile terminal first obtains the distance and orientation between the mobile terminal and the tag device, then displays a positioning icon and distance indication information, then determines the position of the tag device according to the distance and orientation, and finally turns on the camera function and displays the image of the current viewing range together with the augmented reality position indication information of the tag device.
- In this way, when positioning is performed, the result can be integrated with the real scene and the position of the tag can be displayed intuitively, while the tedious step of establishing a three-dimensional coordinate model is omitted, improving the user experience.
- In a second aspect, an embodiment of the present application provides an augmented reality image display method, which is applied to a tag device, and the method includes: sending the distance and/or orientation of the tag device relative to the mobile terminal to the mobile terminal;
- the distance and orientation are used by the mobile terminal to perform the following operations: displaying a positioning icon and distance indication information, where the positioning icon is used to indicate that positioning is being performed and the distance indication information is used to indicate the distance; determining the position of the tag device according to the distance and the orientation; and turning on the camera function and displaying the image of the current viewing range together with the augmented reality position indication information of the tag device.
- In a third aspect, an embodiment of the present application provides an augmented reality image display apparatus, which is applied to a mobile terminal, and the apparatus includes:
- an acquisition unit for acquiring the distance and orientation between the mobile terminal and the tag device;
- a display unit for displaying a positioning icon and distance indication information, the positioning icon being used to indicate that positioning is being performed, and the distance indication information being used to indicate the distance;
- a determining unit for determining the position of the tag device according to the distance and orientation;
- a turning-on unit for turning on the camera function and displaying the image of the current viewing range and the augmented reality position indication information of the tag device.
- In a fourth aspect, an embodiment of the present application provides an augmented reality image display apparatus, which is applied to a tag device, and the apparatus includes:
- a sending unit configured to send the distance and/or orientation of the tag device relative to the mobile terminal to the mobile terminal;
- the distance and orientation are used by the mobile terminal to perform the following operations: displaying a positioning icon and distance indication information, where the positioning icon is used to indicate that positioning is being performed and the distance indication information is used to indicate the distance; determining the position of the tag device according to the distance and the orientation; and turning on the camera function and displaying the image of the current viewing range together with the augmented reality position indication information of the tag device.
- In a fifth aspect, embodiments of the present application provide a mobile terminal, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for executing the steps of any method in the first aspect of the embodiments of this application.
- In a sixth aspect, embodiments of the present application provide a tag device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, and the programs include instructions for executing the steps of any method in the second aspect of the embodiments of this application.
- In a seventh aspect, an embodiment of the present application provides a chip, including a processor for calling and running a computer program from a memory, so that a device installed with the chip executes any method of the first aspect or the second aspect of the embodiments of the present application.
- In an eighth aspect, an embodiment of the present application provides a computer-readable storage medium on which a computer program/instructions are stored; when executed by a processor, the computer program/instructions implement the steps of any method of the first aspect or the second aspect.
- In a ninth aspect, an embodiment of the present application provides a computer program product, including a computer program/instructions; when executed by a processor, the computer program/instructions implement the steps of any method of the first aspect or the second aspect.
- the computer program product may be a software installation package.
- FIG. 1a is an architecture diagram of an augmented reality image display system provided by an embodiment of the present application.
- FIG. 1b is a schematic diagram of a system framework of a mobile terminal provided by an embodiment of the present application.
- FIG. 1c is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- FIG. 2a is a schematic flowchart of a method for displaying an augmented reality image provided by an embodiment of the present application.
- FIG. 2b is a schematic diagram of an interface operation and effect provided by an embodiment of the present application.
- FIG. 3 is a schematic flowchart of another augmented reality image display method provided by an embodiment of the present application.
- FIG. 4 is a schematic flowchart of another augmented reality image display method provided by an embodiment of the present application.
- FIG. 5 is a block diagram of functional units of an augmented reality image display device provided by an embodiment of the present application.
- FIG. 6 is a block diagram of functional units of another augmented reality image display device provided by an embodiment of the present application.
- FIG. 7 is a block diagram of functional units of another augmented reality image display device provided by an embodiment of the present application.
- FIG. 8 is a block diagram of functional units of another augmented reality image display device provided by an embodiment of the present application.
- Ultra Wide Band (UWB) is a wireless carrier communication technology. According to the standards of the Federal Communications Commission (FCC) of the United States, the working frequency band of UWB is 3.1-10.6 GHz, the ratio of the -10 dB bandwidth to the system center frequency is greater than 20%, and the system bandwidth is at least 500 MHz. Data is transmitted using non-sinusoidal narrow pulses of nanosecond to microsecond duration.
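- As a rough illustration of the criteria just quoted, the fractional and absolute bandwidths can be checked as follows (a minimal sketch; the function name, and using band edges instead of measured -10 dB points, are assumptions):

```python
def uwb_band_check(f_low_hz, f_high_hz):
    """Check the two UWB criteria quoted above: a bandwidth greater than
    20% of the system center frequency, and at least 500 MHz of absolute
    bandwidth."""
    bandwidth = f_high_hz - f_low_hz
    center = (f_high_hz + f_low_hz) / 2.0
    return bandwidth / center > 0.20, bandwidth >= 500e6

# Full FCC UWB band, 3.1-10.6 GHz:
print(uwb_band_check(3.1e9, 10.6e9))  # -> (True, True)
```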
- AR: augmented reality.
- 3D registration: three-dimensional registration technology.
- Virtual-real fusion display: the fused display of virtual content and the real scene.
- Human-computer interaction: the process of first collecting data from the real scene through cameras and sensors and transmitting it to the processor for analysis and reconstruction, then updating the user's spatial position change data in the real environment in real time through AR headsets or accessories such as cameras, gyroscopes, and sensors on smart mobile devices, so as to obtain the relative position of the virtual scene and the real scene, align the coordinate systems, perform the fusion calculation of the virtual and real scenes, and finally present the synthesized image to the user to realize augmented reality interaction.
- the embodiments of the present application provide an augmented reality image display method and related apparatuses.
- the embodiments of the present application will be described in detail below with reference to the accompanying drawings.
- FIG. 1a is an architecture diagram of an augmented reality image display system provided by an embodiment of the present application.
- the augmented reality image display system 100 includes a mobile terminal 101 and a tag device 102, and the mobile terminal 101 and the tag device 102 communicate with each other through UWB technology.
- the mobile terminal 101 includes a first UWB module, and the tag device 102 includes a second UWB module, so that the mobile terminal 101 and/or the tag device 102 can determine the distance and orientation between the mobile terminal and the tag device through the first UWB module and the second UWB module.
- a camera module is further included on the back of the mobile terminal 101 for acquiring a real-time image of the environment.
- FIG. 1b is a schematic diagram of a system framework of a mobile terminal provided by an embodiment of the present application.
- the system framework of the mobile terminal includes a user layer, an intermediate layer and a chip layer.
- the chip layer includes the underlying camera sensor and the underlying UWB chip.
- raw data is collected through the underlying camera sensor and the underlying UWB chip, including camera scene data and tag distance and bearing data.
- the middle layer includes the camera driver and its input/output (IO) interaction and the UWB driver and its IO interaction; the middle layer is a driver layer for realizing the interaction and control logic with the chip layer.
- the user layer mainly includes the application program, which implements the UI interface and presents the effect of the fusion algorithm.
- the middle layer is used to process the raw data obtained from the underlying camera sensor and the underlying UWB chip and then send it to the application program of the user layer for interface display.
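- The three-layer flow described above can be sketched as follows (all names are illustrative assumptions; the real layers are the camera/UWB drivers and the application):

```python
# Minimal sketch of the chip -> middle -> user layer data flow.
def chip_layer():
    """Raw data from the underlying camera sensor and the underlying UWB chip."""
    return {"frame": "<camera frame>", "distance_m": 1.2, "bearing_deg": 30.0}

def middle_layer(raw):
    """Driver layer: condition the raw chip data before handing it upward."""
    return {"frame": raw["frame"], "tag": (raw["distance_m"], raw["bearing_deg"])}

def user_layer(data):
    """Application layer: the UI presents the fused result."""
    distance, bearing = data["tag"]
    print(f"render frame with tag marker: {distance} m at {bearing} deg")

user_layer(middle_layer(chip_layer()))
```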
- FIG. 1c is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
- the electronic device may be any one of the mobile terminal 101 or the tag device 102.
- the electronic device is applied to an augmented reality image display system and includes an application processor 120, a memory 130, a communication interface 140, and one or more programs 131, where the one or more programs 131 are stored in the memory 130 and configured to be executed by the application processor 120, and the one or more programs 131 include instructions for performing any of the steps in the foregoing method embodiments.
- the communication unit is used to support the communication between the first electronic device and other devices.
- the terminal may also include a storage unit for storing program codes and data of the terminal.
- the processing unit may be the application processor 120 or a controller, such as a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field programmable gate array (FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various exemplary logical blocks, units and circuits described in connection with this disclosure.
- the processor may also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of a DSP and a microprocessor, and the like.
- the communication unit may be the communication interface 140 , a transceiver, a transceiver circuit, etc., and the storage unit may be the memory 130 .
- the memory 130 may be volatile memory or non-volatile memory, or may include both volatile and non-volatile memory.
- the non-volatile memory may be read-only memory (ROM), programmable read-only memory (PROM), erasable programmable read-only memory (EPROM), electrically erasable programmable read-only memory (EEPROM), or flash memory.
- Volatile memory may be random access memory (RAM), which acts as an external cache.
- By way of example rather than limitation, many forms of RAM are available, such as:
- static random access memory (SRAM)
- dynamic random access memory (DRAM)
- synchronous dynamic random access memory (SDRAM)
- double data rate synchronous dynamic random access memory (DDR SDRAM)
- enhanced synchronous dynamic random access memory (ESDRAM)
- synchlink dynamic random access memory (SLDRAM)
- direct rambus random access memory (DR RAM)
- the application processor 120 is configured to perform any step performed by the mobile terminal or the tag device in the above method embodiments, and when performing data transmission such as sending, it may optionally invoke the communication interface 140 to complete the corresponding operation.
- FIG. 2a is a schematic flowchart of an augmented reality image display method provided by an embodiment of the present application, which is applied to a mobile terminal. As shown in the figure, the augmented reality image display method includes the following operations.
- S201 Acquire the distance and orientation between the mobile terminal and the tag device.
- the orientation may be the angle of the tag device relative to the mobile terminal, taking the mobile terminal as the reference, or the angle of the mobile terminal relative to the tag device, taking the tag device as the reference.
- the distance and orientation data can be obtained by UWB positioning technology, or by Bluetooth or laser positioning.
- S202 Display a positioning icon and distance indication information, where the positioning icon is used to indicate that positioning is being performed, and the distance indication information is used to indicate the distance.
- the relevant distance indication information can be displayed on the mobile terminal according to the previously acquired distance and orientation data, and the distance indication information includes the distance between the current tag device and the mobile terminal.
- S203 Determine the position of the label device according to the distance and orientation.
- the position of the tag device refers to the position of the tag device relative to the mobile terminal, including the distance to the mobile terminal and the angle to the mobile terminal.
- S204 Turn on the camera function, and display the image of the current viewing range and the augmented reality position indication information of the tag device.
- the mobile terminal includes a camera module, which can turn on the camera automatically according to user instructions or program settings and obtain image information of the current environment where the mobile terminal is located. After the current environment image is obtained, the location of the tag is fused with the current image in combination with the obtained tag location information, so that the actual tag can be found in the environment image.
- the mobile terminal first obtains the distance and orientation between the mobile terminal and the tag device, then displays a positioning icon and distance indication information, then determines the position of the tag device according to the distance and orientation, and finally turns on the camera function and displays the image of the current viewing range together with the augmented reality location indication information of the tag device.
- In this way, when positioning is performed, the result can be integrated with the real scene and the position of the tag can be displayed intuitively, while the tedious step of establishing a three-dimensional coordinate model is omitted, improving the user experience.
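- The S201-S204 flow can be summarized in the following sketch (hypothetical helper names; the UWB result is a stand-in for a real DSTWR/PDOA measurement):

```python
import math

def uwb_distance_and_bearing():
    """Stand-in for a DSTWR/PDOA result: 1.2 m away, 30 degrees to the right."""
    return 1.2, math.radians(30.0)

def display_tag_in_ar():
    # S201: acquire the distance and orientation between terminal and tag device
    distance, bearing = uwb_distance_and_bearing()
    # S202: display the positioning icon and the distance indication information
    print(f"[positioning...] tag at {distance:.1f} m")
    # S203: determine the tag position relative to the terminal (polar -> x/y)
    x, y = distance * math.sin(bearing), distance * math.cos(bearing)
    # S204: turn on the camera and overlay the AR position indication on the frame
    print(f"AR overlay: tag {x:.2f} m to the right, {y:.2f} m ahead")

display_tag_in_ar()
```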
- the augmented reality location indication information includes a location range indication icon and a location center indication icon; the location range indication icon covers the actual image information of the tag device, and the location center indication icon points to the actual image information of the tag device.
- the position range indication icon covers all the image information of the tag device obtained by the camera module of the mobile terminal, and the display can be switched to the actual image information corresponding to the current tag device according to the position range indication icon; the position center indication icon is used to indicate the position of the tag device within the current image information.
- the actual image information of the tag device pointed to by the location center indication icon may be the entire image information covering the tag device, or a part of it, and the image content displayed by this part can be switched by user operation or according to the position range indication icon.
- the location range indication icon and the location center indication icon are displayed on the interface of the mobile terminal, so that the user can obtain all the actual image information covering the tag device, or part of it as required. This can not only integrate with the real scene and display the position of the tag intuitively, but also improve the user experience.
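- One way to place the location center indication icon is to project the tag's azimuth onto the image plane with a pinhole model, as in this sketch (the 70-degree field of view and the function name are assumptions, not values from the application):

```python
import math

def tag_screen_x(bearing_rad, image_width_px, hfov_rad):
    """Map the tag's azimuth (0 = camera center line) to a horizontal pixel
    under a pinhole model; None if the tag lies outside the viewing range."""
    half = hfov_rad / 2.0
    if abs(bearing_rad) >= half:
        return None
    focal_px = (image_width_px / 2.0) / math.tan(half)
    return image_width_px / 2.0 + focal_px * math.tan(bearing_rad)

# 30 degrees right of center, 1080 px wide frame, assumed 70-degree horizontal FOV
print(tag_screen_x(math.radians(30), 1080, math.radians(70)))  # ~985 px
```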
- the method further includes: when detecting that the tag device is not included in the current viewing range, displaying the image of the current viewing range, and displaying a direction pointing icon according to the position, where the direction pointing icon is used to indicate the offset of the tag device relative to the current viewing range.
- the location of the tag device may not be within the environment image obtained by the mobile terminal.
- in this case the tag device cannot be displayed in the environment image; only the image of the current viewing range is displayed together with a direction pointing icon based on the position, showing the relationship between the tag device and the environment.
- the offset relative to the current framing range differs for different framing ranges, and the angle of the tag device relative to each framing range can be determined accordingly.
- the tag device may also be within the current framing range, in which case the position of the tag device can be indicated in the current framing range.
- by representing the offset between the tag device and the current framing range, the needs of different scenarios can be met, so that the user can obtain the spatial position of the tag device at any time, improving the user experience.
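- A sketch of the offset computation behind the direction pointing icon (the half-field-of-view threshold mirrors the projection sketch above; names are assumptions):

```python
import math

def viewfinder_offset(bearing_rad, hfov_rad):
    """Offset of the tag relative to the current viewing range: 0 when the
    tag is inside the frame, otherwise the signed angle past the nearest
    frame edge (positive = to the right of the frame)."""
    half = hfov_rad / 2.0
    if bearing_rad > half:
        return bearing_rad - half
    if bearing_rad < -half:
        return bearing_rad + half
    return 0.0

off = viewfinder_offset(math.radians(55), math.radians(70))
print("in view" if off == 0 else
      f"point {'right' if off > 0 else 'left'}, "
      f"{math.degrees(abs(off)):.0f} deg past the edge")   # -> right, 20 deg
```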
- In a possible example, the angular deviation of the position of the tag device relative to the center line of the camera of the mobile terminal is determined, and corresponding position prompt information is output.
- taking the center line of the camera of the mobile terminal as the benchmark for the offset of the tag device is equivalent to taking the user's current perspective as the center line, so the offset of the tag device relative to the user's current line of sight is determined.
- determining the angular offset of the position of the tag device relative to the center line of the camera of the mobile terminal and outputting the corresponding position prompt information makes it easy for users to obtain the actual spatial location of the tag device in the current environment image, which is convenient for locating it and improves the user experience.
- the location prompt information includes at least one of the following: voice location prompt information, text location prompt information, and image location prompt information.
- the user can be prompted by voice, for example, that the tag device is located 30 degrees to the front right of the mobile terminal, or at the 1 o'clock position of the current mobile terminal. It is also possible to directly prompt the user with the offset angle between the tag device and the mobile terminal in text form on the display interface, or to display image information on the interface that expresses the relative position of the tag device and the mobile terminal in the form of arrows.
- the offset information of the tag device relative to the mobile terminal is prompted to the user by means of voice, text, or image position prompt information, which can intuitively show the spatial position of the tag device in the environment and helps users quickly locate the tag device.
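- The text/voice prompt in the example above (degrees to the front-right, clock position) could be generated along these lines (illustrative only; the application does not specify the wording logic):

```python
def position_prompt(bearing_deg):
    """Turn an azimuth (degrees, clockwise from straight ahead) into the kind
    of text/voice prompt described above, including a clock position."""
    signed = (bearing_deg + 180.0) % 360.0 - 180.0    # wrap to [-180, 180)
    hour = round((bearing_deg % 360.0) / 30.0) % 12 or 12
    side = "right" if signed >= 0 else "left"
    return (f"The tag device is {abs(signed):.0f} degrees to the "
            f"front-{side}, at {hour} o'clock.")

print(position_prompt(30))  # -> ...30 degrees to the front-right, at 1 o'clock.
```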
- the acquiring of the distance and orientation between the local device and the tag device includes: determining the distance between the local device and the tag device through the first UWB module and the second UWB module according to the double-sided two-way ranging (DSTWR) algorithm; and determining the orientation of the tag device relative to the local device through the first UWB module and the second UWB module according to the phase difference of arrival (PDOA) algorithm.
- the first UWB module is located on the mobile terminal, and the second UWB module is located on the tag device.
- the mobile terminal can quickly switch the antennas on the first UWB module to obtain the difference between the timestamps at which the signal of the second UWB module arrives at the two antennas; from this the path difference is obtained, and the angle is then further calculated and stored in the relevant register of the mobile terminal.
- the PDOA algorithm can be implemented by a DW3000 chip.
- the mobile terminal can calculate the distance to the tag device according to the time difference between two interactions. After multiple interactions, multiple sets of distance information can be calculated; finally, after filtering these sets, the distance information of the tag device relative to the mobile terminal can be obtained.
- the mobile terminal calculates the distance and orientation between the local device and the tag device according to the DSTWR algorithm and the PDOA algorithm, which can accurately locate the tag device and improve the positioning accuracy.
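- For reference, the standard asymmetric DS-TWR time-of-flight estimator and the PDOA angle-of-arrival relation look as follows; this is a generic sketch of the named algorithms, not code from the application, and the example numbers are made up:

```python
import math

C = 299_792_458.0  # speed of light, m/s

def dstwr_distance(t_round1, t_reply1, t_round2, t_reply2):
    """Double-sided two-way ranging: time of flight from the four measured
    intervals of the two message exchanges, times the speed of light."""
    tof = (t_round1 * t_round2 - t_reply1 * t_reply2) / \
          (t_round1 + t_round2 + t_reply1 + t_reply2)
    return tof * C

def pdoa_angle(phase_diff_rad, wavelength_m, antenna_spacing_m):
    """Angle of arrival from the phase difference between the two antennas;
    unambiguous only if the spacing is at most half a wavelength."""
    s = (phase_diff_rad * wavelength_m) / (2.0 * math.pi * antenna_spacing_m)
    return math.asin(max(-1.0, min(1.0, s)))

# ~1.2 m range: each round trip is 8 ns longer than the 100 ns reply delay.
print(dstwr_distance(108e-9, 100e-9, 108e-9, 100e-9))           # ~1.2 (meters)
wavelength = C / 6.5e9                                          # assumed carrier
print(math.degrees(pdoa_angle(1.0, wavelength, wavelength / 2)))  # ~18.6 degrees
```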
- In a possible example, the acquiring of the distance and orientation between the local device and the tag device includes: receiving the distance and orientation reported by the tag device, where the distance is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the DSTWR algorithm, and the orientation is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the PDOA algorithm.
- the first UWB module is located on the mobile terminal, and the second UWB module is located on the tag device.
- the tag device can quickly switch the antennas on the second UWB module to obtain the difference between the timestamps at which the signal of the first UWB module arrives at the two antennas; the path difference is then obtained, and the angle is further calculated and stored in the relevant register of the tag device.
- the PDOA algorithm can be implemented by a DW3000 chip.
- the tag device can calculate the distance according to the time difference between two interactions. After multiple interactions, multiple sets of distance information can be calculated; after filtering these sets, the distance information of the tag device relative to the mobile terminal is obtained. Finally, the calculated distance and orientation are sent to the mobile terminal for displaying the position of the tag device in the real-time environment image acquired by the mobile terminal.
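- The filtering step is not specified in the application; a simple possibility is a sliding-window median over the repeated DSTWR results, as sketched here:

```python
from statistics import median

def filter_distance(samples, window=5):
    """Sliding-window median over repeated DSTWR distance results; a simple
    stand-in for the filtering the text mentions (exact filter unspecified)."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(median(samples[lo:i + 1]))
    return out

raw = [1.21, 1.19, 1.95, 1.20, 1.22, 1.18]   # one outlier measurement
print(filter_distance(raw))                  # outlier suppressed
```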
- In a possible example, the acquiring of the distance and orientation between the local device and the tag device includes: determining the distance between the local device and the tag device through the first UWB module and the second UWB module according to the DSTWR algorithm; and receiving the orientation reported by the tag device, where the orientation is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the PDOA algorithm.
- the first UWB module is located on the mobile terminal, and the second UWB module is located on the tag device.
- the mobile terminal can quickly switch the antennas on the first UWB module to obtain the difference between the timestamps at which the signal of the second UWB module arrives at the two antennas.
- the mobile terminal can calculate the distance to the tag device according to the time difference between two interactions. After multiple interactions, multiple sets of distance information can be calculated; after filtering these sets, the distance information of the tag device relative to the mobile terminal is obtained.
- the tag device can quickly switch the antennas on the second UWB module to obtain the difference between the timestamps at which the signal of the first UWB module arrives at the two antennas, from which the path difference is obtained; the angle is then further calculated and stored in the relevant register of the tag device.
- the PDOA algorithm can be implemented by a DW3000 chip.
- Alternatively, the orientation between the local device and the tag device can be determined through the first UWB module and the second UWB module according to the PDOA algorithm, and the distance reported by the tag device can be received, where the distance is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the DSTWR algorithm.
- the mobile terminal calculates the distance between the local device and the tag device according to the DSTWR algorithm, and the tag device calculates the orientation according to the PDOA algorithm; this can accurately locate the tag device and improve the positioning accuracy.
- In a possible example, while displaying the image of the current viewing range and the position indication information of the tag device, the mobile terminal also displays the distance.
- that is, the distance information of the tag device from the mobile terminal may also be displayed at the same time.
- by simultaneously displaying the distance information between the tag device and the mobile terminal, the spatial position change between them can be shown in real time, which is convenient for users to accurately locate the tag device and improves the user experience.
- In this way, the camera function can be turned on quickly to acquire real-time images, which is convenient for user operation and improves the user experience.
- In a possible example, the method further includes: establishing a communication connection with the Bluetooth module of the tag device through the Bluetooth module of the mobile terminal; waking up the tag device from a sleep state through the communication connection; and configuring the first UWB module of the mobile terminal and the second UWB module of the tag device for communication.
- the connection between the mobile terminal and the tag device may be a Bluetooth connection, through which the wake-up and parameter configuration of the tag device are realized.
- when the tag device is not connected via Bluetooth, it stays in the low-power mode. After the Bluetooth connection is established, the tag device wakes up and the network parameters are configured, so that the UWB chips of the tag and the mobile phone can communicate and the state machine for ranging and angle measurement runs. After the mobile terminal enters the application program, the underlying UWB module starts to work.
- realizing the wake-up and parameter configuration of the tag device through a Bluetooth connection can reduce the power consumption of the tag device, increase its usage time, and improve the user experience.
- the tag device is in a low power consumption mode in the sleep state.
- the usage time of the tag device can be increased, and the user experience can be improved.
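- The Bluetooth wake-up and UWB parameter configuration flow described above might look like the following sketch (the class, its methods, and the channel/preamble values are all assumptions, not a real BLE or DW3000 API):

```python
class TagLink:
    """Hypothetical handle for the Bluetooth link to the tag device."""
    def __init__(self, mac):
        self.mac = mac
        self.awake = False

    def ble_connect(self):
        print(f"BLE connected to {self.mac}")   # tag can now leave low-power mode

    def wake(self):
        self.awake = True                       # wake indication sent over Bluetooth

    def configure_uwb(self, channel, preamble_code):
        # both UWB modules must share these parameters so ranging frames match
        print(f"UWB configured: channel {channel}, preamble code {preamble_code}")

link = TagLink("80:E1:26:19")   # the address shown in the Fig. 2b walkthrough
link.ble_connect()
link.wake()
link.configure_uwb(channel=9, preamble_code=11)
# the DSTWR/PDOA ranging state machine can now run over UWB
```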
- FIG. 2b is a schematic diagram of an interface operation and effect provided by an embodiment of the present application.
- on the mobile phone, the user can tap the target application to open it.
- clicking the connection bar at the top of the phone connects the phone's Bluetooth with the tag device's Bluetooth, waking the tag device to start working and delivering the parameter configuration.
- after the connection succeeds, the identifier of the connected tag device appears at the bottom of the user interface and the tag device enters the working state; in the figure, the tag device connected to the mobile terminal is 80:E1:26:19.
- the location range indication icon and the location center indication icon then appear on the user interface, and the distance between the mobile terminal and the tag device is displayed at the bottom; in the figure, the distance is 1.2 meters.
- the phone camera can be opened by swiping up from the bottom of the phone interface, so that the user interface displays the scene information recorded by the camera in real time, together with a virtual icon, as shown in (4) in Figure 2b.
- if the tag device is not within the current scene, a direction indication icon appears in the middle of the interface to indicate the offset between the tag device and the mobile terminal, that is, whether the tag is to the left or right of the current scene. The user can adjust the camera direction accordingly to bring the tag device into the viewing range, after which information such as the position of the tag device is displayed. As shown in (5) in Figure 2b, the physical tag device, virtual icons of real-scene objects such as the desk, computer monitor and printer in the room, and the location information of the tag device can all be seen on the interface.
- FIG. 3 is a schematic flowchart of another augmented reality image display method provided by an embodiment of the present application.
- the augmented reality image display method is applied to a mobile terminal, and the method includes the following steps:
- the mobile terminal determines the distance according to the double-sided two-way ranging DSTWR algorithm and the orientation according to the phase difference of arrival PDOA algorithm, then displays the positioning icon and distance indication information on the interface, then determines the position of the tag device according to the distance and orientation, and finally turns on the camera function and displays the image of the current viewing range together with the augmented reality position indication information of the tag device.
- FIG. 4 is a schematic flowchart of another augmented reality image display method provided by an embodiment of the present application.
- the augmented reality image display method is applied to a tag device, and the method includes the following steps:
- S401 Send the distance and/or orientation of the tag device relative to the mobile terminal to the mobile terminal, where the distance and orientation are used by the mobile terminal to perform the following operations: displaying a positioning icon and distance indication information, where the positioning icon is used to indicate that positioning is being performed and the distance indication information is used to indicate the distance; determining the position of the tag device according to the distance and the orientation; and turning on the camera function and displaying the image of the current viewing range together with the augmented reality location indication information of the tag device.
- the tag device can calculate the distance and orientation of the tag device relative to the mobile terminal through the second UWB module located on the tag device and the first UWB module located on the mobile terminal, according to the double-sided two-way ranging DSTWR algorithm and the phase difference of arrival PDOA algorithm respectively, and report the calculated distance and orientation to the mobile terminal, so that the mobile terminal can display the position information of the tag in the real-time image information it obtains.
- alternatively, the tag device calculates only the orientation information or only the distance information and sends it to the mobile terminal, and the remaining quantity not calculated by the tag device is calculated by the mobile terminal.
- the tag device may also simply send signals through the second UWB module so that the mobile terminal can calculate the distance and orientation from the information sent by the tag device.
- the tag device sends the distance and/or orientation of the tag device relative to the mobile terminal to the mobile terminal, so that the tag device can be positioned quickly and accurately and integrated with the real scene to intuitively present the location of the tag, improving the user experience.
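- A tag-side reporting loop consistent with this description could be as simple as the following (the measurement functions stand in for the DW3000 register reads mentioned earlier; all names are assumptions):

```python
import random
import time

def dstwr_distance_m():
    return 1.2 + random.gauss(0.0, 0.05)   # stand-in DSTWR distance result

def pdoa_bearing_deg():
    return 30.0 + random.gauss(0.0, 1.0)   # stand-in PDOA orientation result

def report_to_terminal(distance, bearing):
    print(f"report -> terminal: {distance:.2f} m, {bearing:.1f} deg")

for _ in range(3):                          # periodic distance/orientation reports
    report_to_terminal(dstwr_distance_m(), pdoa_bearing_deg())
    time.sleep(0.1)
```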
- In a possible example, before the sending of the distance and/or orientation of the tag device relative to the mobile terminal to the mobile terminal, the method further includes: determining the distance between the tag device and the mobile terminal through the second UWB module and the first UWB module according to the DSTWR algorithm; and/or determining the orientation of the tag device relative to the mobile terminal through the second UWB module and the first UWB module according to the PDOA algorithm.
- In a possible example, the method further includes: establishing a communication connection with the Bluetooth module of the mobile terminal through the Bluetooth module of the tag device; receiving a wake-up indication from the mobile terminal through the communication connection, and waking up the tag device from the sleep state according to the wake-up indication; and receiving UWB communication configuration information, and performing communication configuration on the second UWB module of the tag device and the first UWB module of the mobile terminal according to the communication configuration information.
- the tag device is in a low power consumption mode in the sleep state.
- An embodiment of the present application provides an augmented reality image display apparatus, and the augmented reality image display apparatus may be a mobile terminal.
- the augmented reality image display device is configured to perform the steps performed by the terminal in the above augmented reality image display method.
- the augmented reality image display device provided by the embodiments of the present application may include modules corresponding to corresponding steps.
- the augmented reality image display device may be divided into functional modules according to the above method examples.
- each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
- the above-mentioned integrated modules can be implemented in the form of hardware, and can also be implemented in the form of software function modules.
- the division of modules in the embodiments of the present application is schematic, and is only a logical function division, and there may be other division manners in actual implementation.
- FIG. 5 shows a possible schematic structural diagram of the augmented reality image display device involved in the above embodiment.
- the augmented reality image display device 5 includes:
- an acquisition unit 50, configured to acquire the distance and orientation between the mobile terminal and the tag device;
- a display unit 51, configured to display a positioning icon and distance indication information, where the positioning icon is used to indicate that positioning is being performed, and the distance indication information is used to indicate the distance;
- a determining unit 52, configured to determine the position of the tag device according to the distance and orientation;
- a turning-on unit 53, configured to turn on the camera function and display the image of the current viewing range and the augmented reality position indication information of the tag device.
- the augmented reality location indication information includes a location range indication icon and a location center indication icon; the location range indication icon covers the actual image information of the tag device, and the location center indication icon points to the actual image information of the tag device.
- the apparatus is further configured to: when detecting that the tag device is not included in the current viewing range, display the image of the current viewing range, and display a direction pointing icon according to the position, where the direction pointing icon is used to indicate the offset of the tag device relative to the current viewing range.
- the apparatus is further configured to: determine the angular offset of the position of the tag device relative to the center line of the camera of the mobile terminal, and output corresponding position prompt information.
- the location prompt information includes at least one of the following: voice location prompt information, text location prompt information, and image location prompt information.
- In a possible example, in terms of acquiring the distance and orientation between the local device and the tag device, the acquiring unit 50 is specifically configured to: determine the distance between the local device and the tag device through the first UWB module and the second UWB module according to the double-sided two-way ranging DSTWR algorithm; and determine the orientation of the tag device relative to the local device through the first UWB module and the second UWB module according to the phase difference of arrival PDOA algorithm.
- In a possible example, the acquiring unit 50 is specifically configured to: receive the distance and orientation reported by the tag device, where the distance is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the DSTWR algorithm, and the orientation is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the PDOA algorithm.
- In a possible example, the acquiring unit 50 is specifically configured to: determine the distance between the local device and the tag device through the first UWB module and the second UWB module according to the DSTWR algorithm; and receive the orientation reported by the tag device, where the orientation is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the PDOA algorithm.
- the mobile terminal also displays the distance while displaying the image of the current viewing range and the position indication information of the tag device.
- the camera function is turned on by sliding upward from the bottom of the display screen of the mobile terminal.
- the apparatus is further configured to: establish a communication connection with the Bluetooth module of the tag device through the Bluetooth module of the mobile terminal; wake up the tag device from a sleep state through the communication connection; and configure the first UWB module of the mobile terminal and the second UWB module of the tag device for communication.
- the tag device is in a low power consumption mode in the sleep state.
- As shown in FIG. 6, the augmented reality image display device 6 includes a processing module 60 and a communication module 61.
- the processing module 60 is used to control and manage the actions of the augmented reality image display device, for example, the steps performed by the acquisition unit 50, the display unit 51, the determining unit 52 and the turning-on unit 53, and/or other processes of the techniques described herein.
- the communication module 61 is used to support the interaction between the augmented reality image display device and other devices.
- the augmented reality image display device may further include a storage module 62, and the storage module 62 is configured to store program codes and data of the augmented reality image display device.
- the processing module 60 may be a processor or a controller, such as a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various exemplary logical blocks, modules and circuits described in connection with this disclosure.
- the processor may also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of a DSP and a microprocessor, and the like.
- the communication module 61 may be a transceiver, an RF circuit, a communication interface, or the like.
- the storage module 62 may be a memory.
- Both the augmented reality image display device 5 and the augmented reality image display device 6 can execute the steps performed by the terminal in the augmented reality image display method shown in FIG. 2a and FIG. 3.
- the embodiment of the present application further provides another augmented reality image display apparatus, and the augmented reality image display apparatus may be a tag device.
- the augmented reality image display device is configured to perform the steps performed by the terminal in the above augmented reality image display method.
- the augmented reality image display device provided by the embodiments of the present application may include modules corresponding to corresponding steps.
- the augmented reality image display device may be divided into functional modules according to the above method examples.
- each functional module may be divided corresponding to each function, or two or more functions may be integrated into one processing module.
- the above-mentioned integrated modules can be implemented in the form of hardware, and can also be implemented in the form of software function modules.
- the division of modules in the embodiments of the present application is schematic, and is only a logical function division, and there may be other division manners in actual implementation.
- FIG. 7 shows a possible schematic structural diagram of the augmented reality image display device involved in the above embodiment.
- the augmented reality image display device 7 includes a sending unit 70 for sending the distance and/or orientation of the tag device relative to the mobile terminal to the mobile terminal, where the distance and orientation are used by the mobile terminal to perform the following operations: displaying a positioning icon for indicating that positioning is being performed and distance indication information for indicating the distance; determining the position of the tag device according to the distance and orientation; and turning on the camera function and displaying the image of the current viewing range and the augmented reality location indication information of the tag device.
- before the sending of the distance and/or orientation of the tag device relative to the mobile terminal to the mobile terminal, the apparatus is further configured to: determine the distance between the tag device and the mobile terminal through the second UWB module and the first UWB module according to the DSTWR algorithm; and/or determine the orientation of the tag device relative to the mobile terminal through the second UWB module and the first UWB module according to the PDOA algorithm.
- the apparatus is further configured to: establish a communication connection with the Bluetooth module of the mobile terminal through the Bluetooth module of the tag device; receive a wake-up indication from the mobile terminal through the communication connection, and wake up the tag device from a sleep state according to the wake-up indication; and receive UWB communication configuration information, and perform communication configuration on the second UWB module of the tag device and the first UWB module of the mobile terminal according to the communication configuration information.
- the tag device is in a low power consumption mode in the sleep state.
- FIG. 8 is a schematic structural diagram of another augmented reality image display device provided by an embodiment of the present application.
- the augmented reality image display device 8 includes a processing module 80 and a communication module 81 .
- the processing module 80 is used to control and manage the actions of the augmented reality image display device, eg, the steps performed by the sending unit 70, and/or other processes used to perform the techniques described herein.
- the communication module 81 is used to support the interaction between the augmented reality image display device and other devices.
- the augmented reality image display device may further include a storage module 82, and the storage module 82 is used for storing program codes and data of the augmented reality image display device.
- the processing module 80 may be a processor or a controller, such as a central processing unit (CPU), a general-purpose processor, a digital signal processor (DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various exemplary logical blocks, modules and circuits described in connection with this disclosure.
- the processor may also be a combination that implements computing functions, such as a combination of one or more microprocessors, a combination of a DSP and a microprocessor, and the like.
- the communication module 81 may be a transceiver, an RF circuit, a communication interface, or the like.
- the storage module 82 may be a memory.
- Both the augmented reality image display device 7 and the augmented reality image display device 8 can perform the steps performed by the terminal in the augmented reality image display method shown in FIG. 4 .
- the above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof.
- the above-described embodiments may be implemented in whole or in part in the form of a computer program product.
- the computer program product includes one or more computer instructions or computer programs. When the computer instructions or computer programs are loaded or executed on a computer, all or part of the processes or functions described in the embodiments of the present application are generated.
- the computer may be a general purpose computer, special purpose computer, computer network, or other programmable device.
- the computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from a website, computer, server, or data center to another website, computer, server, or data center by wire or wirelessly.
- the computer-readable storage medium can be any available medium that can be accessed by a computer, or a data storage device such as a server or a data center containing one or more sets of available media.
- the usable media may be magnetic media (eg, floppy disks, hard disks, magnetic tapes), optical media (eg, DVDs), or semiconductor media.
- the semiconductor medium may be a solid state drive.
- Embodiments of the present application further provide a computer storage medium, wherein the computer storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute some or all of the steps of any method described in the above method embodiments; the computer includes an electronic device.
- Embodiments of the present application further provide a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to execute some or all of the steps of any method described in the above method embodiments.
- the computer program product may be a software installation package, and the computer includes an electronic device.
- the magnitude of the sequence numbers of the above processes does not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
- the disclosed method, apparatus and system may be implemented in other manners.
- the device embodiments described above are merely illustrative; for example, the division of the units is only a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed.
- the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, devices, or units, and may be in electrical, mechanical, or other forms.
- the units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
- each functional unit in each embodiment of the present invention may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit.
- the above-mentioned integrated unit may be implemented in the form of hardware, or may be implemented in the form of hardware plus software functional units.
- the above-mentioned integrated units implemented in the form of software functional units can be stored in a computer-readable storage medium.
- the above-mentioned software functional unit is stored in a storage medium, and includes several instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to execute some steps of the methods described in the various embodiments of the present invention.
- the aforementioned storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (ROM), a random access memory (RAM), a magnetic disk, or an optical disc.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Human Computer Interaction (AREA)
- Computer Graphics (AREA)
- Computer Hardware Design (AREA)
- Software Systems (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- User Interface Of Digital Computer (AREA)
- Telephone Function (AREA)
Abstract
An augmented reality image display method and related apparatus, applied to a mobile terminal. The method includes: acquiring the distance and orientation between the mobile terminal and a tag device (S201); displaying a positioning icon and distance indication information, where the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance (S202); determining the position of the tag device according to the distance and orientation (S203); and enabling a camera function and displaying the image of the current framing range together with augmented reality position indication information for the tag device (S204). In this way, positioning is fused with the real scene so that the tag's position is presented intuitively, the tedious step of building a three-dimensional coordinate model is eliminated, and the user experience is improved.
Description
The present application relates to the field of image display, and in particular to an augmented reality image display method and related apparatus.
Ultra Wide Band (UWB) is a carrierless wireless communication technology: instead of a sinusoidal carrier, it transmits data using nanosecond-level non-sinusoidal narrow pulses, so it occupies a very wide frequency spectrum and can be used for high-precision positioning. Although existing UWB technology can locate a tag, it does not support fusion with the real-time scene; even to fuse the located position into the real-time scene, three-dimensional coordinates must be established, or a spatial model must be built in advance. As a result, the position of the positioning tag in the real-time scene cannot be displayed intuitively, and the user experience is poor.
Summary of the Invention
Embodiments of the present application provide an augmented reality image display method and related apparatus, so as to intuitively display the position of a positioning tag in a real-time scene.
In a first aspect, an embodiment of the present application provides an augmented reality image display method, applied to a mobile terminal, the method including:
acquiring the distance and orientation between the mobile terminal and a tag device;
displaying a positioning icon and distance indication information, where the positioning icon indicates that positioning is in progress, and the distance indication information indicates the distance;
determining the position of the tag device according to the distance and orientation; and
enabling a camera function, and displaying the image of the current framing range and augmented reality position indication information of the tag device.
It can be seen that, in the embodiments of the present application, the mobile terminal first acquires the distance and orientation between itself and the tag device, then displays a positioning icon and distance indication information, then determines the position of the tag device according to the distance and orientation, and finally enables the camera function and displays the image of the current framing range together with the augmented reality position indication information of the tag device. In this way, positioning is fused with the real scene so that the tag's position is presented intuitively, the tedious step of building a three-dimensional coordinate model is eliminated, and the user experience is improved.
In a second aspect, an embodiment of the present application provides an augmented reality image display method, applied to a tag device, the method including:
sending, to a mobile terminal, the distance and/or orientation of the tag device relative to the mobile terminal;
where the distance and orientation are used by the mobile terminal to perform the following operations: displaying a positioning icon and distance indication information, where the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance; determining the position of the tag device according to the distance and orientation; and enabling a camera function and displaying the image of the current framing range and augmented reality position indication information of the tag device.
In a third aspect, an embodiment of the present application provides an augmented reality image display apparatus, applied to a mobile terminal, the apparatus including:
an acquiring unit, configured to acquire the distance and orientation between the mobile terminal and a tag device;
a display unit, configured to display a positioning icon and distance indication information, where the positioning icon indicates that positioning is in progress, and the distance indication information indicates the distance;
a determining unit, configured to determine the position of the tag device according to the distance and orientation; and
an enabling unit, configured to enable a camera function and display the image of the current framing range and augmented reality position indication information of the tag device.
In a fourth aspect, an embodiment of the present application provides an augmented reality image display apparatus, applied to a tag device, the apparatus including:
a sending unit, configured to send, to a mobile terminal, the distance and/or orientation of the tag device relative to the mobile terminal;
where the distance and orientation are used by the mobile terminal to perform the following operations: displaying a positioning icon and distance indication information, where the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance; determining the position of the tag device according to the distance and orientation; and enabling a camera function and displaying the image of the current framing range and augmented reality position indication information of the tag device.
In a fifth aspect, an embodiment of the present application provides a mobile terminal, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, the programs including instructions for performing the steps of any method of the first aspect of the embodiments of the present application.
In a sixth aspect, an embodiment of the present application provides a tag device, including a processor, a memory, a communication interface, and one or more programs, where the one or more programs are stored in the memory and configured to be executed by the processor, the programs including instructions for performing the steps of any method of the second aspect of the embodiments of the present application.
In a seventh aspect, an embodiment of the present application provides a chip, including a processor configured to call and run a computer program from a memory, so that a device equipped with the chip performs some or all of the steps described in any method of the first or second aspect of the embodiments of the present application.
In an eighth aspect, an embodiment of the present application provides a computer-readable storage medium storing a computer program/instructions that, when executed by a processor, implement the steps of any method of the first or second aspect.
In a ninth aspect, an embodiment of the present application provides a computer program product, including a computer program/instructions that, when executed by a processor, implement the steps of any method of the first or second aspect. The computer program product may be a software installation package.
In order to describe the technical solutions in the embodiments of the present application or the prior art more clearly, the accompanying drawings required for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below are only some embodiments of the present application, and those of ordinary skill in the art can derive other drawings from them without creative effort.
FIG. 1a is an architecture diagram of an augmented reality image display system provided by an embodiment of the present application;
FIG. 1b is a schematic diagram of the system framework of a mobile terminal provided by an embodiment of the present application;
FIG. 1c is a schematic structural diagram of an electronic device provided by an embodiment of the present application;
FIG. 2a is a schematic flowchart of an augmented reality image display method provided by an embodiment of the present application;
FIG. 2b is a schematic diagram of interface operations and effects provided by an embodiment of the present application;
FIG. 3 is a schematic flowchart of another augmented reality image display method provided by an embodiment of the present application;
FIG. 4 is a schematic flowchart of another augmented reality image display method provided by an embodiment of the present application;
FIG. 5 is a block diagram of the functional units of an augmented reality image display apparatus provided by an embodiment of the present application;
FIG. 6 is a block diagram of the functional units of another augmented reality image display apparatus provided by an embodiment of the present application;
FIG. 7 is a block diagram of the functional units of another augmented reality image display apparatus provided by an embodiment of the present application;
FIG. 8 is a block diagram of the functional units of another augmented reality image display apparatus provided by an embodiment of the present application.
To enable those skilled in the art to better understand the solutions of the present application, the technical solutions in the embodiments of the present application are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some rather than all of the embodiments of the present application. All other embodiments obtained by those of ordinary skill in the art based on the embodiments of the present application without creative effort shall fall within the protection scope of the present application.
The terms "first", "second", and the like in the specification, claims, and drawings of the present application are used to distinguish different objects rather than to describe a specific order. In addition, the terms "include" and "have" and any variations thereof are intended to cover non-exclusive inclusion. For example, a process, method, system, product, or device that includes a series of steps or units is not limited to the listed steps or units, but optionally further includes steps or units that are not listed, or optionally further includes other steps or units inherent to the process, method, product, or device.
Reference to an "embodiment" herein means that a particular feature, structure, or characteristic described in connection with the embodiment may be included in at least one embodiment of the present application. The appearances of this phrase in various places in the specification do not necessarily all refer to the same embodiment, nor are they independent or alternative embodiments mutually exclusive of other embodiments. Those skilled in the art understand, explicitly and implicitly, that the embodiments described herein may be combined with other embodiments.
For a better understanding of the solutions of the embodiments of the present application, related terms and concepts that may be involved are first introduced below.
Ultra Wide Band (UWB) is a carrierless wireless communication technology. According to the standard of the Federal Communications Commission of the United States, UWB operates in the 3.1-10.6 GHz band, the ratio of the -10 dB bandwidth to the system center frequency is greater than 20%, and the system bandwidth is at least 500 MHz. Data is transmitted using nanosecond- to microsecond-level non-sinusoidal narrow pulses.
Augmented Reality (AR) has three key technical elements: three-dimensional registration (tracking and registration technology), fused display of the virtual and the real, and human-computer interaction. The process is as follows: first, data of the real scene is collected through cameras and sensors and passed to a processor for analysis and reconstruction; then, accessories such as the camera, gyroscope, and sensors on an AR headset or smart mobile device update the data of the user's spatial position changes in the real environment in real time, from which the relative positions of the virtual scene and the real scene are derived, the coordinate systems are aligned, and the fusion of the virtual and real scenes is computed; finally, the composite image is presented to the user, enabling augmented reality interaction.
At present, positioning based on UWB technology does not support fusion with the real-time scene; even to fuse the located position into the real-time scene, three-dimensional coordinates must be established, or a spatial model must be built in advance. As a result, the position of the positioning tag in the real-time scene cannot be displayed intuitively, and the user experience is poor.
In view of the above problems, embodiments of the present application provide an augmented reality image display method and related apparatus, which are described in detail below with reference to the accompanying drawings.
Referring to FIG. 1a, FIG. 1a is an architecture diagram of an augmented reality image display system provided by an embodiment of the present application. The augmented reality image display system 100 includes a mobile terminal 101 and a tag device 102, which communicate with each other via UWB. The mobile terminal 101 includes a first UWB module, and the tag device 102 includes a second UWB module, so that the mobile terminal 101 and/or the tag device 102 can determine the distance and orientation between the mobile terminal and the tag device through the first UWB module and the second UWB module. The back of the mobile terminal 101 further includes a camera module for acquiring real-time images of the environment.
Referring to FIG. 1b, FIG. 1b is a schematic diagram of the system framework of a mobile terminal provided by an embodiment of the present application. The system framework of the mobile terminal includes a user layer, a middle layer, and a chip layer. The chip layer includes the low-level camera sensor and the low-level UWB chip, through which the raw data is collected, including camera scene data and tag distance and orientation data. The middle layer includes the camera driver and its input and output (IO) interaction as well as the UWB driver and its IO interaction; it is the driver layer, implementing the interaction with the chip layer and the control logic. The user layer mainly consists of the application, which implements the UI and presents the effect of the fusion algorithm. The middle layer processes the raw data acquired by the low-level camera sensor and the low-level UWB chip and then sends it on to the user-layer application for interface display.
Referring to FIG. 1c, FIG. 1c is a schematic structural diagram of an electronic device provided by an embodiment of the present application. The electronic device may be either the mobile terminal 101 or the tag device 102. As shown in the figure, the electronic device is applied to an augmented reality image display system and includes an application processor 120, a memory 130, a communication interface 140, and one or more programs 131, where the one or more programs 131 are stored in the memory 130 and configured to be executed by the application processor 120, and the one or more programs 131 include instructions for performing any step of the above method embodiments.
The communication unit is used to support communication between the first electronic device and other devices. The terminal may further include a storage unit for storing program codes and data of the terminal.
The processing unit may be the application processor 120 or a controller, for example a central processing unit (Central Processing Unit, CPU), a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application-specific integrated circuit (Application-Specific Integrated Circuit, ASIC), a field-programmable gate array (Field Programmable Gate Array, FPGA) or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various exemplary logical blocks, units, and circuits described in connection with the disclosure of the present application. The processor may also be a combination that implements computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication unit may be the communication interface 140, a transceiver, a transceiver circuit, or the like, and the storage unit may be the memory 130.
The memory 130 may be a volatile memory or a non-volatile memory, or may include both. The non-volatile memory may be a read-only memory (ROM), a programmable ROM (PROM), an erasable PROM (EPROM), an electrically erasable PROM (EEPROM), or a flash memory. The volatile memory may be a random access memory (RAM), which is used as an external cache. By way of example and not limitation, many forms of RAM are available, such as static RAM (SRAM), dynamic RAM (DRAM), synchronous DRAM (SDRAM), double data rate SDRAM (DDR SDRAM), enhanced SDRAM (ESDRAM), synchlink DRAM (SLDRAM), and direct rambus RAM (DR RAM).
In a specific implementation, the application processor 120 is configured to perform any step performed by the mobile terminal or the tag device in the above method embodiments, and, when performing data transmission such as sending, may optionally call the communication interface 140 to complete the corresponding operation.
Referring to FIG. 2a, FIG. 2a is a schematic flowchart of an augmented reality image display method provided by an embodiment of the present application, applied to a mobile terminal. As shown in the figure, the augmented reality image display method includes the following operations.
S201: Acquire the distance and orientation between the mobile terminal and a tag device.
The orientation may be obtained with the mobile terminal as the reference, i.e., the angle of the tag device relative to the mobile terminal, or with the tag device as the reference, i.e., the angle of the mobile terminal relative to the tag device. The distance and orientation data may be acquired by UWB positioning, or alternatively by Bluetooth, laser positioning, or the like.
S202: Display a positioning icon and distance indication information, where the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance.
After the positioning function of the mobile terminal is turned on, a corresponding positioning icon may be displayed on the interface, and the relevant distance indication information is displayed on the mobile terminal at the same time based on the previously acquired distance and orientation data; the distance indication information includes the current distance between the tag device and the mobile terminal.
S203: Determine the position of the tag device according to the distance and orientation.
The position of the tag device refers to its position relative to the mobile terminal, including the distance to the mobile terminal and the angle relative to the mobile terminal.
S204: Enable the camera function, and display the image of the current framing range and augmented reality position indication information of the tag device.
The mobile terminal includes a camera module and may turn on the camera upon user instruction or automatically according to program settings to acquire image information of the environment in which the mobile terminal is currently located. After the current environment image is acquired, the tag's position is fused with the current image based on the acquired tag position information, so that the physical tag can ultimately be found in the environment image.
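By way of illustration only, the following minimal sketch shows how such a fusion step could map a (distance, orientation) fix onto the camera preview; it is not part of the original disclosure, and the function name, the pinhole-projection simplification, and the fixed field-of-view value are all assumptions:

```python
import math

def tag_to_screen(distance_m: float, azimuth_deg: float,
                  screen_w: int, screen_h: int,
                  horizontal_fov_deg: float = 68.0):
    """Project a (distance, azimuth) UWB fix onto preview coordinates.

    azimuth_deg is the tag's bearing relative to the camera's optical
    axis (0 = straight ahead, positive = to the right). Returns (x, y)
    pixel coordinates, or None if the tag is outside the framing range.
    """
    half_fov = horizontal_fov_deg / 2.0
    if abs(azimuth_deg) > half_fov:
        return None  # tag lies outside the current framing range

    # Simple pinhole model: horizontal offset from the screen centre is
    # proportional to tan(azimuth), normalised by tan(half FOV).
    x_norm = math.tan(math.radians(azimuth_deg)) / math.tan(math.radians(half_fov))
    x = int(screen_w / 2 + x_norm * screen_w / 2)
    # Without an elevation angle, keep the marker on the horizon line.
    y = screen_h // 2
    return x, y

# Example: tag 1.2 m away, 15 degrees to the right, on a 1080x2400 preview.
print(tag_to_screen(1.2, 15.0, 1080, 2400))
```

In a real implementation the distance would additionally scale the size of the virtual icon; the bearing-to-pixel mapping above is only the core of placing the tag in the live image.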
It can be seen that, in this example, the mobile terminal first acquires the distance and orientation between itself and the tag device, then displays a positioning icon and distance indication information, then determines the position of the tag device according to the distance and orientation, and finally enables the camera function and displays the image of the current framing range together with the augmented reality position indication information of the tag device. In this way, positioning is fused with the real scene so that the tag's position is presented intuitively, the tedious step of building a three-dimensional coordinate model is eliminated, and the user experience is improved.
In a possible example, the augmented reality position indication information includes a position range indication icon and a position center indication icon; the position range indication icon covers the actual image information of the tag device, and the position center indication icon points to the actual image information of the tag device.
The position range indication icon covers all imagery of the tag device acquired by the camera module of the mobile terminal; the actual image information corresponding to the current tag device can be switched according to the position range indication icon. The position center indication icon indicates where the tag device is located within the current image information. The actual image information pointed to by the position center indication icon may be all of the image information covering the tag device, or a part of it, and the content shown in that part may be switched by user operation or according to the position range indication icon.
It can be seen that, in this example, displaying the position range indication icon and the position center indication icon on the interface of the mobile terminal allows the user to obtain all actual image information covering the tag device, or part of it as required. This not only fuses with the real scene and presents the tag's position intuitively, but also improves the user experience.
In a possible example, the method further includes: when it is detected that the tag device is not within the current framing range, displaying the image of the current framing range, and displaying a direction pointing icon according to the position, where the direction pointing icon indicates the offset of the tag device relative to the current framing range.
The position of the tag device may not be within the environment image acquired by the mobile terminal, in which case the tag device cannot be shown in the environment image. Only the image of the current framing range and the direction pointing icon are then displayed, showing the offset of the tag device from the current framing range. Different framing ranges yield correspondingly different offsets, so the angle of the tag device relative to each framing range can be determined. As the current framing range changes, the tag device may come within it, at which point the position of the tag device can be indicated within the current framing range.
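A minimal sketch of this decision logic (illustrative only; the icon names and the fixed field-of-view value are assumptions, not taken from the disclosure):

```python
def direction_hint(azimuth_deg: float, horizontal_fov_deg: float = 68.0) -> str:
    """Choose which direction pointing icon to show for an off-screen tag.

    azimuth_deg is the tag's bearing relative to the camera's optical axis
    (positive = to the right). An empty string means the tag is inside the
    framing range and the normal AR position indicator should be drawn.
    """
    half_fov = horizontal_fov_deg / 2.0
    if abs(azimuth_deg) <= half_fov:
        return ""             # tag is inside the current framing range
    return "arrow_right" if azimuth_deg > 0 else "arrow_left"

# Example: a tag 50 degrees to the left of a 68-degree-wide view.
print(direction_hint(-50.0))  # -> "arrow_left"
```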
It can be seen that, in this example, even if the position of the tag device is not within the current framing range, its offset from the current framing range can still be indicated. This satisfies different scene requirements, allows the user to obtain the spatial position of the tag device at any time, and improves the user experience.
In a possible example, the angular offset of the position of the tag device relative to the center line of the camera of the mobile terminal is determined, and corresponding position prompt information is output.
When the mobile terminal is a handheld electronic device such as a mobile phone, since such devices usually accompany the user, taking the center line of the camera of the mobile terminal as the reference to obtain the offset of the tag device is effectively equivalent to taking the user's current viewing angle as the center line and determining the offset of the tag device relative to the user's current line of sight.
It can be seen that, in this example, determining the angular offset of the tag device's position relative to the center line of the camera of the mobile terminal and outputting the corresponding position prompt information makes it convenient for the user to obtain the actual spatial position of the tag device in the current environment image and to locate it, improving the user experience.
In a possible example, the position prompt information includes at least one of the following: voice position prompt information, text position prompt information, and image position prompt information.
After the angular offset between the tag device and the center line of the camera is obtained, the user may be prompted by voice, for example that the tag device is 30 degrees to the front right of the mobile terminal, in the 1 o'clock direction of the current mobile terminal, or to the front right of the mobile terminal. The offset angle between the tag device and the mobile terminal may also be shown directly as text on the display interface, or the relative position of the tag device and the mobile terminal may be represented in the displayed image information in the form of arrows or the like.
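For instance, a minimal sketch of converting the measured bearing into such a clock-direction prompt (the function name and the exact wording are illustrative assumptions):

```python
def position_prompt(azimuth_deg: float, distance_m: float) -> str:
    """Turn a bearing (0 = straight ahead, positive = clockwise/right)
    into a clock-direction prompt of the kind described above."""
    # Map the bearing onto a 12-hour clock face: 30 degrees per hour,
    # with 0 degrees corresponding to the 12 o'clock direction.
    hour = round((azimuth_deg % 360) / 30) % 12
    hour = 12 if hour == 0 else hour
    return f"Tag device is {distance_m:.1f} m away, in the {hour} o'clock direction"

# Example: 30 degrees to the front right, 1.2 m away -> 1 o'clock direction.
print(position_prompt(30.0, 1.2))
```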
It can be seen that, in this example, prompting the user about the offset of the tag device relative to the mobile terminal through voice, text, or image position prompt information can intuitively convey the spatial position of the tag device in the environment and help the user locate the tag device quickly.
In a possible example, acquiring the distance and orientation between the local device and the tag device includes: determining the distance between the local device and the tag device through the first UWB module and the second UWB module according to the double-sided two-way ranging (DSTWR) algorithm; and determining the orientation of the tag device relative to the local device through the first UWB module and the second UWB module according to the phase difference of arrival (PDOA) algorithm.
The first UWB module is located on the mobile terminal and the second UWB module on the tag device. By rapidly switching between the antennas of the first UWB module, the mobile terminal obtains the timestamp difference between the arrivals of the second UWB module's signal at the two antennas, from which the path-length difference and then the angle are derived and stored in the relevant registers of the mobile terminal. The PDOA algorithm may be implemented on a dw3000 chip. Likewise, the mobile terminal can compute the position of the tag device from the time difference of two message exchanges; after multiple exchanges, multiple groups of distance information are computed and then filtered to obtain the distance of the tag device relative to the mobile terminal.
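The standard asymmetric DS-TWR time-of-flight estimate can be written down compactly; the sketch below illustrates that published formula and is not code from the disclosure, and the variable names are assumptions:

```python
SPEED_OF_LIGHT = 299_702_547.0  # approximate propagation speed in air, m/s

def dstwr_distance(t_round1: float, t_reply1: float,
                   t_round2: float, t_reply2: float) -> float:
    """Double-sided two-way ranging (DS-TWR) distance estimate.

    t_round1/t_reply1 come from the first poll/response exchange and
    t_round2/t_reply2 from the second (all in seconds, each measured
    locally by one UWB module, so no clock synchronisation is needed).
    """
    # Asymmetric DS-TWR time-of-flight formula; it cancels most of the
    # clock-offset error that single-sided two-way ranging suffers from.
    tof = (t_round1 * t_round2 - t_reply1 * t_reply2) / (
        t_round1 + t_round2 + t_reply1 + t_reply2)
    return tof * SPEED_OF_LIGHT

# Example: ~4 ns one-way flight (~1.2 m) with ~300 us reply delays.
t = 4e-9
print(dstwr_distance(300e-6 + 2 * t, 300e-6, 310e-6 + 2 * t, 310e-6))
```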
It can be seen that, in this example, the mobile terminal computes the distance and orientation between the local device and the tag device according to the DSTWR algorithm and the PDOA algorithm, which locates the tag device accurately and improves positioning precision.
In a possible example, acquiring the distance and orientation between the local device and the tag device includes: receiving the distance and orientation reported by the tag device, where the distance is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the DSTWR algorithm, and the orientation is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the PDOA algorithm.
The first UWB module is located on the mobile terminal and the second UWB module on the tag device. By rapidly switching between the antennas of the second UWB module, the tag device obtains the timestamp difference between the arrivals of the first UWB module's signal at the two antennas, from which the path-length difference and then the angle are derived and stored in the relevant registers of the tag device. The PDOA algorithm may be implemented on a dw3000 chip. Likewise, the tag device can compute its position from the time difference of two message exchanges; after multiple exchanges, multiple groups of distance information are computed and then filtered to obtain the distance of the tag device relative to the mobile terminal. Finally, the computed distance and orientation are sent to the mobile terminal, which uses them to display the position of the tag device in the real-time environment image it acquires.
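As an illustration of the PDOA principle described here (a sketch under the common half-wavelength antenna-spacing assumption; none of the names come from the disclosure or from any dw3000 API):

```python
import math

def pdoa_azimuth(phase_diff_rad: float,
                 antenna_spacing_m: float,
                 carrier_freq_hz: float = 6.5e9) -> float:
    """Estimate the angle of arrival (in degrees) from the phase difference
    measured between the two antennas of one UWB module.

    For UWB channel 5 (~6.5 GHz) the two antennas are typically spaced
    about half a wavelength apart, which keeps asin's argument in [-1, 1].
    """
    wavelength = 299_792_458.0 / carrier_freq_hz
    # Path-length difference implied by the measured phase difference...
    path_diff = phase_diff_rad * wavelength / (2 * math.pi)
    # ...and the far-field bearing that this geometry implies.
    return math.degrees(math.asin(path_diff / antenna_spacing_m))

# Example: a 45-degree phase lead across a half-wavelength baseline.
wl = 299_792_458.0 / 6.5e9
print(pdoa_azimuth(math.radians(45), wl / 2))  # about 14.5 degrees
```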
It can be seen that, in this example, the tag device computes the distance and orientation between the local device and the tag device according to the DSTWR algorithm and the PDOA algorithm, which locates the tag device accurately and improves positioning precision.
In a possible example, acquiring the distance and orientation between the local device and the tag device includes: determining the distance between the local device and the tag device through the first UWB module and the second UWB module according to the DSTWR algorithm; and receiving the orientation reported by the tag device, where the orientation is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the PDOA algorithm.
The first UWB module is located on the mobile terminal and the second UWB module on the tag device. By rapidly switching between the antennas of the first UWB module, the mobile terminal obtains the timestamp difference between the arrivals of the second UWB module's signal at the two antennas; from the time difference of two message exchanges it can compute the position of the tag device, and after multiple exchanges, multiple groups of distance information are computed and then filtered to obtain the distance of the tag device relative to the mobile terminal. The tag device, by rapidly switching between the antennas of the second UWB module, obtains the timestamp difference between the arrivals of the first UWB module's signal at its two antennas, from which the path-length difference and then the angle are derived and stored in the relevant registers of the tag device. The PDOA algorithm may be implemented on a dw3000 chip. Of course, analogously to the above scheme, the orientation between the local device and the tag device may instead be determined through the first UWB module and the second UWB module according to the PDOA algorithm, while the distance reported by the tag device is received, the distance being determined by the tag device through message interaction between the second UWB module and the first UWB module according to the DSTWR algorithm.
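The text does not specify how the multiple groups of distance information are filtered; one plausible choice, shown purely as an assumption, is a sliding median that suppresses multipath outliers:

```python
from collections import deque
from statistics import median

class DistanceFilter:
    """Sliding-median filter over the distance estimates produced by
    successive ranging exchanges."""

    def __init__(self, window: int = 5):
        self.samples = deque(maxlen=window)

    def update(self, distance_m: float) -> float:
        self.samples.append(distance_m)
        return median(self.samples)

# Example: one multipath outlier among five ranging rounds.
f = DistanceFilter()
for d in (1.21, 1.19, 3.40, 1.22, 1.20):
    smoothed = f.update(d)
print(round(smoothed, 2))  # -> 1.21
```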
It can be seen that, in this example, the mobile terminal computes the distance between the local device and the tag device according to the DSTWR algorithm, and the tag device computes the orientation between the local device and the tag device according to the PDOA algorithm, which locates the tag device accurately and improves positioning precision.
In a possible example, while displaying the image of the current framing range and the position indication information of the tag device, the mobile terminal also displays the distance.
When the position of the tag device in the current actual image is shown on the interface of the mobile terminal, the distance between the tag device and the mobile terminal may be displayed at the same time.
It can be seen that, in this example, displaying the distance between the tag device and the mobile terminal while the tag device is being located in the image information on the interface of the mobile terminal can show changes in their spatial relationship in real time, helping the user locate the tag device accurately and improving the user experience.
In a possible example, the camera function is enabled as follows: sliding on the display screen of the mobile terminal from the bottom of the screen toward the top.
It can be seen that, in this example, sliding from the bottom of the display screen toward the top enables the camera function and starts acquiring real-time images. The camera can thus be opened quickly, which is convenient for the user and improves the user experience.
In a possible example, the method further includes: establishing a communication connection between the Bluetooth module of the mobile terminal and the Bluetooth module of the tag device; waking the tag device from a sleep state through the communication connection, and performing communication configuration on the first UWB module of the mobile terminal and the second UWB module of the tag device.
The connection between the mobile terminal and the tag device may be a Bluetooth connection, through which the tag device is woken and its parameters configured. When not connected over Bluetooth, the tag device stays in a low-power mode; after the Bluetooth connection is made, the tag device is woken and the network parameters are configured, so that the UWB chips in the tag and the mobile phone can communicate and run the ranging and angle-measurement state machines. After the mobile terminal enters the application, the underlying UWB module starts working.
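A minimal sketch of this wake-up and configuration flow (every class, method, characteristic name, and parameter here is an illustrative assumption, not an actual BLE or UWB API):

```python
from dataclasses import dataclass

@dataclass
class UwbConfig:
    """Network parameters pushed to both UWB modules (illustrative)."""
    channel: int = 5
    session_id: int = 0x1234
    ranging_interval_ms: int = 100

class FakeBleLink:
    """Stand-in for an established BLE connection, for demonstration."""
    def write(self, characteristic: str, payload: bytes) -> None:
        print(characteristic, payload.hex())

class TagSession:
    def __init__(self, ble_link):
        self.ble = ble_link  # an already-established BLE connection

    def wake_and_configure(self, cfg: UwbConfig) -> None:
        # The tag sleeps in low-power mode until the BLE link wakes it.
        self.ble.write("wakeup", b"\x01")
        # Push the network parameters so that both UWB chips agree on the
        # session, then start the ranging and angle-measurement state machine.
        self.ble.write("uwb_config",
                       bytes([cfg.channel]) +
                       cfg.session_id.to_bytes(2, "big") +
                       cfg.ranging_interval_ms.to_bytes(2, "big"))
        self.ble.write("start_ranging", b"\x01")

TagSession(FakeBleLink()).wake_and_configure(UwbConfig())
```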
It can be seen that, in this example, the tag device is woken and its parameters configured only after a Bluetooth connection is established, which reduces the power consumption of the tag device, extends its usage time, and improves the user experience.
In a possible example, the tag device is in a low-power mode while in the sleep state.
It can be seen that, in this example, having the tag device in a low-power mode while in the sleep state extends the usage time of the tag device and improves the user experience.
A detailed description follows with reference to a specific example.
Referring to FIG. 2b, FIG. 2b is a schematic diagram of interface operations and effects provided by an embodiment of the present application. As shown in (1) of FIG. 2b, after the tag device is placed on a table and powered on, the target application can be tapped on the mobile phone side to open the application. Then, as shown in (2) of FIG. 2b, tapping the connection bar at the top of the phone connects the phone's Bluetooth to the tag device's Bluetooth, waking the tag device to start working and delivering the parameter configuration; at this point, information about the room in which the mobile terminal is located appears at the bottom of the user interface, e.g., in the figure the mobile terminal is in the office. As shown in (3) of FIG. 2b, after the Bluetooth connection succeeds, the tag device enters the working state; in the figure, the tag device connected to the mobile terminal is 80:E1:26:19. The position range indication icon and the position center indication icon then appear on the user interface, and the distance from the mobile terminal to the tag device is additionally shown at the bottom, e.g., 1.2 meters in the figure. The phone camera can then be opened by swiping up from the bottom of the phone interface, so that the user interface displays the scene being recorded by the camera in real time; if the tag is within the captured scene, a virtual tag-positioning icon appears on the interface at the same time, as shown in (4) of FIG. 2b. If the tag device is not within the current scene, a direction indication icon appears in the middle of the interface, prompting the offset of the tag device relative to the mobile terminal, i.e., whether the tag is to the left or right of the current scene. The user can adjust the camera direction according to the direction icon until the scene containing the tag device is found, after which the tag device's position and other information are displayed. As shown in (5) of FIG. 2b, the physical tag device can also be seen on the interface, together with real-scene elements such as the desk, computer monitor, and printer, and the virtual icon of the tag device's position information.
Referring to FIG. 3, FIG. 3 is a schematic flowchart of another augmented reality image display method provided by an embodiment of the present application. This augmented reality image display method is applied to a mobile terminal and includes the following steps:
S301: Determine the distance between the local device and the tag device through the first UWB module and the second UWB module according to the double-sided two-way ranging (DSTWR) algorithm.
S302: Determine the orientation of the tag device relative to the local device through the first UWB module and the second UWB module according to the phase difference of arrival (PDOA) algorithm.
S303: Display a positioning icon and distance indication information, where the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance.
S304: Determine the position of the tag device according to the distance and orientation.
S305: Enable the camera function, and display the image of the current framing range and augmented reality position indication information of the tag device.
It can be seen that, in this example, the mobile terminal determines the distance according to the DSTWR algorithm and the orientation according to the PDOA algorithm, then displays the positioning icon and distance indication information on the interface, then determines the position of the tag device according to the distance and orientation, and finally enables the camera function and displays the image of the current framing range together with the augmented reality position indication information of the tag device. In this way, positioning is fused with the real scene so that the tag's position is presented intuitively, the tedious step of building a three-dimensional coordinate model is eliminated, and the user experience is improved.
Referring to FIG. 4, FIG. 4 is a schematic flowchart of another augmented reality image display method provided by an embodiment of the present application. This augmented reality image display method is applied to a tag device and includes the following steps:
S401: Send, to a mobile terminal, the distance and/or orientation of the tag device relative to the mobile terminal; where the distance and orientation are used by the mobile terminal to perform the following operations: displaying a positioning icon and distance indication information, where the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance; determining the position of the tag device according to the distance and orientation; and enabling a camera function and displaying the image of the current framing range and augmented reality position indication information of the tag device.
The tag device may use the second UWB module located on the tag device and the first UWB module located on the mobile terminal to compute, according to the DSTWR algorithm and the PDOA algorithm respectively, the distance and orientation of the tag device relative to the mobile terminal, and report the computed distance and orientation to the mobile terminal, so that the mobile terminal can display the tag's position information within the real-time image information it acquires. Alternatively, the tag device computes only the orientation or only the distance and sends it to the mobile terminal, with the remaining quantity computed by the mobile terminal. Alternatively, the second UWB module sends signals from which the mobile terminal can compute the distance and orientation.
It can be seen that, in this example, the tag device sends to the mobile terminal the distance and/or orientation of the tag device relative to the mobile terminal, so that the tag device can be located quickly and accurately, fused with the real scene, and its position presented intuitively, improving the user experience.
In a possible example, before the sending, to the mobile terminal, of the distance and/or orientation of the tag device relative to the mobile terminal, the method further includes: determining the distance between the tag device and the mobile terminal through the second UWB module and the first UWB module according to the DSTWR algorithm; and/or determining the orientation of the tag device relative to the mobile terminal through the second UWB module and the first UWB module according to the PDOA algorithm.
In a possible example, the method further includes: establishing a communication connection between the Bluetooth module of the tag device and the Bluetooth module of the mobile terminal; receiving a wake-up indication from the mobile terminal through the communication connection, and waking the tag device from a sleep state according to the wake-up indication; and receiving UWB communication configuration information, and performing communication configuration on the second UWB module of the tag device and the first UWB module of the mobile terminal according to the communication configuration information.
In a possible example, the tag device is in a low-power mode while in the sleep state.
An embodiment of the present application provides an augmented reality image display apparatus, which may be a mobile terminal. Specifically, the augmented reality image display apparatus is configured to perform the steps performed by the terminal in the above augmented reality image display method. The augmented reality image display apparatus provided by the embodiments of the present application may include modules corresponding to the respective steps.
In the embodiments of the present application, the augmented reality image display apparatus may be divided into functional modules according to the above method examples; for example, each functional module may correspond to a respective function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in the form of hardware or in the form of software functional modules. The division of modules in the embodiments of the present application is schematic and is only a logical function division; there may be other division manners in actual implementation.
In the case where each functional module is divided corresponding to a respective function, FIG. 5 shows a possible schematic structural diagram of the augmented reality image display apparatus involved in the above embodiment. As shown in FIG. 5, the augmented reality image display apparatus 5 includes:
an acquiring unit 50, configured to acquire the distance and orientation between the mobile terminal and a tag device;
a display unit 51, configured to display a positioning icon and distance indication information, where the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance;
a determining unit 52, configured to determine the position of the tag device according to the distance and orientation; and
an enabling unit 53, configured to enable a camera function and display the image of the current framing range and augmented reality position indication information of the tag device.
In a possible example, the augmented reality position indication information includes a position range indication icon and a position center indication icon; the position range indication icon covers the actual image information of the tag device, and the position center indication icon points to the actual image information of the tag device.
In a possible example, the apparatus is further configured to: when it is detected that the tag device is not within the current framing range, display the image of the current framing range, and display a direction pointing icon according to the position, where the direction pointing icon indicates the offset of the tag device relative to the current framing range.
In a possible example, the apparatus is further configured to: determine the angular offset of the position of the tag device relative to the center line of the camera of the mobile terminal, and output corresponding position prompt information.
In a possible example, the position prompt information includes at least one of the following: voice position prompt information, text position prompt information, and image position prompt information.
In a possible example, in terms of acquiring the distance and orientation between the local device and the tag device, the acquiring unit 50 is specifically configured to: determine the distance between the local device and the tag device through the first UWB module and the second UWB module according to the double-sided two-way ranging (DSTWR) algorithm; and determine the orientation of the tag device relative to the local device through the first UWB module and the second UWB module according to the phase difference of arrival (PDOA) algorithm.
In a possible example, in terms of acquiring the distance and orientation between the local device and the tag device, the acquiring unit 50 is specifically configured to: receive the distance and orientation reported by the tag device, where the distance is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the DSTWR algorithm, and the orientation is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the PDOA algorithm.
In a possible example, in terms of acquiring the distance and orientation between the local device and the tag device, the acquiring unit 50 is specifically configured to: determine the distance between the local device and the tag device through the first UWB module and the second UWB module according to the DSTWR algorithm; and receive the orientation reported by the tag device, where the orientation is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the PDOA algorithm.
In a possible example, while displaying the image of the current framing range and the position indication information of the tag device, the mobile terminal also displays the distance.
In a possible example, the camera function is enabled as follows: sliding on the display screen of the mobile terminal from the bottom of the screen toward the top.
In a possible example, the apparatus is further configured to: establish a communication connection between the Bluetooth module of the mobile terminal and the Bluetooth module of the tag device; wake the tag device from a sleep state through the communication connection, and perform communication configuration on the first UWB module of the mobile terminal and the second UWB module of the tag device.
In a possible example, the tag device is in a low-power mode while in the sleep state.
In the case of using integrated units, a schematic structural diagram of another augmented reality image display apparatus provided by an embodiment of the present application is shown in FIG. 6. In FIG. 6, the augmented reality image display apparatus 6 includes a processing module 60 and a communication module 61. The processing module 60 is configured to control and manage the actions of the augmented reality image display apparatus, for example the steps performed by the acquiring unit 50, the display unit 51, the determining unit 52, and the enabling unit 53, and/or other processes for performing the techniques described herein. The communication module 61 is configured to support interaction between the augmented reality image display apparatus and other devices. As shown in FIG. 6, the augmented reality image display apparatus may further include a storage module 62, which stores the program codes and data of the augmented reality image display apparatus.
The processing module 60 may be a processor or a controller, for example a central processing unit (Central Processing Unit, CPU), a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various exemplary logical blocks, modules, and circuits described in connection with the disclosure of the present application. The processor may also be a combination that implements computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication module 61 may be a transceiver, an RF circuit, a communication interface, or the like. The storage module 62 may be a memory.
All relevant content of the scenarios involved in the above method embodiments may be cited in the functional descriptions of the corresponding functional modules, and is not repeated here. Both the augmented reality image display apparatus 5 and the augmented reality image display apparatus 6 can perform the steps performed by the terminal in the augmented reality image display methods shown in FIG. 2a and FIG. 3.
An embodiment of the present application provides another augmented reality image display apparatus, which may be a tag device. Specifically, the augmented reality image display apparatus is configured to perform the steps performed by the terminal in the above augmented reality image display method. The augmented reality image display apparatus provided by the embodiments of the present application may include modules corresponding to the respective steps.
In the embodiments of the present application, the augmented reality image display apparatus may be divided into functional modules according to the above method examples; for example, each functional module may correspond to a respective function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in the form of hardware or in the form of software functional modules. The division of modules in the embodiments of the present application is schematic and is only a logical function division; there may be other division manners in actual implementation.
In the case where each functional module is divided corresponding to a respective function, FIG. 7 shows a possible schematic structural diagram of the augmented reality image display apparatus involved in the above embodiment. As shown in FIG. 7, the augmented reality image display apparatus 7 includes a sending unit 70, configured to send, to a mobile terminal, the distance and/or orientation of the tag device relative to the mobile terminal; where the distance and orientation are used by the mobile terminal to perform the following operations: displaying a positioning icon and distance indication information, where the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance; determining the position of the tag device according to the distance and orientation; and enabling a camera function and displaying the image of the current framing range and augmented reality position indication information of the tag device.
In a possible example, before the sending, to the mobile terminal, of the distance and/or orientation of the tag device relative to the mobile terminal, the apparatus is further configured to: determine the distance between the tag device and the mobile terminal through the second UWB module and the first UWB module according to the DSTWR algorithm; and/or determine the orientation of the tag device relative to the mobile terminal through the second UWB module and the first UWB module according to the PDOA algorithm.
In a possible example, the apparatus is further configured to: establish a communication connection between the Bluetooth module of the tag device and the Bluetooth module of the mobile terminal; receive a wake-up indication from the mobile terminal through the communication connection, and wake the tag device from a sleep state according to the wake-up indication; and receive UWB communication configuration information, and perform communication configuration on the second UWB module of the tag device and the first UWB module of the mobile terminal according to the communication configuration information.
In a possible example, the tag device is in a low-power mode while in the sleep state.
In the case of using integrated units, a schematic structural diagram of another augmented reality image display apparatus provided by an embodiment of the present application is shown in FIG. 8. In FIG. 8, the augmented reality image display apparatus 8 includes a processing module 80 and a communication module 81. The processing module 80 is configured to control and manage the actions of the augmented reality image display apparatus, for example the steps performed by the sending unit 70, and/or other processes for performing the techniques described herein. The communication module 81 is configured to support interaction between the augmented reality image display apparatus and other devices. As shown in FIG. 8, the augmented reality image display apparatus may further include a storage module 82, which stores the program codes and data of the augmented reality image display apparatus.
The processing module 80 may be a processor or a controller, for example a central processing unit (Central Processing Unit, CPU), a general-purpose processor, a digital signal processor (Digital Signal Processor, DSP), an ASIC, an FPGA or other programmable logic device, a transistor logic device, a hardware component, or any combination thereof. It may implement or execute the various exemplary logical blocks, modules, and circuits described in connection with the disclosure of the present application. The processor may also be a combination that implements computing functions, for example a combination of one or more microprocessors, or a combination of a DSP and a microprocessor. The communication module 81 may be a transceiver, an RF circuit, a communication interface, or the like. The storage module 82 may be a memory.
All relevant content of the scenarios involved in the above method embodiments may be cited in the functional descriptions of the corresponding functional modules, and is not repeated here. Both the augmented reality image display apparatus 7 and the augmented reality image display apparatus 8 can perform the steps performed by the terminal in the augmented reality image display method shown in FIG. 4.
An embodiment of the present application provides another augmented reality image display apparatus, which may be a tag device. Specifically, the augmented reality image display apparatus is configured to perform the steps performed by the terminal in the above augmented reality image display method. The augmented reality image display apparatus provided by the embodiments of the present application may include modules corresponding to the respective steps.
In the embodiments of the present application, the augmented reality image display apparatus may be divided into functional modules according to the above method examples; for example, each functional module may correspond to a respective function, or two or more functions may be integrated into one processing module. The integrated modules may be implemented in the form of hardware or in the form of software functional modules. The division of modules in the embodiments of the present application is schematic and is only a logical function division; there may be other division manners in actual implementation.
The above embodiments may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. When implemented by software, the above embodiments may be implemented in whole or in part in the form of a computer program product. The computer program product includes one or more computer instructions or computer programs. When the computer instructions or computer programs are loaded or executed on a computer, the processes or functions described in the embodiments of the present application are produced in whole or in part. The computer may be a general-purpose computer, a special-purpose computer, a computer network, or another programmable apparatus. The computer instructions may be stored in a computer-readable storage medium or transmitted from one computer-readable storage medium to another; for example, the computer instructions may be transmitted from one website, computer, server, or data center to another by wired or wireless means. The computer-readable storage medium may be any available medium accessible by a computer, or a data storage device such as a server or a data center that contains one or more sets of available media. The available media may be magnetic media (e.g., floppy disks, hard disks, magnetic tapes), optical media (e.g., DVDs), or semiconductor media. The semiconductor medium may be a solid-state drive.
An embodiment of the present application further provides a computer storage medium, where the computer storage medium stores a computer program for electronic data exchange, and the computer program causes a computer to execute some or all of the steps of any method described in the above method embodiments; the computer includes an electronic device.
An embodiment of the present application further provides a computer program product, where the computer program product includes a non-transitory computer-readable storage medium storing a computer program, and the computer program is operable to cause a computer to execute some or all of the steps of any method described in the above method embodiments. The computer program product may be a software installation package, and the computer includes an electronic device.
It should be understood that, in the various embodiments of the present application, the magnitude of the sequence numbers of the above processes does not imply an order of execution; the execution order of each process should be determined by its function and internal logic, and should not constitute any limitation on the implementation of the embodiments of the present application.
In the several embodiments provided in the present application, it should be understood that the disclosed method, apparatus, and system may be implemented in other manners. For example, the apparatus embodiments described above are merely illustrative; for example, the division of the units is only a logical function division, and there may be other division manners in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not performed. In addition, the mutual coupling, direct coupling, or communication connection shown or discussed may be indirect coupling or communication connection through some interfaces, apparatuses, or units, and may be in electrical, mechanical, or other forms.
The units described as separate components may or may not be physically separated, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, the functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist physically on its own, or two or more units may be integrated into one unit. The above integrated unit may be implemented in the form of hardware, or in the form of hardware plus software functional units.
The above integrated unit implemented in the form of a software functional unit may be stored in a computer-readable storage medium. The software functional unit is stored in a storage medium and includes several instructions to cause a computer device (which may be a personal computer, a server, a network device, or the like) to execute some steps of the methods described in the embodiments of the present invention. The storage medium includes various media that can store program code, such as a USB flash drive, a removable hard disk, a read-only memory (Read-Only Memory, ROM), a random access memory (Random Access Memory, RAM), a magnetic disk, or an optical disc.
Although the present invention is disclosed as above, it is not limited thereto. Any person skilled in the art can readily conceive of changes or substitutions without departing from the spirit and scope of the present invention, and various changes and modifications may be made, including combinations of the above different functions and implementation steps, and implementations in software and hardware, all of which fall within the protection scope of the present invention.
Claims (21)
- An augmented reality image display method, applied to a mobile terminal, the method comprising: acquiring the distance and orientation between the mobile terminal and a tag device; displaying a positioning icon and distance indication information, wherein the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance; determining the position of the tag device according to the distance and orientation; and enabling a camera function, and displaying the image of the current framing range and augmented reality position indication information of the tag device.
- The method according to claim 1, wherein the augmented reality position indication information comprises a position range indication icon and a position center indication icon; the position range indication icon covers the actual image information of the tag device, and the position center indication icon points to the actual image information of the tag device.
- The method according to claim 1 or 2, wherein the method further comprises: when it is detected that the tag device is not within the current framing range, displaying the image of the current framing range, and displaying a direction pointing icon according to the position, wherein the direction pointing icon indicates the offset of the tag device relative to the current framing range.
- The method according to claim 3, wherein the method further comprises: determining the angular offset of the position of the tag device relative to the center line of the camera of the mobile terminal, and outputting corresponding position prompt information.
- The method according to claim 4, wherein the position prompt information comprises at least one of the following: voice position prompt information, text position prompt information, and image position prompt information.
- The method according to any one of claims 1-5, wherein the acquiring the distance and orientation between the local device and the tag device comprises: determining the distance between the local device and the tag device through the first UWB module and the second UWB module according to the double-sided two-way ranging (DSTWR) algorithm; and determining the orientation of the tag device relative to the local device through the first UWB module and the second UWB module according to the phase difference of arrival (PDOA) algorithm.
- The method according to any one of claims 1-5, wherein the acquiring the distance and orientation between the local device and the tag device comprises: receiving the distance and orientation reported by the tag device, wherein the distance is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the DSTWR algorithm, and the orientation is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the PDOA algorithm.
- The method according to any one of claims 1-5, wherein the acquiring the distance and orientation between the local device and the tag device comprises: determining the distance between the local device and the tag device through the first UWB module and the second UWB module according to the DSTWR algorithm; and receiving the orientation reported by the tag device, wherein the orientation is determined by the tag device through message interaction between the second UWB module and the first UWB module according to the PDOA algorithm.
- The method according to any one of claims 1-8, wherein, while displaying the image of the current framing range and the position indication information of the tag device, the mobile terminal also displays the distance.
- The method according to any one of claims 1-8, wherein the camera function is enabled as follows: sliding on the display screen of the mobile terminal from the bottom of the screen toward the top.
- The method according to any one of claims 6-8, wherein the method further comprises: establishing a communication connection between the Bluetooth module of the mobile terminal and the Bluetooth module of the tag device; and waking the tag device from a sleep state through the communication connection, and performing communication configuration on the first UWB module of the mobile terminal and the second UWB module of the tag device.
- The method according to claim 11, wherein the tag device is in a low-power mode while in the sleep state.
- An augmented reality image display method, applied to a tag device, the method comprising: sending, to a mobile terminal, the distance and/or orientation of the tag device relative to the mobile terminal; wherein the distance and orientation are used by the mobile terminal to perform the following operations: displaying a positioning icon and distance indication information, wherein the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance; determining the position of the tag device according to the distance and orientation; and enabling a camera function and displaying the image of the current framing range and augmented reality position indication information of the tag device.
- The method according to claim 13, wherein, before the sending, to the mobile terminal, of the distance and/or orientation of the tag device relative to the mobile terminal, the method further comprises: determining the distance between the tag device and the mobile terminal through the second UWB module and the first UWB module according to the DSTWR algorithm; and/or determining the orientation of the tag device relative to the mobile terminal through the second UWB module and the first UWB module according to the PDOA algorithm.
- The method according to claim 13 or 14, wherein the method further comprises: establishing a communication connection between the Bluetooth module of the tag device and the Bluetooth module of the mobile terminal; receiving a wake-up indication from the mobile terminal through the communication connection, and waking the tag device from a sleep state according to the wake-up indication; and receiving UWB communication configuration information, and performing communication configuration on the second UWB module of the tag device and the first UWB module of the mobile terminal according to the communication configuration information.
- The method according to claim 15, wherein the tag device is in a low-power mode while in the sleep state.
- An augmented reality image display apparatus, applied to a mobile terminal, the apparatus comprising: an acquiring unit, configured to acquire the distance and orientation between the mobile terminal and a tag device; a display unit, configured to display a positioning icon and distance indication information, wherein the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance; a determining unit, configured to determine the position of the tag device according to the distance and orientation; and an enabling unit, configured to enable a camera function and display the image of the current framing range and augmented reality position indication information of the tag device.
- An augmented reality image display apparatus, applied to a tag device, the apparatus comprising: a sending unit, configured to send, to a mobile terminal, the distance and/or orientation of the tag device relative to the mobile terminal; wherein the distance and orientation are used by the mobile terminal to perform the following operations: displaying a positioning icon and distance indication information, wherein the positioning icon indicates that positioning is in progress and the distance indication information indicates the distance; determining the position of the tag device according to the distance and orientation; and enabling a camera function and displaying the image of the current framing range and augmented reality position indication information of the tag device.
- A mobile terminal, comprising a processor, a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps of the method according to any one of claims 1-12.
- A tag device, comprising a processor, a memory, and one or more programs, wherein the one or more programs are stored in the memory and configured to be executed by the processor, the programs comprising instructions for performing the steps of the method according to any one of claims 13-16.
- A computer-readable storage medium, on which a computer program/instructions are stored, wherein the computer program/instructions, when executed by a processor, implement the method according to any one of claims 1-12 or 13-16.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011167661.5 | 2020-10-27 | ||
CN202011167661.5A CN114489314B (zh) | 2020-10-27 | Augmented reality image display method and related apparatus
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022088989A1 true WO2022088989A1 (zh) | 2022-05-05 |
Family
ID=81381847
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/CN2021/116504 WO2022088989A1 (zh) | 2021-09-03 | Augmented reality image display method and related apparatus
Country Status (2)
Country | Link |
---|---|
CN (1) | CN114489314B (zh) |
WO (1) | WO2022088989A1 (zh) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115546304A (zh) * | 2022-11-24 | 2022-12-30 | Hainayun IoT Technology Co., Ltd. | Method and apparatus for detection and positioning based on the three-dimensional coordinate system in which a camera is located |
CN115996372A (zh) * | 2023-03-16 | 2023-04-21 | Actions Technology Co., Ltd. | Electronic device and data transmission method for an electronic device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107209246A (zh) * | 2014-11-13 | 2017-09-26 | Nokia Technologies Oy | Orientation calculation using Bluetooth Low Energy |
CN108091246A (zh) * | 2017-12-28 | 2018-05-29 | Suzhou Image Laser Technology Co., Ltd. | Augmented reality label, method for preparing an augmented reality label, and method for detecting an augmented reality label |
CN109242081A (zh) * | 2018-07-13 | 2019-01-18 | Yanshan University | Article positioning device based on wireless power supply technology |
CN110248165A (zh) * | 2019-07-02 | 2019-09-17 | Gosuncn Technology Group Co., Ltd. | Label display method, apparatus, device, and storage medium |
US10598507B1 (en) * | 2018-11-28 | 2020-03-24 | Carl LaMont | Systems, methods, and apparatus for locating objects |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN107229706A (zh) * | 2017-05-25 | 2017-10-03 | Guangzhou UCWeb Computer Technology Co., Ltd. | Augmented reality-based information acquisition method and apparatus |
- 2020-10-27 CN CN202011167661.5A patent/CN114489314B/zh active Active
- 2021-09-03 WO PCT/CN2021/116504 patent/WO2022088989A1/zh active Application Filing
Also Published As
Publication number | Publication date |
---|---|
CN114489314B (zh) | 2024-05-28 |
CN114489314A (zh) | 2022-05-13 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
 | 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21884720; Country of ref document: EP; Kind code of ref document: A1
 | NENP | Non-entry into the national phase | Ref country code: DE
 | 122 | Ep: pct application non-entry in european phase | Ref document number: 21884720; Country of ref document: EP; Kind code of ref document: A1