WO2021184388A1 - Image display method and apparatus, and portable electronic device - Google Patents

Image display method and apparatus, and portable electronic device

Info

Publication number
WO2021184388A1
Authority
WO
WIPO (PCT)
Prior art keywords
tag
electronic tag
electronic device
electronic
portable electronic
Prior art date
Application number
PCT/CN2020/080518
Other languages
English (en)
Chinese (zh)
Inventor
邵帅
Original Assignee
Oppo广东移动通信有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Oppo广东移动通信有限公司 filed Critical Oppo广东移动通信有限公司
Priority to PCT/CN2020/080518 priority Critical patent/WO2021184388A1/fr
Priority to PCT/CN2020/083372 priority patent/WO2021184442A1/fr
Priority to CN202080087336.XA priority patent/CN114830067A/zh
Publication of WO2021184388A1 publication Critical patent/WO2021184388A1/fr


Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 17/00 - Methods or arrangements for effecting co-operative working between equipments covered by two or more of main groups G06K1/00 - G06K15/00, e.g. automatic card files incorporating conveying and reading operations
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06K - GRAPHICAL DATA READING; PRESENTATION OF DATA; RECORD CARRIERS; HANDLING RECORD CARRIERS
    • G06K 7/00 - Methods or arrangements for sensing record carriers, e.g. for reading patterns

Definitions

  • This application relates to the field of electronic tag tracking, and in particular to an image display method and apparatus, and a portable electronic device.
  • An electronic tag tracking system is mainly used for item tracking.
  • In an electronic tag tracking system, the electronic tag is placed on the tracked item, and the user uses a reader to locate the electronic tag and complete the item tracking. The final result is presented to the user in the form of pictures, lights, or sounds.
  • However, the electronic tag tracking system does not present the tracking process and results intuitively; that is, the presentation of the tracking results does not take into account the real environment the user is in, which degrades the user experience.
  • The embodiments of the present application provide an image display method and apparatus, and a portable electronic device, to at least solve the problem in the related art that the presentation of the tracking results of an electronic tag tracking system does not consider the user's real environment, which degrades the user experience.
  • According to one aspect, a portable electronic device is provided, including: an image acquisition module for acquiring image data of a real scene; a signal transceiving module for communicating with a reader or with a first electronic tag; a processing module for obtaining the relative position of the portable electronic device and the first electronic tag based on the signals received by the signal transceiving module, and for fusing the position information corresponding to the relative position with the image data of the real scene to obtain a first image; and a display module for displaying the first image.
  • In an optional embodiment, the signal transceiving module includes: one or more second electronic tags for sending a first radio frequency signal to the reader; and a wireless communication circuit for receiving the relative position determined by the reader based on the first radio frequency signal and a second radio frequency signal and sending that relative position to the processing module, where the second radio frequency signal is the radio frequency signal sent by the first electronic tag to the reader.
  • each of the one or more second electronic tags has a tag identifier, wherein each tag identifier corresponds to an installation position of the second electronic tag on the portable electronic device.
  • In an optional embodiment, the portable electronic device includes augmented reality (AR) glasses, and the AR glasses include at least: two temple structures, two lens ring structures, and a nose bridge structure for connecting the two lens ring structures.
  • In an optional embodiment, the signal transceiving module includes a wireless carrier chip for sending a third radio frequency signal to the first electronic tag and receiving the reflected signal returned by the first electronic tag after it receives the third radio frequency signal; the processing module is further configured to determine the relative position based on the third radio frequency signal and the reflected signal.
  • In an optional embodiment, the portable electronic device includes AR glasses, where the AR glasses include at least: two temple structures, two lens ring structures, and a nose bridge structure for connecting the two lens ring structures; the wireless carrier chip includes a microprocessor and at least three wireless carrier antennas, where at least one of the at least three wireless carrier antennas is arranged in the nose bridge structure and the remaining wireless carrier antennas are arranged in the two temple structures; and the microprocessor is arranged in either one of the two temple structures.
  • In an optional embodiment, the portable electronic device includes a smart mobile terminal, and the wireless carrier chip includes a microprocessor and at least one wireless carrier antenna; the at least one wireless carrier antenna is arranged in the back shell of the smart mobile terminal, and the microprocessor is arranged on the main circuit board of the smart mobile terminal.
  • the relative position includes at least one of the following: a relative angle and a relative distance between the first electronic tag and the portable electronic device.
  • the first electronic tag is an electronic tag set on the target tracking device.
  • According to another aspect, an electronic tag tracking system is provided, including a portable electronic device, a reader, and a first electronic tag, where the portable electronic device is the portable electronic device described above. The reader is used to collect the tag information of the first electronic tag and the tag information of the second electronic tag arranged on the portable electronic device, determine the relative position of the portable electronic device and the first electronic tag based on the tag information of the first electronic tag and the tag information of the second electronic tag, and send the relative position to the portable electronic device.
  • In an optional embodiment, the reader is configured to determine a first relative position of the reader and the portable electronic device based on the tag information of the second electronic tag, determine a second relative position of the reader and the first electronic tag based on the tag information of the first electronic tag, and determine the relative position of the portable electronic device and the first electronic tag based on the first relative position and the second relative position.
  • According to another aspect, an electronic tag tracking system is provided, including a portable electronic device and a first electronic tag, where the portable electronic device is the portable electronic device described above; the portable electronic device is used to collect the tag information of the first electronic tag and determine the relative position of the portable electronic device and the first electronic tag based on that tag information.
  • According to another aspect, an image display method is provided, including: acquiring the relative position of a portable electronic device and a first electronic tag; fusing the position information corresponding to the relative position with the image data of the real scene acquired by the portable electronic device to obtain a first image; and displaying the first image.
  • In an optional embodiment, acquiring the relative position of the portable electronic device and the first electronic tag includes: sending a first radio frequency signal to the reader corresponding to the first electronic tag; and receiving the relative position determined by the reader based on the first radio frequency signal and a second radio frequency signal, where the second radio frequency signal is the radio frequency signal sent by the first electronic tag to the reader.
  • In an optional embodiment, one or more second electronic tags are provided in the portable electronic device, and sending the first radio frequency signal to the reader corresponding to the first electronic tag includes: sending one or more first radio frequency signals to the reader through the one or more second electronic tags.
  • each of the one or more second electronic tags has a tag identifier, wherein each tag identifier corresponds to an installation position of the second electronic tag on the portable electronic device.
  • In an optional embodiment, the above method further includes: presenting a list of electronic tags to the target object; receiving a selection instruction of the target object; and, in response to the selection instruction, selecting one or more target electronic tags from the electronic tag list and using the selected one or more target electronic tags as the first electronic tags.
  • According to another aspect, an image display device is provided, including: an acquisition module for acquiring the relative position of a portable electronic device and a first electronic tag; a fusion module for fusing the position information corresponding to the relative position with the image data of the real scene acquired by the portable electronic device to obtain a first image; and a first display module for displaying the first image.
  • In an optional embodiment, the acquisition module includes a first transceiver unit configured to send a first radio frequency signal to the reader corresponding to the first electronic tag, and to receive the relative position determined by the reader based on the first radio frequency signal and a second radio frequency signal, where the second radio frequency signal is the radio frequency signal sent by the first electronic tag to the reader.
  • In an optional embodiment, the acquisition module includes a second transceiver unit configured to send a third radio frequency signal to the first electronic tag, receive the reflected signal returned by the first electronic tag after it receives the third radio frequency signal, and determine the relative position based on at least one of the third radio frequency signal and the reflected signal.
  • one or more second electronic tags are provided in the portable electronic device; the first transceiver unit is configured to send one or more first radio frequency signals to the reader through the one or more second electronic tags.
  • each of the one or more second electronic tags has a tag identifier, wherein each tag identifier corresponds to an installation position of the second electronic tag on the portable electronic device.
  • In an optional embodiment, the device further includes: a second display module for displaying a list of electronic tags to the target object; a receiving module for receiving a selection instruction of the target object; and a selection module for selecting, in response to the selection instruction, one or more target electronic tags from the electronic tag list and using the selected one or more target electronic tags as the first electronic tags.
  • According to another aspect, a non-volatile storage medium is provided, which includes a stored program; when the program runs, the device where the storage medium is located is controlled to execute the above image display method.
  • Through the embodiments of the present application, the relative position and the real environment can be combined, so that the presentation of the tracking results of the electronic tag tracking system is more intuitive, which improves the user experience and thus solves the technical problem in the related art that the presentation of the tracking results of the electronic tag tracking system does not consider the user's real environment and therefore affects the user experience.
  • Fig. 1 is a schematic structural diagram of an electronic tag tracking system according to an embodiment of the present application.
  • Fig. 2 is a schematic structural diagram of another electronic tag tracking system according to an embodiment of the present application.
  • Fig. 3 is a schematic flowchart of an image display method according to an embodiment of the present application.
  • Fig. 4(a) is a schematic diagram of the principle of tag location tracking based on an AR device according to an embodiment of the present application.
  • Fig. 4(b) is a schematic diagram of another process of tracking the position of a tag based on an AR device according to an embodiment of the present application.
  • Fig. 5 is a schematic diagram of an AR device integrated with an RFID tag according to an embodiment of the present application.
  • Fig. 6 is a schematic diagram of the operation flow of an electronic tag tracking system according to an embodiment of the present application.
  • Fig. 7 is a schematic diagram of an AR device integrated with an Ultra Wide Band (UWB) tag according to an embodiment of the present application.
  • Fig. 8 is a schematic diagram of a smart terminal integrated with a UWB system according to an embodiment of the present application.
  • Fig. 9 is a schematic structural diagram of a portable electronic device according to an embodiment of the present application.
  • Fig. 10 is a schematic structural diagram of a signal receiving and sending device according to an embodiment of the present application.
  • Fig. 11 is a schematic structural diagram of a portable electronic device according to an embodiment of the present application.
  • Fig. 12 is a schematic structural diagram of a portable electronic device according to an embodiment of the present application.
  • Fig. 13 is a schematic structural diagram of a wireless carrier chip according to an embodiment of the present application.
  • Fig. 14 is a schematic structural diagram of an image display device according to an embodiment of the present application.
  • Fig. 15 is a schematic structural diagram of another image display device according to an embodiment of the present application.
  • AR technology refers to combined imaging technology, that is, technology that uses computer image algorithms to combine virtual content on a screen with the real scene and allow the two to interact.
  • The biggest difference between AR and virtual reality (VR) is that VR presents the user only with computer-simulated images, while AR combines the real environment with the simulated images, so wearing an AR device does not interfere with the user's normal activities.
  • the electronic tag tracking system is an application of wireless communication technology independent of AR technology.
  • An electronic tag tracking system usually includes a reader and the electronic tag being tracked.
  • the reader can be independent, that is, the reader has independent calculation and storage functions.
  • the reader can also be connected to the server to obtain instructions to complete data transmission.
  • Electronic tags usually contain tag processing chips and tag antennas.
  • The structure of an electronic tag also differs depending on the wireless transmission protocol it uses. For example, an electronic tag using passive Radio Frequency Identification (RFID) technology does not require a power supply, so the tag consists only of a tag chip and a tag antenna.
  • An electronic tag tracking system is mainly used for item tracking.
  • the electronic tag is placed on the tracked item, and the user uses the reader to locate the position of the electronic tag to complete the item tracking.
  • the final result will be presented to the user in the form of pictures, lights, or sounds.
  • some electronic tags use Bluetooth technology to communicate with readers.
  • the readers include but are not limited to smart mobile terminals.
  • In some systems, item tracking is achieved by means of sound, and the location of the item is judged from the volume of the sound emitted by the electronic tag.
  • the location of the items can also be displayed on the mobile phone in the form of a map.
  • An RFID reader can locate an electronic tag by reading the Received Signal Strength Indicator (RSSI) of the signal transmitted by the tag and determining the tag's location according to the strength of the RSSI signal.
  • the user needs to hold the RFID reader and change its physical location according to the RSSI feedback instructions.
  • Light visible to the human eye is electromagnetic radiation with a frequency of 405-790 THz, while the electromagnetic spectrum used in wireless communication spans 3 kHz to 300 GHz, which is invisible to the naked eye.
  • The frequency ranges of the wireless technologies commonly used in electronic tags are: Bluetooth (BT), 2.4 GHz; Wireless Fidelity (WIFI), 2.4/5 GHz; RFID, 100 kHz to 2.4 GHz; and UWB, 3 to 10 GHz.
  • If the electronic tag is covered by a paper object, it cannot be seen with the naked eye, but its electromagnetic waves can penetrate the paper product and be captured by the reader.
  • Related technologies use sound in the positioning process, or present on a map the location of the tracked item or the relative position of the item and the reader; these methods cannot combine the tag position with the real physical environment in the most intuitive way.
  • In the embodiments of the present application, the advantage of an AR device in combining the real environment with simulated images is fully utilized to visually present the position of the electronic tag to the user: for example, the graphic object corresponding to the electronic tag's position is superimposed on the shooting interface used to shoot real-world images, so that the user can "see" the tag position in the real world, thereby enhancing the user experience. This is described in detail below in conjunction with specific embodiments.
  • The embodiments of this application provide the two system architectures of Figures 1 and 2, according to whether the electronic tag tracking system contains a reader.
  • The portable electronic device in the electronic tag tracking system of Figure 1 can communicate with the reader and receive the above relative position directly from the reader, while the portable electronic device in the electronic tag tracking system of Figure 2 communicates directly with the first electronic tag; after acquiring the signal of the first electronic tag, its processing module determines the above relative position based on the tag information extracted from the signal. Specifically:
  • Fig. 1 is a schematic structural diagram of an electronic tag tracking system according to an embodiment of the present application. As shown in Fig. 1, the system includes: a portable electronic device 10, a reader 12 and a first electronic tag 14, in which:
  • The portable electronic device 10 is used to collect real-world images and provide a shooting interface for displaying the real world. The portable electronic device 10 includes: an image acquisition module for acquiring image data of the real scene; a signal transceiving module for communicating with the reader or with the first electronic tag; a processing module for obtaining the relative position of the portable electronic device and the first electronic tag based on the signals received by the signal transceiving module, and for fusing the position information corresponding to the relative position with the image data of the real scene to obtain the first image; and a display module for displaying the first image. The reader 12 is used to collect the tag information of the first electronic tag 14 and the tag information of the second electronic tag arranged on the portable electronic device, determine the relative position of the portable electronic device and the first electronic tag based on the tag information of the first electronic tag and the tag information of the second electronic tag, and send the relative position to the portable electronic device.
  • The reader 12 is also used to determine a first relative position of the reader and the portable electronic device based on the tag information of the second electronic tag, determine a second relative position of the reader and the first electronic tag based on the tag information of the first electronic tag, and determine the relative position of the portable electronic device and the first electronic tag based on the first relative position and the second relative position.
  • Fig. 2 is a schematic structural diagram of another electronic tag tracking system according to an embodiment of the present application. As shown in Figure 2, the system includes: a portable electronic device and a first electronic tag.
  • The embodiments of the present application also provide an image display method embodiment. It should be noted that the steps shown in the flowchart of the accompanying drawings can be executed in a computer system, for example as a set of computer-executable instructions, and, although a logical order is shown in the flowchart, in some cases the steps shown or described may be performed in an order different from the one here.
  • FIG. 3 is a schematic flowchart of an image display method according to an embodiment of the present application. As shown in FIG. 3, the method includes the following steps:
  • Step S300': the portable electronic device interacts directly with the first electronic tag, so that the portable electronic device determines the relative position between the portable electronic device and the first electronic tag according to the interaction data. The specific method is described in detail below and is not repeated here.
  • Step S300 and step S300' are two parallel steps, that is, there is no required order between them; both are also optional steps, that is, in practical applications, if the portable electronic device has already acquired the relevant data, steps S300 and S300' need not be performed.
  • Step S302: obtain the relative position of the portable electronic device and the first electronic tag.
  • portable electronic devices include, but are not limited to, AR devices, such as AR glasses or smart mobile terminals with AR functions.
  • Solution one, which corresponds to step S300: the AR device first sends a first radio frequency signal to the reader corresponding to the first electronic tag, and then receives the relative position determined by the reader based on the first radio frequency signal and a second radio frequency signal, where the second radio frequency signal is the radio frequency signal sent by the first electronic tag to the reader.
  • The reader can determine the distance between the AR device and the reader, and the distance between the reader and the first electronic tag, based on the signal strengths of the first radio frequency signal and the second radio frequency signal. For example, the reader can maintain a mapping relationship between signal strength and distance, and determine the above two distances from that mapping.
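The application does not specify the form of the signal-strength-to-distance mapping; a minimal sketch, assuming the common log-distance path-loss model (the 1 m reference RSSI and the path-loss exponent are illustrative values, not from this application):

```python
def rssi_to_distance(rssi_dbm, rssi_at_1m=-40.0, path_loss_exponent=2.0):
    """Log-distance path-loss model:
    RSSI(d) = RSSI(1 m) - 10 * n * log10(d)
    =>  d = 10 ** ((RSSI(1 m) - RSSI(d)) / (10 * n))
    """
    return 10 ** ((rssi_at_1m - rssi_dbm) / (10 * path_loss_exponent))

# At the 1 m reference power the estimated distance is 1 m;
# with n = 2, every 20 dB drop multiplies the distance by 10.
print(rssi_to_distance(-40.0))  # 1.0
print(rssi_to_distance(-60.0))  # 10.0
```

In practice the reader could calibrate the reference power and exponent per environment, or simply store an empirical signal-strength-to-distance lookup table as the text suggests.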
  • Alternatively, the above first radio frequency signal may be an echo signal (that is, a reflection signal) fed back after the reader sends a trigger signal to the electronic tag in the portable electronic device. In this case, the reader may determine the distance between the two based on the transmission time of the trigger signal and the reception time of the feedback signal, for example from the time difference between them and the signal propagation speed.
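The time-difference calculation described above can be sketched as follows; the factor of two reflects the round trip (trigger out, echo back), and the numbers are illustrative:

```python
SPEED_OF_LIGHT = 299_792_458.0  # propagation speed of the RF signal, m/s

def echo_distance(t_transmit, t_receive):
    """Distance from a round-trip echo: the signal travels reader -> tag -> reader,
    so the one-way distance is half the elapsed time times the propagation speed."""
    return SPEED_OF_LIGHT * (t_receive - t_transmit) / 2.0

# A 20 ns round trip corresponds to roughly 3 m of one-way distance.
print(round(echo_distance(0.0, 20e-9), 2))  # 3.0
```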
  • the calculation method of the distance between the reader and the first electronic tag is the same as the calculation method of the distance between the reader and the portable electronic device, and will not be repeated here.
  • In an optional embodiment, one or more second electronic tags are provided in the portable electronic device, so that the portable electronic device can send one or more first radio frequency signals to the reader through the one or more second electronic tags.
  • Each of the one or more second electronic tags has a tag identifier, where each tag identifier corresponds to an installation position of that second electronic tag on the portable electronic device. For example, a tag identifier of 1 indicates that the second electronic tag is located on the left temple of the AR glasses, a tag identifier of 2 indicates that it is located on the right temple, and so on.
  • In this way, the position of each second electronic tag can be determined more quickly, and so can the distance between each second electronic tag and the reader.
  • In addition, a relative position can be determined from each second electronic tag; the final relative position between the portable electronic device and the first electronic tag can then be determined from the multiple relative positions given by the multiple second electronic tags, for example by calculating the average of those relative positions and using the average as the final relative position.
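The averaging step above can be sketched as follows (a hypothetical helper, not from this application); averaging the Cartesian form of each (distance, angle) measurement avoids the wrap-around problem of averaging angles directly:

```python
import math

def average_relative_position(polar_positions):
    """Fuse per-tag relative positions (distance r, angle theta in radians)
    by averaging their Cartesian forms; returns the fused (r, theta)."""
    xs = [r * math.cos(t) for r, t in polar_positions]
    ys = [r * math.sin(t) for r, t in polar_positions]
    x, y = sum(xs) / len(xs), sum(ys) / len(ys)
    return math.hypot(x, y), math.atan2(y, x)

# Three per-tag measurements scattered around roughly (2 m, 45 degrees):
r, theta = average_relative_position([(2.0, 0.78), (2.1, 0.80), (1.9, 0.77)])
print(round(r, 2), round(math.degrees(theta), 1))  # 2.0 44.9
```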
  • the relative positions between different objects involved in the embodiments of the present application include but are not limited to relative coordinate information, and the relative coordinate information may be determined based on the coordinates of different objects in the world coordinate system. It can be understood that the above-mentioned relative coordinate information may also be determined based on the coordinates of different objects in other types of coordinate systems, which is not limited in the embodiment of the present application.
  • Solution two, which corresponds to step S300', is shown in Figure 4(b): the AR device sends a third radio frequency signal to the first electronic tag, receives the reflected signal returned by the first electronic tag after it receives the third radio frequency signal, and determines the relative position based on the third radio frequency signal and the reflected signal.
  • For example, the relative position is determined based on the transmission time of the third radio frequency signal and the reception time of the reflected signal, or based on the signal strength of the reflected signal.
  • Step S304: fuse the position information corresponding to the relative position with the image data of the real scene acquired by the portable electronic device to obtain a first image.
  • The location information includes, but is not limited to, location coordinates, location objects, and the like, and it may take the form of a graphic object for display.
  • The graphic objects include, but are not limited to, bar objects, sphere objects, and cylinder objects.
  • the graphic object can also be an indicator, for example, it can be a number or a letter, and different indicators correspond to different types of tracked devices.
  • Specifically, step S304 may be expressed as, but is not limited to, the following process: superimpose the graphic object corresponding to the relative position onto the image displayed on the shooting interface of the portable electronic device to obtain the target display interface, where the shooting interface is used to display the real-world image to be shot.
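The application does not specify how the relative position is mapped onto the shooting interface; a minimal sketch, assuming an ideal pinhole camera aligned with the device's forward axis (the image width and field of view are illustrative values):

```python
import math

def tag_to_pixel(rel_angle_rad, image_width=1920, horizontal_fov_rad=math.radians(90)):
    """Map the tag's relative angle (0 = straight ahead, positive = to the right)
    to a horizontal pixel column on the shooting interface, assuming an ideal
    pinhole camera centered on the device's forward axis."""
    # focal length in pixels for the given horizontal field of view
    focal_px = (image_width / 2) / math.tan(horizontal_fov_rad / 2)
    x = image_width / 2 + focal_px * math.tan(rel_angle_rad)
    return int(x) if 0 <= x < image_width else None  # None: outside the field of view

print(tag_to_pixel(0.0))               # 960 (image center)
print(tag_to_pixel(math.radians(30)))  # 1514 (right of center)
print(tag_to_pixel(math.radians(60)))  # None (outside the 90-degree FOV)
```

A real AR renderer would also use the device's orientation and the relative distance to place and scale the graphic object, but the angle-to-column mapping is the core of the superposition.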
  • Step S306: display the first image.
  • Solution one is an electronic tag tracking system combining RFID technology with augmented reality; it includes passive RFID electronic tags, a reader, and an AR device.
  • the RFID communication protocol is adopted between the electronic tag and the reader.
  • AR devices and readers use other wireless communication protocols (such as Bluetooth, WIFI, UWB) for data exchange.
  • AR devices do not exchange data in any form with electronic tags.
  • The augmented-reality-based electronic tag tracking system does not need to know the absolute position of the electronic tag; it only needs to obtain the relative position of the electronic tag and the AR device.
  • This technical solution integrates an RFID tag on the AR device to obtain the relative position of the AR device and the tracked RFID tag.
  • Three passive RFID tags are integrated on the AR glasses, placed respectively on the left and right temples of the device and on the beam used to connect the two lens rings.
  • A passive RFID tag is composed of a tag antenna and a tag chip; it is driven by the captured RF energy and therefore requires no internal power supply.
  • The RFID chip contains a memory that can store a unique identification code, the Electronic Product Code (EPC), which is used to track items; examples are the identification codes A01, A02, and A03 of the three tag antennas in Figure 5.
  • The reader internally stores a matching table between EPCs and articles; this matching table, shown as Table 1, is stored in the RFID reader.
  • the operation process of the system is shown in Figure 6.
  • The AR device sends an instruction to the RFID reader, and the RFID reader starts to obtain the information of the RFID tags in the environment.
  • The tag information consists of the tag's Electronic Product Code (EPC), RSSI, phase, and timestamp.
  • The tags read include the RFID tags on the AR device and the tracked tag.
  • The reader calculates the relative position of the tracked tag and the AR device by processing the RSSI and phase information of the RFID tags on the AR device and of the tracked tag, and transmits this information to the AR device. After obtaining this information, the AR device uses image technology to visually present the location of the tracked tag to the user. What the user sees is the superposition of the real environment and the virtually calculated tag position, so the user intuitively knows where the electronic tag is.
  • The RFID reader sends an instruction to obtain the information of the RFID tag, including the EPC, RSSI, and phase.
  • the reader can calculate the distance r between the tag and the reader.
  • Combining the RSSI and phase information can also be used to build a model of the environment, or signal processing methods can be used to improve the accuracy of the distance calculation.
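The distance calculation from phase can be sketched as follows. The application gives no formula, so this sketch assumes the standard backscatter model in which the round-trip phase is φ = (4πfr/c) mod 2π; the wrap count n must be resolved separately (e.g. by multi-frequency measurements), which is one reason RSSI is combined with phase in practice.

```python
import math

C = 3e8  # speed of light, m/s

def distance_from_phase(phase_rad: float, freq_hz: float, n_wraps: int = 0) -> float:
    """
    Estimate the tag-reader distance r from the measured backscatter phase.
    Assumed model: phi = (4*pi*f*r / c) mod 2*pi, hence
    r = (phi + 2*pi*n) * c / (4*pi*f),
    where n is the unknown integer wrap count.
    """
    return (phase_rad + 2 * math.pi * n_wraps) * C / (4 * math.pi * freq_hz)
```

For example, at a UHF RFID carrier of 915 MHz, a tag 3 cm away (well inside the first phase wrap) produces a phase of about 1.15 rad, from which the function recovers the 3 cm distance.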
  • Using the Angle of Arrival (AOA) algorithm,
  • the angle θ from the tag to the reader can be calculated.
  • Averaging repeated RSSI and phase measurements can be used to improve the positioning accuracy.
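The AOA step above can be sketched as follows. This is a hypothetical illustration under the standard two-antenna far-field model (the application does not specify the AOA variant): the phase difference between two reader antennas spaced d apart satisfies Δφ = 2πd·sin(θ)/λ, and the averaging helper reflects the noise-reduction step mentioned in the text.

```python
import math

C = 3e8  # speed of light, m/s

def aoa_angle(delta_phase: float, antenna_spacing_m: float, freq_hz: float) -> float:
    """
    Angle of Arrival from the phase difference between two reader antennas.
    Assumed far-field model: delta_phase = 2*pi*d*sin(theta)/lambda, hence
    theta = asin(delta_phase * lambda / (2*pi*d)).
    """
    wavelength = C / freq_hz
    s = delta_phase * wavelength / (2 * math.pi * antenna_spacing_m)
    return math.asin(max(-1.0, min(1.0, s)))  # clamp against measurement noise

def averaged(values):
    """Average repeated RSSI/phase readings to improve positioning accuracy."""
    return sum(values) / len(values)
```

With half-wavelength antenna spacing, a phase difference of π/2 corresponds to an angle of π/6 (30°).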
  • r_tag is the distance between the tag and the reader;
  • θ_tag is the relative angle between the tag and the reader.
  • Similarly, the relative position P_AR(r_AR, θ_AR) of the AR device with respect to the reader can be calculated, and from P_tag(r_tag, θ_tag) and P_AR the position of the target tag relative to the AR device is obtained.
  • This coordinate P_d is the relative position of the tag with respect to the AR device.
  • the reader feeds this information back to the AR device, and the AR device maps this coordinate to the screen through a visual algorithm.
  • This coordinate coincides with the actual environment observed by the user through the AR device, thereby achieving the effect of "seeing" the RFID tag.
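The step of combining P_tag and P_AR into the tag's position relative to the AR device can be sketched as follows. The application does not give the formula, so this sketch assumes the natural vector-subtraction approach: convert both reader-relative polar coordinates to Cartesian form, subtract, and convert back.

```python
import math

def polar_to_cart(r: float, theta: float) -> tuple[float, float]:
    """Convert a reader-relative polar coordinate to Cartesian form."""
    return (r * math.cos(theta), r * math.sin(theta))

def relative_position(r_tag: float, th_tag: float,
                      r_ar: float, th_ar: float) -> tuple[float, float]:
    """
    Given P_tag(r_tag, theta_tag) and P_AR(r_ar, theta_ar), both relative to
    the reader, return P_d(r_d, theta_d): the tag's position relative to the
    AR device, obtained by vector subtraction.
    """
    tx, ty = polar_to_cart(r_tag, th_tag)
    ax, ay = polar_to_cart(r_ar, th_ar)
    dx, dy = tx - ax, ty - ay
    return math.hypot(dx, dy), math.atan2(dy, dx)
```

The resulting P_d is what the reader feeds back to the AR device, which then maps it onto the screen through the visual algorithm described next.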
  • Solution two is an electronic tag tracking system designed on the basis of UWB technology combined with augmented reality.
  • In this system, the AR device and the UWB tag carry out data transmission directly, without a reader; the AR device contains an ultra-wideband (UWB) antenna, a UWB processor chip, and a microprocessor.
  • The UWB antennas are placed at the front of the AR glasses, at three positions on the left, in the center, and on the right; the UWB processor chip transmits and receives UWB signals; the microprocessor performs the calculation of the positioning algorithm.
  • The AR device communicates with the UWB tag by transmitting a signal through the UWB antenna and receiving the reflected signal, and then calculates the relative position of the UWB electronic tag and the AR device through the microprocessor of the AR device.
  • the AR device then superimposes the position of the UWB electronic tag with the actual environment through the above-mentioned visual processing, and presents it to the user.
  • the aforementioned AR device includes but is not limited to AR glasses.
  • The positioning algorithm in this scheme is similar to that in Scheme 1. The reader can be considered as directly integrated into the AR device, so there is no need to calculate the relative positions of the reader, the tag, and the AR device; the relative position of the UWB tag and the AR device can be obtained directly.
  • the specific calculation process is as follows: the positioning algorithm is divided into two parts, namely distance and angle.
  • The UWB processor chip and microprocessor integrated in the AR device can calculate the distance between the AR device and the electronic tag through Time of Flight (ToF), and then use the AOA method mentioned above to calculate the angle between the electronic tag and the AR device, thereby obtaining the relative position P_d(r_d, θ_d) of the electronic tag with respect to the AR device; the AR device then processes this coordinate with computer vision methods and presents it to the user.
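The ToF distance step can be sketched as follows. The application does not specify the ranging protocol, so this sketch assumes simple two-way ranging: the device timestamps its transmission and the arrival of the tag's response, subtracts the tag's known reply delay, and halves the remaining flight time.

```python
C = 299_792_458.0  # speed of light, m/s

def tof_distance(t_round_s: float, t_reply_s: float = 0.0) -> float:
    """
    Two-way UWB ranging (assumed model): the one-way distance is
    c * (t_round - t_reply) / 2, where t_round is the measured
    round-trip time and t_reply is the tag's known response delay.
    """
    return C * (t_round_s - t_reply_s) / 2.0
```

A 20 ns round trip with no reply delay corresponds to roughly 3 m; combining this distance with the AOA angle yields P_d(r_d, θ_d) as described above.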
  • the second solution is not limited to the head-mounted AR device of FIG. 7, and can also be applied to smart terminals with rear cameras and UWB processing chips.
  • the UWB antenna is integrated into the back shell of the smart terminal.
  • the UWB processing chip is integrated into the circuit board in the smart terminal, and the calculation of the positioning algorithm can use the processor of the smart terminal.
  • Multi-tag tracking mainly depends on the wireless communication standard used by the electronic tag.
  • RFID and UWB both support a one-to-many communication mode, and Bluetooth, Wi-Fi, and ZigBee technologies also support one-to-many communication.
  • The user can input multiple tags that need to be tracked, or select all electronic tags in the real environment.
  • All tags in this scenario will then be presented to the user in visual form.
  • In some embodiments, before the relative position of the portable electronic device and the first electronic tag is acquired, an electronic tag list is displayed to the target object; a selection instruction of the target object is received; and, in response to the selection instruction, one or more target electronic tags are selected from the electronic tag list, the selected one or more target electronic tags being used as the first electronic tags.
  • The electronic tag list can be displayed in the above-mentioned shooting interface, or in a display interface dedicated to displaying the electronic tag list.
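The selection flow above can be sketched as follows. This is a hypothetical illustration: the `select_tags` helper and its empty-choice convention ("no selection means track all tags") are assumptions for the sketch, not behavior stated in the application.

```python
def select_tags(discovered: list[str], chosen_indices: list[int]) -> list[str]:
    """
    Return the target electronic tags chosen from the displayed tag list.
    By assumption here, an empty choice selects all discovered tags,
    matching the 'select all electronic tags' option described above.
    """
    if not chosen_indices:
        return list(discovered)
    return [discovered[i] for i in chosen_indices]
```

The returned tags then serve as the first electronic tags whose relative positions are acquired and rendered in the shooting interface.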
  • The relative position of the portable electronic device and the first electronic tag is added to the shooting interface of the portable device in the form of a graphic object, so that the presentation of the tracking result of the electronic tag system is
  • more intuitive. This improves the user experience and thus solves the technical problem in the related art that the presentation of the tracking result of the electronic tag tracking system does not take into account the real environment of the user, which affects the user experience.
  • Fig. 9 is a schematic structural diagram of a portable electronic device according to an embodiment of the present application. As shown in Fig. 9, the device includes:
  • the image acquisition module 902 is used to acquire image data of a real scene, for example, to acquire an image of the real world, and provide a shooting interface for displaying the real world;
  • the signal transceiving module 904 is used to communicate with the reader of the first electronic tag or the first electronic tag;
  • the first electronic tag is an electronic tag set on the target tracking device.
  • There are two ways to form the signal transceiving module:
  • In the first way, the signal transceiving module 904 includes: one or more second electronic tags 1002 for sending a first radio frequency signal to the reader, where each of the one or more second electronic tags has a tag identification, and each tag identification corresponds to the installation position of one second electronic tag in the portable electronic device; and a wireless communication circuit 1004 for receiving the relative position determined by the reader based on the first radio frequency signal and a second radio frequency signal and sending the relative position to the processing module, where the second radio frequency signal is the radio frequency signal sent by the first electronic tag to the reader.
  • The wireless communication circuit includes, but is not limited to, a UWB communication circuit, for example one that includes a UWB antenna.
  • the relative position includes at least one of the following: a relative angle and a relative distance between the first electronic tag and the portable electronic device.
  • The above-mentioned portable electronic device includes AR glasses 11, with at least three second electronic tags. At least one of the at least three second electronic tags is set in the nose bridge structure of the augmented reality glasses, and the remaining second electronic tags are set in the two temple structures. For example, in some optional embodiments with three second electronic tags, as shown in FIG. 11, the second electronic tag 1104 is arranged in the nose bridge structure of the AR glasses, and the second electronic tags 1102 and 1106 are arranged in the two temple structures of the AR glasses (that is, they are dispersedly arranged over the two temple structures). This scheme benefits from the dispersed installation positions of the three electronic tags.
  • If the length of the electronic tag is greater than the length of the nose bridge structure, a part of the electronic tag is set in the nose bridge structure and the rest is set on top of the lens ring structure.
  • That is, the electronic tag is centered on the nose bridge structure and extends over the nose bridge structure and the top sides of the lens ring structures connected to it.
  • the portable electronic device is a smart mobile terminal 12.
  • the wireless carrier chip 13 includes: a microprocessor 1302 and at least one wireless carrier antenna 1304. At least one wireless carrier antenna 1304 is provided in the rear case 1202 of the smart mobile terminal.
  • the microprocessor 1302 is arranged in the main circuit board (not shown in the figure) of the smart mobile terminal, and the main circuit board is a circuit board for realizing the communication function and other functions of the smart mobile terminal.
  • In the second way, the signal transceiver module 904 includes, but is not limited to, a wireless carrier chip for sending a third radio frequency signal to the first electronic tag and receiving the reflected signal returned by the first electronic tag after it receives the third radio frequency signal.
  • the processing module 906 is configured to obtain the relative position of the portable electronic device and the first electronic tag based on the signal received by the signal transceiving module; fuse the position information corresponding to the relative position with the image data of the real scene to obtain the first image.
  • the processing module 906 includes but is not limited to a processor provided in a portable electronic device.
  • processing module 906 is also used to determine the relative position based on the third radio frequency signal and the reflected signal.
  • the display module 908 is used to display the first image.
  • the display module 908 includes but is not limited to a display screen.
  • the display screen may be a module independent of the image acquisition module 902, or may be a module integrated in the image acquisition module 902.
  • Fig. 14 is a schematic structural diagram of an image display device according to an embodiment of the present application.
  • the device is used to implement the method shown in FIG. 3.
  • the device includes:
  • the obtaining module 1402 is used to obtain the relative position of the portable electronic device and the first electronic tag;
  • In some embodiments, the acquisition module 1402 includes: a first transceiver unit 150, configured to send a first radio frequency signal to a reader corresponding to the first electronic tag and to receive the relative position determined by the reader based on the first radio frequency signal and a second radio frequency signal, where the second radio frequency signal is the radio frequency signal sent by the first electronic tag to the reader; and a second transceiver unit 152, configured to send a third radio frequency signal to the first electronic tag, receive the reflected signal returned by the first electronic tag after receiving the third radio frequency signal, and determine the relative position based on at least one of the third radio frequency signal and the reflected signal.
  • The portable electronic device is provided with one or more second electronic tags; the first transceiver unit is configured to send one or more first radio frequency signals to the reader through the one or more second electronic tags.
  • each second electronic tag of the one or more second electronic tags has a tag identification, wherein each tag identification corresponds to an installation position of the one second electronic tag in the portable electronic device.
  • The fusion module 1404 is configured to fuse the position information corresponding to the relative position with the image data of the real scene to obtain a first image.
  • the first display module 1406 is used to display the first image.
  • the above-mentioned device further includes the following modules: a second display module 156, used to display a list of electronic tags to a target object; a receiving module 158, used to receive a selection instruction of the target object; a selection module 160, used to In response to the selection instruction, one or more target electronic tags are selected from the electronic tag list, and the selected one or more target electronic tags are used as the first electronic tags.
  • The embodiment of the present application also provides a non-volatile storage medium including a stored program, where, when the program runs, the device in which the non-volatile storage medium is located is controlled to execute the image display method described above.
  • When the program runs, the following program instructions are executed: obtain the relative position of the portable electronic device and the first electronic tag; superimpose the graphic object corresponding to the relative position on the shooting interface of the portable electronic device to obtain the target display interface, where the shooting interface is used to display the real-world image to be shot; and display the target display interface.
  • the disclosed technical content can be implemented in other ways.
  • the device embodiments described above are only illustrative.
  • The division of the units may be a logical function division, and there may be other divisions in actual implementation; for example, multiple units or components may be combined or integrated into another system, or some features may be ignored or not implemented.
  • the displayed or discussed mutual coupling or direct coupling or communication connection may be indirect coupling or communication connection through some interfaces, units or modules, and may be in electrical or other forms.
  • The units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units; that is, they may be located in one place or distributed over multiple units. Some or all of the units may be selected according to actual needs to achieve the objectives of the solutions of the embodiments.
  • the functional units in the various embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units may be integrated into one unit.
  • the above-mentioned integrated unit can be implemented in the form of hardware or software functional unit.
  • If the integrated unit is implemented in the form of a software functional unit and sold or used as an independent product, it can be stored in a computer-readable storage medium.
  • Based on this understanding, the technical solution of this application in essence, or the part that contributes to the prior art, or all or part of the technical solution, can be embodied in the form of a software product, and the computer software product is stored in a storage medium.
  • The software product can be executed by a computer device, which may be a personal computer, a server, a network device, etc.
  • The aforementioned storage media include: USB flash drives, read-only memory (ROM), random access memory (RAM), removable hard disks, magnetic disks, optical discs, and other media that can store program code.

Abstract

Disclosed are an image display method and apparatus, and a portable electronic device. The portable electronic device comprises: an image acquisition module for acquiring image data of a real scene; a signal transceiving module for communicating with a reader of a first electronic tag or with the first electronic tag; a processing module for acquiring the relative position of the portable electronic device and the first electronic tag on the basis of a signal received by the signal transceiving module, and fusing position information corresponding to the relative position with the image data of the real scene so as to obtain a first image; and a display module for displaying the first image. The present invention solves the technical problem in the related art that the user experience is affected because the presentation of a tracking result of an electronic tag tracking system does not take into consideration the real environment in which the user is located.
PCT/CN2020/080518 2020-03-20 2020-03-20 Procédé et appareil d'affichage d'image, et dispositif électronique portable WO2021184388A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2020/080518 WO2021184388A1 (fr) 2020-03-20 2020-03-20 Procédé et appareil d'affichage d'image, et dispositif électronique portable
PCT/CN2020/083372 WO2021184442A1 (fr) 2020-03-20 2020-04-03 Procédé et appareil d'affichage d'image et dispositif électronique portatif
CN202080087336.XA CN114830067A (zh) 2020-03-20 2020-04-03 图像展示方法及装置、便携式电子设备

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2020/080518 WO2021184388A1 (fr) 2020-03-20 2020-03-20 Procédé et appareil d'affichage d'image, et dispositif électronique portable

Publications (1)

Publication Number Publication Date
WO2021184388A1 true WO2021184388A1 (fr) 2021-09-23

Family

ID=77767925

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/CN2020/080518 WO2021184388A1 (fr) 2020-03-20 2020-03-20 Procédé et appareil d'affichage d'image, et dispositif électronique portable
PCT/CN2020/083372 WO2021184442A1 (fr) 2020-03-20 2020-04-03 Procédé et appareil d'affichage d'image et dispositif électronique portatif

Family Applications After (1)

Application Number Title Priority Date Filing Date
PCT/CN2020/083372 WO2021184442A1 (fr) 2020-03-20 2020-04-03 Procédé et appareil d'affichage d'image et dispositif électronique portatif

Country Status (2)

Country Link
CN (1) CN114830067A (fr)
WO (2) WO2021184388A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103150531A (zh) * 2013-03-10 2013-06-12 周良文 基于rfid电子标签的私人物品监控系统
CN105574946A (zh) * 2015-12-21 2016-05-11 天津中兴智联科技有限公司 一种手持式etc读写器及其采用的rfid与ar的融合方法
CN106575347A (zh) * 2014-06-10 2017-04-19 标记与寻找无线解决方案有限公司 使用行动装置来定位物品的射频识别读取器及天线系统
CN108573293A (zh) * 2018-04-11 2018-09-25 广东工业大学 一种基于增强现实技术的无人超市购物协助方法及系统
CN108881809A (zh) * 2017-05-09 2018-11-23 杭州海康威视数字技术股份有限公司 视频监控方法、装置及系统
WO2019168780A1 (fr) * 2018-02-27 2019-09-06 Thin Film Electronics Asa Système et procédé permettant de fournir une expérience de réalité augmentée à des objets à l'aide d'étiquettes sans fil
US20190354735A1 (en) * 2018-05-17 2019-11-21 Motorola Mobility Llc Method to correlate an object with a localized tag

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR101781849B1 (ko) * 2010-12-06 2017-09-26 엘지전자 주식회사 이동 단말기 및 그 제어방법
CN103018923A (zh) * 2013-01-15 2013-04-03 南京恒知讯科技有限公司 智能眼镜装置、rfid系统及其通信方法
EP3218735A4 (fr) * 2014-11-13 2018-06-27 Nokia Technologies Oy Calcul de position à l'aide d'une énergie basse bluetooth
CN104537401B (zh) * 2014-12-19 2017-05-17 南京大学 基于射频识别和景深感知技术的现实增强系统及工作方法
CN207306182U (zh) * 2017-04-26 2018-05-04 左志权 一种基于wifi传输的智能消防头盔装置
CN107095384B (zh) * 2017-04-26 2023-11-24 左志权 一种基于wifi传输的智能消防头盔装置
JP6402873B2 (ja) * 2017-06-02 2018-10-10 株式会社ダイフク 仕分け設備
CN107633276A (zh) * 2017-08-18 2018-01-26 石家庄学院 物品防盗定位方法及装置
CN109615703B (zh) * 2018-09-28 2020-04-14 阿里巴巴集团控股有限公司 增强现实的图像展示方法、装置及设备
CN110543700A (zh) * 2019-08-15 2019-12-06 刘德建 使用ar技术将复杂精装bim工艺模型与施工现场结合的方法
CN110716645A (zh) * 2019-10-15 2020-01-21 北京市商汤科技开发有限公司 一种增强现实数据呈现方法、装置、电子设备及存储介质

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103150531A (zh) * 2013-03-10 2013-06-12 周良文 基于rfid电子标签的私人物品监控系统
CN106575347A (zh) * 2014-06-10 2017-04-19 标记与寻找无线解决方案有限公司 使用行动装置来定位物品的射频识别读取器及天线系统
CN105574946A (zh) * 2015-12-21 2016-05-11 天津中兴智联科技有限公司 一种手持式etc读写器及其采用的rfid与ar的融合方法
CN108881809A (zh) * 2017-05-09 2018-11-23 杭州海康威视数字技术股份有限公司 视频监控方法、装置及系统
WO2019168780A1 (fr) * 2018-02-27 2019-09-06 Thin Film Electronics Asa Système et procédé permettant de fournir une expérience de réalité augmentée à des objets à l'aide d'étiquettes sans fil
CN108573293A (zh) * 2018-04-11 2018-09-25 广东工业大学 一种基于增强现实技术的无人超市购物协助方法及系统
US20190354735A1 (en) * 2018-05-17 2019-11-21 Motorola Mobility Llc Method to correlate an object with a localized tag

Also Published As

Publication number Publication date
WO2021184442A1 (fr) 2021-09-23
CN114830067A (zh) 2022-07-29

Similar Documents

Publication Publication Date Title
US11416719B2 (en) Localization method and helmet and computer readable storage medium using the same
CN107179524B (zh) 消防设备定位方法、装置、系统及计算机可读存储介质
US20220405733A1 (en) Payment Method and Electronic Device
WO2017027338A1 (fr) Appareil et procédé pour prendre en charge des fonctionnalités interactives de réalité augmentée
CN104936283A (zh) 室内定位方法、服务器和系统
RU2656576C2 (ru) Устройство, система и способ идентификации объекта в изображении и транспондер
TR201815821T4 (tr) Bir cihazın kontrolu için yöntem.
US10922042B2 (en) System for sharing virtual content and method for displaying virtual content
CN107219517A (zh) 基于led可见光通信的手机安卓相机定位系统及其方法
CN109164410A (zh) Rfid定位及追踪方法、系统及计算机可读存储介质
GB2510226A (en) Indoor tracking of mobile phones using emitted markers
US20150120461A1 (en) Information processing system
WO2021184388A1 (fr) Procédé et appareil d'affichage d'image, et dispositif électronique portable
WO2022183906A1 (fr) Procédé et appareil d'imagerie, dispositif, et support de stockage
JP2006308493A (ja) 携帯端末装置、応答通信装置並びに被探索対象表示システム及び方法
US11238658B2 (en) AR space image projecting system, AR space image projecting method, and user terminal
Strecker et al. MR Object Identification and Interaction: Fusing Object Situation Information from Heterogeneous Sources
KR101729923B1 (ko) 스크린 영상과 증강현실 영상을 이용한 입체영상 구현 방법 및 이를 실행하는 서버 및 시스템
Jian et al. Hybrid cloud computing for user location-aware augmented reality construction
US10885617B2 (en) Image analysis method and image analysis system for server
CN101299172B (zh) 动态鼠标指针再生系统及其再生方法
CN112418377A (zh) 电子价签及其工作方法
WO2024092588A1 (fr) Procédé et système de positionnement de dispositif à l'aide d'une direction de signal
CN103401990B (zh) 利用手机传感器和网络进行物体标记及提取的方法和系统
CN116148768A (zh) 一种定位方法、装置、系统、头戴显示设备及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20925031

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 20925031

Country of ref document: EP

Kind code of ref document: A1