CN114545629A - Augmented reality device, information display method and device - Google Patents

Augmented reality device, information display method and device

Info

Publication number
CN114545629A
CN114545629A
Authority
CN
China
Prior art keywords
augmented reality
information
coordinate system
glasses
sensor module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202210072682.1A
Other languages
Chinese (zh)
Inventor
胡永涛 (Hu Yongtao)
戴景文 (Dai Jingwen)
贺杰 (He Jie)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Virtual Reality Technology Co Ltd
Original Assignee
Guangdong Virtual Reality Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Virtual Reality Technology Co Ltd filed Critical Guangdong Virtual Reality Technology Co Ltd
Priority to CN202210072682.1A priority Critical patent/CN114545629A/en
Publication of CN114545629A publication Critical patent/CN114545629A/en
Pending legal-status Critical Current

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01DMEASURING NOT SPECIALLY ADAPTED FOR A SPECIFIC VARIABLE; ARRANGEMENTS FOR MEASURING TWO OR MORE VARIABLES NOT COVERED IN A SINGLE OTHER SUBCLASS; TARIFF METERING APPARATUS; MEASURING OR TESTING NOT OTHERWISE PROVIDED FOR
    • G01D18/00Testing or calibrating apparatus or arrangements provided for in groups G01D1/00 - G01D15/00
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T19/00Manipulating 3D models or images for computer graphics
    • G06T19/006Mixed reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/70Determining position or orientation of objects or cameras
    • G06T7/73Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N13/00Stereoscopic video systems; Multi-view video systems; Details thereof
    • H04N13/30Image reproducers
    • H04N13/332Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • G02B2027/0178Eyeglass type
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00Indexing scheme for image analysis or image enhancement
    • G06T2207/30Subject of image; Context of image processing
    • G06T2207/30204Marker

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Software Systems (AREA)
  • Computer Hardware Design (AREA)
  • Computer Graphics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The application discloses an augmented reality device, an information display method, and an information display device. The augmented reality device includes augmented reality glasses and a sensor module communicatively connected to the augmented reality glasses, the sensor module and the augmented reality glasses being independent of each other. The augmented reality glasses include a glasses body, a marker, a communication module, and a controller. The marker is disposed on the glasses body and exposed through the outer surface of the glasses body. The communication module is disposed on the glasses body and is used to establish a communication connection with the sensor module. The controller is disposed on the glasses body and is electrically connected to the communication module. The sensor module is configured to acquire a target image containing the marker. The controller is configured to calibrate, according to the target image, a first coordinate system in which the sensor module is located against a second coordinate system in which the glasses body is located, so as to obtain a calibration result. Because the sensor module is separate from the glasses, the augmented reality glasses are smaller and lighter, and are suitable for long-term wear by the user.

Description

Augmented reality device, information display method and device
Technical Field
The present application relates to the field of augmented reality technologies, and in particular, to an augmented reality apparatus, an information display method, and an information display apparatus.
Background
In recent years, with the progress of science and technology, Augmented Reality (AR) technology has gradually become a research hot spot both at home and abroad. Through an interactive device (e.g., augmented reality glasses), virtual information is displayed over the real world, so that the user can obtain richer information from the real world.
In existing augmented reality glasses, a sensor (e.g., a camera) is usually disposed on the outer surface of the glasses to acquire detection information about the user's surroundings (for a camera, the detection information is a scene image of the user's location); a controller inside the glasses processes the detection information and displays the result on the glasses. For example, when the user walks to the doorway of a restaurant, the detection information acquired through the camera is an image of the restaurant, and the processed information displayed on the glasses may include the restaurant's recommended menu, discounts, and average price.
However, with such augmented reality glasses, if a user wants to acquire different types of information, a plurality of sensors corresponding to those types must be provided. For example, if the user wants to acquire distance information between himself and the surrounding environment (for example, a building), the augmented reality glasses need to be provided with a ranging sensor (for example, a rangefinder). As a result, augmented reality glasses fitted with a plurality of sensors become too large and too heavy; wearing them for a long time becomes uncomfortable, which degrades the user experience.
Disclosure of Invention
The embodiment of the application provides an augmented reality device, an information display method and an information display device.
In a first aspect, some embodiments of the present application provide an augmented reality apparatus. The augmented reality apparatus includes augmented reality glasses and a sensor module communicatively connected to the augmented reality glasses, the sensor module and the augmented reality glasses being independent of each other. The augmented reality glasses include a glasses body, a marker, a communication module, and a controller. The marker is disposed on the glasses body and exposed through the outer surface of the glasses body. The communication module is disposed on the glasses body and is used to establish a communication connection with the sensor module. The controller is disposed on the glasses body and is electrically connected to the communication module. The sensor module is configured to acquire a target image containing the marker. The controller is configured to calibrate, according to the target image, a first coordinate system in which the sensor module is located against a second coordinate system in which the glasses body is located, so as to obtain a calibration result.
In a second aspect, some embodiments of the present application further provide an information display method applied to augmented reality glasses. The augmented reality glasses are communicatively connected to a sensor module, and the two are independent of each other. The augmented reality glasses include a glasses body and a marker disposed on the glasses body. The method includes: acquiring a target image, containing the marker, captured by the sensor module; obtaining a calibration result based on the target image, the calibration result representing a mapping relationship between a first coordinate system in which the sensor module is located and a second coordinate system in which the glasses body is located; acquiring detection information collected by the sensor module, the detection information being information in the first coordinate system; and determining, based on the calibration result, display information corresponding to the detection information, the display information being information in the second coordinate system to be displayed by the augmented reality glasses.
In a third aspect, some embodiments of the present application further provide an information display method applied to an augmented reality apparatus. The augmented reality apparatus includes augmented reality glasses and a sensor module communicatively connected to the augmented reality glasses, the sensor module and the augmented reality glasses being independent of each other. The augmented reality glasses include a glasses body and a marker disposed on the glasses body. The method includes: the sensor module acquires a target image containing the marker; the augmented reality glasses obtain a calibration result based on the target image, the calibration result representing a mapping relationship between a first coordinate system in which the sensor module is located and a second coordinate system in which the glasses body is located; the sensor module collects detection information, the detection information being information in the first coordinate system; and the augmented reality glasses determine, based on the calibration result, display information corresponding to the detection information, the display information being information in the second coordinate system to be displayed by the augmented reality glasses.
In a fourth aspect, some embodiments of the present application further provide an information display device applied to augmented reality glasses. The augmented reality glasses are communicatively connected to a sensor module, and the two are independent of each other. The augmented reality glasses include a glasses body and a marker disposed on the glasses body. The device includes a target image acquisition module, a calibration result acquisition module, a detection information acquisition module, and a display information determination module. The target image acquisition module is used to acquire a target image, containing the marker, captured by the sensor module. The calibration result acquisition module is used to obtain a calibration result based on the target image, the calibration result representing a mapping relationship between a first coordinate system in which the sensor module is located and a second coordinate system in which the glasses body is located. The detection information acquisition module is used to acquire detection information collected by the sensor module, the detection information being information in the first coordinate system. The display information determination module is used to determine, based on the calibration result, display information corresponding to the detection information, the display information being information in the second coordinate system to be displayed by the augmented reality glasses.
In a fifth aspect, embodiments of the present application further provide a computer-readable storage medium storing computer program instructions. The computer program instructions can be invoked by a processor to execute the information display method described above.
In a sixth aspect, an embodiment of the present application further provides a computer program product, where the computer program product, when executed, implements the information display method described above.
The present application provides an augmented reality apparatus, an information display method, and an information display device. In the augmented reality apparatus provided by the present application, the augmented reality glasses and the sensor module are independent of, i.e., physically separated from, each other, so that under different usage demands the user can connect the augmented reality glasses to sensor modules with different functions in order to obtain different types of detection information. The augmented reality glasses therefore offer richer functionality while remaining smaller and lighter, and are suitable for long-term wear. In addition, a marker is disposed on the glasses body of the augmented reality glasses, and the sensor module can perform position calibration with the glasses body by recognizing this marker, ensuring that the display information subsequently shown on the information display portion of the augmented reality glasses coincides with the corresponding elements of the real world seen by the user, which improves the user experience.
Drawings
In order to illustrate the technical solutions in the embodiments of the present application more clearly, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below are only some embodiments of the present application; those skilled in the art can derive other drawings from them without creative effort.
Fig. 1 shows a schematic diagram of an augmented reality apparatus provided in an embodiment of the present application.
Fig. 2 shows an application scenario diagram of an augmented reality device according to an embodiment of the present application.
Fig. 3 is a schematic diagram illustrating an overlay display of uncorrected detection information and annotation information and a view field image according to an embodiment of the present application.
Fig. 4 shows a schematic diagram of an overlay display of corrected detection information and annotation information and a view field image according to an embodiment of the present application.
Fig. 5 shows a schematic diagram of a marker provided by an embodiment of the present application.
Fig. 6 is a flowchart illustrating an information display method according to a first embodiment of the present application.
Fig. 7 is a flowchart illustrating an information display method according to a second embodiment of the present application.
Fig. 8 is a schematic flowchart illustrating an information display method according to a third embodiment of the present application.
Fig. 9 shows a block diagram of an information display device according to an embodiment of the present application.
Fig. 10 shows a block diagram of modules of a computer-readable storage medium provided by an embodiment of the present application.
Detailed Description
Reference will now be made in detail to embodiments of the present application, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are exemplary only for the purpose of explaining the present application and are not to be construed as limiting the present application.
In order to make the technical solutions of the present application better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings in the embodiments of the present application. It is to be understood that the embodiments described are only a few embodiments of the present application and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present application.
The present application provides an augmented reality apparatus, an information display method, and an information display device. In the augmented reality apparatus provided by the present application, the augmented reality glasses and the sensor module are independent of, i.e., physically separated from, each other, so that under different usage demands the user can connect the augmented reality glasses to sensor modules with different functions in order to obtain different types of detection information. The augmented reality glasses therefore offer richer functionality while remaining smaller and lighter, and are suitable for long-term wear. In addition, a marker is disposed on the glasses body of the augmented reality glasses, and the sensor module can perform position calibration with the glasses body by recognizing this marker, ensuring that the display information subsequently shown on the information display portion of the augmented reality glasses coincides with the corresponding elements of the real world seen by the user, which improves the user experience.
Referring to fig. 1, fig. 1 schematically illustrates an augmented reality apparatus according to an embodiment of the present application. The augmented reality device 10 includes augmented reality glasses 20 and a sensor module 30 communicatively connected to the augmented reality glasses 20, wherein the sensor module 30 and the augmented reality glasses 20 are independent of each other.
In the embodiment of the present application, the augmented reality glasses 20 include a glasses body 100, a marker 200, a communication module 300, and a controller 400. The marker 200 is disposed on the glasses body 100 and exposed through the outer surface of the glasses body 100. The communication module 300 is disposed on the glasses body 100 and is used for communicating with the sensor module 30. The controller 400 is disposed on the glasses body 100 and electrically connected to the communication module 300.
In the embodiment of the present application, the sensor module 30 is configured to acquire a target image containing the marker 200. The sensor module 30 may include one sensor or a plurality of sensors. In some embodiments, the sensor module 30 includes at least one of the following sensors: an image sensor, an infrared thermal imager, a ranging sensor, a night vision device, and a spectrometer.
In the embodiment of the present application, the sensor module 30 includes an image acquisition part (not shown in the figure) and a detection information acquisition part.
The image acquisition component is used to acquire an image of the target containing the marker 200. Illustratively, the image acquisition component may be an image sensor (e.g., a camera).
The detection information acquisition component is used to detect real-world scenes and acquire the corresponding detection information. For example, if the sensor is an infrared thermal imager, the corresponding detection information is an infrared image of the real-world scene acquired by the thermal imager. If the sensor is a ranging sensor, the corresponding detection information is the distance between the ranging sensor and a target object in the real world. If the sensor is a night vision device, the corresponding detection information is a night vision image of the real-world scene. If the sensor is a spectrometer, the corresponding detection information is the spectral information of the ambient light in the real-world scene. The embodiments of the present application do not specifically limit the type of sensor.
In other embodiments, the image capturing component and the detection information capturing component may be implemented by the same hardware, and in the case of using a sensor as an image sensor, an image captured by the image sensor may include both a marker and a real-world scene.
It should be noted that, in the augmented reality apparatus 10 provided by the present application, the sensor module 30 and the augmented reality glasses 20 are independent of each other; that is, there is no physical connection between the sensor module 30 and the augmented reality glasses 20. For example, the sensor module 30 and the augmented reality glasses 20 communicate wirelessly rather than over a wired connection established through a data line and a signal interface; as another example, there is no mating hardware (e.g., a fixing slot, a clamping mechanism, etc.) between the sensor module 30 and the augmented reality glasses 20 by which the sensor module 30 could be fixed onto the augmented reality glasses 20. In the embodiment of the present application, because the sensor module 30 and the augmented reality glasses 20 are independent of each other, the augmented reality glasses are smaller and lighter, and are suitable for long-term wear.
In some embodiments, the sensor module 30 further comprises a mounting part (not shown) adapted to be mounted on a movable platform. The movable platform can be a movable carrier such as an unmanned vehicle or an unmanned aerial vehicle, and is provided with a fixing part that mates with the mounting part on the sensor module 30, so that the sensor module 30 can be fixed to the movable platform through the fixing part. The mounting part and the fixing part may be an attachment mechanism, a screw coupling mechanism, or the like. In the embodiment of the present application, with the sensor module 30 disposed on the movable platform, the user can collect detection information from blind spots in the user's view by controlling the movable platform. For example, when the user wears the augmented reality glasses 20 in the field and the user's view is blocked by a shrub, the user may steer the movable platform around the shrub so that the sensor module 30 can collect the detection information behind it. It should be noted that a certain amount of scene overlap must be ensured between the detection information collected by the sensor module 30 and the real world seen by the user through the augmented reality glasses 20, so that the display information corresponding to the detection information can coincide with the corresponding elements of the real world seen by the user.
In some embodiments, the sensor module 30 may also be attached to the user's body through the mounting part. For example, the sensor module 30 may be fixed to the top of the user's head by a mounting member (e.g., a strap) or to the user's ear (e.g., by an ear hook). It should be noted that when the sensor module 30 is disposed at a given position on the body, it must be ensured that the image acquisition component can capture the target image containing the marker 200 from that position, so that the sensor module 30 can subsequently complete the coordinate system calibration with the augmented reality glasses 20. In the embodiment of the present application, mounting the sensor module 30 somewhere other than the glasses body 100 makes the augmented reality glasses 20 more portable and keeps them comfortable to wear.
In the present embodiment, the controller 400 is configured to: and calibrating the first coordinate system of the sensor module 30 and the second coordinate system of the glasses body 100 according to the target image to obtain a calibration result.
Referring to fig. 2, fig. 2 illustrates an application scenario of the augmented reality glasses according to an embodiment of the present application. In fig. 2, the sensor module 30 is an image acquisition component, and the user can view a visual field image 610 corresponding to a person 600 in the real world through the glasses body 100 of the augmented reality glasses. Meanwhile, the sensor module 30 can acquire detection information 620 corresponding to the person 600, and the controller can obtain additional annotation information 622 corresponding to the detection information 620; in the application scenario shown in fig. 2, the additional annotation information 622 is the type and gender of the person. Since the sensor module 30 and the glasses body 100 are located at different positions, if the augmented reality glasses 20 were to display the detection information 620 and the additional annotation information 622 on the glasses body 100 directly, without correcting their display positions, the detection information 620 and additional annotation information 622 for a given object (for example, child 1) would be offset from the position of that same object in the visual field image 610 viewed by the user through the glasses body 100. Referring to fig. 3, fig. 3 illustrates the uncorrected detection information and annotation information superimposed on the visual field image according to an embodiment of the present application.
On the one hand, this offset is caused by the sensor module 30 and the glasses body 100 being at different spatial positions; on the other hand, it is caused by the detection angle of the sensor module 30 being tilted relative to the angle at which the user wears the augmented reality glasses 20. Therefore, when the detection information is displayed, the positional deviation between the detection information and additional annotation information of an object and the position of that object in the visual field image must be corrected. In the embodiment of the present application, the controller 400 determines a mapping relationship between the first coordinate system in which the sensor module 30 is located and the second coordinate system in which the glasses body 100 is located, that is, a calibration result, from the target image containing the marker 200. In subsequent steps, the controller 400 can accurately display the detection information on the augmented reality glasses 20 based on this calibration result, that is, accurately superimpose the detection information and additional annotation information of an object onto that same object in the visual field image. Referring to fig. 4, fig. 4 illustrates the corrected detection information and annotation information superimposed on the visual field image according to an embodiment of the present application. Specific embodiments in which the controller 400 obtains the calibration result based on the target image are set forth in detail in the examples below.
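As an illustrative sketch only (the patent does not name a particular algorithm, and all function and variable names below are hypothetical), one common way to obtain such a calibration result is to fit a rigid transform between matched marker feature points expressed in the two coordinate systems, e.g. via a Kabsch/SVD alignment:

```python
import numpy as np

def calibrate_frames(marker_pts_glasses, marker_pts_sensor):
    """Estimate the rotation R and translation t mapping points in the
    glasses (second) coordinate system to the sensor (first) coordinate
    system, from matched 3D marker feature points (Kabsch/SVD fit)."""
    a = np.asarray(marker_pts_glasses, dtype=float)  # N x 3, glasses frame
    b = np.asarray(marker_pts_sensor, dtype=float)   # N x 3, sensor frame
    ca, cb = a.mean(axis=0), b.mean(axis=0)          # centroids
    H = (a - ca).T @ (b - cb)                        # cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = cb - R @ ca
    return R, t
```

With at least three non-collinear matched points, the returned pair satisfies p_sensor ≈ R · p_glasses + t, which is one concrete form the mapping relationship between the two coordinate systems could take.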
Referring again to fig. 1, the glasses body 100 includes a frame part 110 and a lens part 120, and the lens part 120 is detachably fixed to the frame part 110. In some embodiments, the frame part 110 includes a receiving cavity (not shown) in which the communication module 300 and the controller 400 are disposed; the receiving cavity protects the communication module 300 and the controller 400, preventing them from being displaced or damaged by external impact and prolonging the service life of the augmented reality glasses 20.
Referring again to fig. 1, in some embodiments, the glasses body 100 further includes an information display portion 130, and the information display portion 130 is configured to display the display information output by the controller 400. The sensor module 30 is further configured to acquire detection information, the detection information being information in the first coordinate system. The controller 400 is further configured to determine, according to the calibration result, display information corresponding to the detection information, the display information being information in the second coordinate system. Finally, the display information is displayed on the information display portion 130. In the embodiment of the present application, the controller 400 converts the detection information in the first coordinate system into display information in the second coordinate system based on the calibration result, so that the display information and the visual field image of the real world seen by the user through the augmented reality glasses 20 are expressed in the same coordinate system. This avoids any positional deviation between the information displayed for an object in the detection information and that same object in the visual field image, and thus ensures a good user experience. Specific embodiments in which the controller 400 determines the display information based on the calibration result are set forth in detail in the examples below.
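To illustrate the conversion step, here is a minimal sketch (not the patent's own implementation; the names are hypothetical) of mapping detection-information points from the first (sensor) coordinate system into the second (glasses) coordinate system using the calibration result packed as a 4×4 homogeneous transform:

```python
import numpy as np

def to_homogeneous(R, t):
    """Pack a 3x3 rotation matrix and 3-vector translation into a 4x4
    homogeneous transform."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def sensor_to_glasses(points_sensor, T_sensor_from_glasses):
    """Map detection-information points from the sensor's (first)
    coordinate system into the glasses' (second) coordinate system by
    inverting the calibration transform and applying it to the points."""
    T = np.linalg.inv(T_sensor_from_glasses)         # glasses <- sensor
    pts = np.asarray(points_sensor, dtype=float)
    homog = np.hstack([pts, np.ones((len(pts), 1))])  # to homogeneous coords
    return (homog @ T.T)[:, :3]
```

Points converted this way share the glasses body's coordinate system, so display information derived from them can be overlaid on the corresponding elements of the visual field image.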
In some embodiments, the information display portion 130 may include a combination of a projection device and a projection lens: the projection device is disposed on the frame part 110 and projects the display information output by the controller 400 onto the projection lens, and the projection lens, which may be integrated with the lens part 120, displays the display information. Note that through the projection lens the user can see both the displayed information and the real world behind the lens. In other embodiments, the information display portion 130 may be a transparent display screen (e.g., an OLED display) disposed on the lens part 120, so that the controller 400 can output information such as images to the transparent display screen; once displayed, the information appears to the user superimposed on the real world seen through the screen. The transparent display screen may establish a wired connection with the controller 400 through a data line, or a wireless connection with the controller 400 through a communication interface.
Referring to fig. 5, fig. 5 schematically illustrates a marker provided in an embodiment of the present application. The marker 200 includes a background 210 and a plurality of feature points 220 distributed on the background 210, and the plurality of feature points 220 are configured to be distinguishable from the background 210 so as to be identified by the sensor module 30. The shape of the marker 200 may be circular, rectangular, or triangular, and is not particularly limited in this embodiment. In some embodiments, the plurality of feature points 220 are distinguished from the background 210 by color information; for example, the background 210 of the marker 200 is white and the plurality of feature points 220 are black. In one embodiment, the marker 200 is a marker code. Specifically, the marker code may be DeepTag, RuneTag, AprilTag, TopoTag, a QR code, ARToolKit, ARToolKitPlus, ArUco, or the like, and is not particularly limited in the embodiments of the present application.
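As a rough, non-limiting sketch of how feature points could be distinguished from the background by color information (the black-on-white example above), the following Python snippet thresholds a small grayscale patch and reports the dark pixels. The function name, the patch, and the threshold value are illustrative assumptions; a production system would instead use a dedicated marker-code detector (e.g., an ArUco or AprilTag library).

```python
# Hypothetical sketch: locating dark feature points on a light marker
# background by intensity thresholding.

def find_dark_points(image, threshold=128):
    """Return (row, col) coordinates of pixels darker than threshold."""
    return [(r, c)
            for r, row in enumerate(image)
            for c, value in enumerate(row)
            if value < threshold]

# A 4x4 grayscale patch: white background (255) with one black dot (0).
patch = [
    [255, 255, 255, 255],
    [255,   0, 255, 255],
    [255, 255, 255, 255],
    [255, 255, 255, 255],
]
points = find_dark_points(patch)
```

On this patch the only dark pixel sits at row 1, column 1, so the function returns a single candidate feature-point location.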
In this embodiment, the communication module 300 may include one or more of a ZigBee module, a Bluetooth module, an NFC module, and a Wi-Fi module, and the controller 400 establishes a communication connection with the sensor module 30 through the communication module 300. In other embodiments, the communication module 300 includes a signal interface and a data line. The signal interface is electrically connected to the controller 400, and the sensor module 30 is connected to the augmented reality glasses 20 through the data line and the signal interface. In the embodiment of the present application, the specific implementation of the communication module 300 is not particularly limited.
In the present embodiment, the controller 400 may include a processor, a memory, and a substrate. The processor and the memory are each fixedly connected to the substrate, and the processor and the memory are in communication connection with each other. The processor may include one or more processing cores, and implements the obtaining of the calibration result and the conversion of the detection information by executing instructions, programs, code sets, or instruction sets stored in the memory and by invoking data stored in the memory. Optionally, the processor may integrate one or a combination of a Central Processing Unit (CPU), a Graphics Processing Unit (GPU), a modem, and the like. The memory may include a Random Access Memory (RAM) or a Read-Only Memory (ROM), and may be used to store instructions, programs, code, code sets, or instruction sets.
In some embodiments, the sensor module 30 includes a plurality of sensors, and the number of markers 200 is also plural. The plurality of markers 200 are disposed at intervals on the glasses body 100 and are in one-to-one correspondence with the plurality of sensors. In the embodiment of the present application, by providing the plurality of markers 200 on the glasses body 100, each sensor can be bound to the augmented reality glasses 20 through its corresponding marker 200, ensuring that, in subsequent use, the pieces of detection information acquired by the plurality of sensors can each be accurately displayed on the augmented reality glasses 20. Further, the multiple pieces of detection information can enrich the real-world scene observed by the user. For example, if the augmented reality glasses 20 worn by the user are provided with markers 200 corresponding to an infrared thermal imager and a distance measuring sensor, the detection information acquired by both sensors can be accurately displayed on the augmented reality glasses 20; that is, by wearing the augmented reality glasses 20 at night, the user can see both the infrared image of a target person in the real-world scene and the distance information between the user and that person.
In some embodiments, the plurality of sensors includes a first sensor and a second sensor, and the plurality of markers 200 includes a first marker corresponding to the first sensor and a second marker corresponding to the second sensor. The first sensor is configured to acquire a first target image including the first marker, and the second sensor is configured to acquire a second target image including the second marker. The controller 400 is further configured to calibrate, according to the first target image, a first sub-coordinate system where the first sensor is located against the second coordinate system where the glasses body 100 is located, to obtain a first calibration result; and to calibrate, according to the second target image, a second sub-coordinate system where the second sensor is located against the second coordinate system, to obtain a second calibration result. In the embodiment of the present application, once the controller 400 has determined the first calibration result based on the first target image and the second calibration result based on the second target image, it can, in the subsequent steps, accurately display the first detection information acquired by the first sensor and the second detection information acquired by the second sensor on the augmented reality glasses 20. The specific implementation by which the controller 400 determines the two calibration results is set forth in detail in the following embodiments.
Referring to fig. 6, fig. 6 schematically illustrates an information display method according to a first embodiment of the present application. The method is applied to the augmented reality glasses in fig. 1, where the augmented reality glasses are in communication connection with the sensor module, the augmented reality glasses and the sensor module are independent of each other, and the augmented reality glasses include a glasses body and a marker disposed on the glasses body. In this method, a first coordinate system where the sensor module is located and a second coordinate system where the glasses body is located are calibrated through a target image, acquired by the sensor module, that contains the marker, so as to obtain a calibration result; detection information in the first coordinate system is then converted into display information in the second coordinate system based on the calibration result. By calibrating the sensor module against the augmented reality glasses through identification of the marker, the method ensures that subsequent display information coincides with the corresponding elements of the real world seen by the user, improving the user experience. Specifically, the method may include the following steps S610 to S640.
Step S610: acquiring a target image, containing the marker, captured by the sensor module.
In one embodiment, upon detecting that it is in communication connection with the sensor module, the controller sends a target image request signal to the sensor module; upon receiving the target image request signal, the sensor module captures the marker on the augmented reality glasses and sends the resulting target image containing the marker to the controller. Illustratively, a communication connection flag is provided in the controller, and the controller determines whether it is in a communication state with the sensor module by reading the value of this flag: when the value equals a preset value, the controller determines that it is in the communication state with the sensor module; otherwise, it determines that the connection is in the off state.
As another embodiment, upon receiving a start signal, the sensor module sends a detection signal to the controller to determine whether the sensor module and the augmented reality glasses are in the communication connection state; if the sensor module receives a confirmation signal sent by the controller in response to the detection signal, the two are in the communication connection state. In this case, the sensor module captures the marker on the augmented reality glasses and sends the resulting target image containing the marker to the controller.
Step S620: acquiring a calibration result based on the target image.
The calibration result represents the mapping relationship between the first coordinate system where the sensor module is located and the second coordinate system where the glasses body is located. The first coordinate system and the second coordinate system may each be a Cartesian rectangular coordinate system, a planar polar coordinate system, a cylindrical coordinate system, a spherical coordinate system, or the like, and they may be of the same type or of different types, which is not limited in this embodiment. For example, referring to fig. 2 again, the first coordinate system of the sensor module 30 may be a Cartesian rectangular coordinate system formed by the u-axis, v-axis, and w-axis, and the second coordinate system of the glasses body 100 may be a Cartesian rectangular coordinate system formed by the x-axis, y-axis, and z-axis.
In the embodiment of the application, the controller determines relative orientation information between the sensor module and the marker based on the target image, and determines the calibration result between the first coordinate system and the second coordinate system based on that relative orientation information, where the relative orientation information includes relative position information and relative rotation information. Specifically, step S620 may include steps S622 to S626.
Step S622: analyzing the image features of the marker in the target image.
In one embodiment, a target recognition algorithm is provided in the controller, and the position of the marker in the target image is determined based on this algorithm. Illustratively, the target recognition algorithm may be a sliding-window-based target recognition algorithm, an R-CNN-based neural network algorithm, or the like.
In an embodiment of the present application, the image features of the marker are characterized by the pixel coordinates of a plurality of feature points in the marker, where the pixel coordinates are the coordinates of those feature points in the image coordinate system corresponding to the target image. In some embodiments, the marker is composed of a plurality of regular shapes, and the feature points may be the vertices or the center points of those shapes. Once the position of the marker is determined, the controller identifies the plurality of feature points in the marker and then determines their pixel coordinates in the target image. Illustratively, a first pattern matching algorithm and a pre-stored image corresponding to the marker are provided in the controller; the pre-stored image is pre-annotated with the feature points to be identified, and based on these annotated feature points, the first pattern matching algorithm can locate the feature points at the corresponding positions of the marker in the target image. The first pattern matching algorithm may be a template matching algorithm, a feature matching algorithm, a deep-learning-based matching algorithm, or the like.
In some embodiments, if the controller does not recognize the marker in the target image through the target recognition algorithm (which may be caused by an occlusion between the sensor module and the marker), the controller may display a reminder message on the information display portion to prompt the user to deal with the occlusion. Illustratively, the reminder may read: "Sensor XX has not captured a marker; please check whether it is occluded."
In some embodiments, before identifying the plurality of feature points in the marker once its position is determined, the controller further performs an image preprocessing step on the target image. In one embodiment, when the position of the marker is determined, the controller extracts a sub-image containing the marker based on the position information of the marker and processes this sub-image with a preset image processing algorithm, such as an image contrast enhancement algorithm or an image sharpening algorithm. This preprocessing step makes the color and contour differences between the feature points and the background more pronounced, so that the feature points can subsequently be extracted quickly and accurately.
Step S624: acquiring the relative orientation information between the sensor module and the augmented reality glasses according to the image features.
In the embodiment of the present application, the relative orientation information is determined based on the pixel coordinates of the plurality of feature points, their physical coordinates, and the physical parameters of the sensor module. The physical coordinates of the feature points are their coordinates in the second coordinate system where the glasses body is located; that is, they represent the actual physical positions of the feature points on the augmented reality glasses, and the controller can determine them by reading the physical position of the marker on the augmented reality glasses and the physical positions of the feature points on the marker. The physical parameters of the sensor module are the parameters in effect when the target image was acquired, such as the focal length and the field-of-view size of the sensor module; the controller can determine them by reading the parameter information corresponding to the sensor module.
Given the pixel coordinates and physical coordinates of the feature points and the physical parameters of the sensor module, the controller obtains the relative orientation information between the sensor module and the augmented reality glasses. As an embodiment, the relative orientation information may include relative position information and relative rotation information. The relative position information represents the translational state between the first coordinate system and the second coordinate system, that is, the translational degrees of freedom of the sensor module along each coordinate axis of the second coordinate system. The relative rotation information represents the rotational state between the sensor module and the second coordinate system, that is, the rotational degrees of freedom of the sensor module about each coordinate axis of the second coordinate system. Together, the relative orientation information is the six-degree-of-freedom pose of the sensor module in the second coordinate system: it characterizes the rotation and translation of the sensor module, from which the angle and distance between the sensor module's field of view and each coordinate axis of the second coordinate system can be obtained. As one embodiment, once the pixel coordinates, the physical coordinates, and the physical parameters of the sensor module are determined, the controller calculates the relative orientation information through a preset algorithm (e.g., an SVD-based algorithm).
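To illustrate the kind of SVD-based computation the preceding paragraph mentions, here is a minimal sketch of the Kabsch algorithm in Python (with NumPy), which recovers the rotation and translation aligning two corresponding 3-D point sets. This stands in for the general idea only: the patent's actual procedure starts from 2-D pixel coordinates plus camera parameters (a perspective pose problem), and the function and variable names here are assumptions.

```python
import numpy as np

def kabsch(p_first, p_second):
    """Least-squares rotation R and translation t such that
    R @ p_first + t approximates p_second (points are 3xN arrays)."""
    c1 = p_first.mean(axis=1, keepdims=True)
    c2 = p_second.mean(axis=1, keepdims=True)
    # Cross-covariance of the centered point sets.
    H = (p_first - c1) @ (p_second - c2).T
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection (det = -1) in the recovered rotation.
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = c2 - R @ c1
    return R, t
```

With at least three non-collinear point correspondences, the recovered (R, t) maps points expressed in the first coordinate system onto their counterparts in the second, which is the kind of relative position and rotation information described above.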
Specifically, in other embodiments, when the controller acquires the feature points in step S624, it also acquires their pixel coordinates. The controller is then further configured to determine, from these pixel coordinates, the relative positional relationship among the feature points in the target image, from which the spatial attitude of the marker (e.g., its attitude information relative to the sensor module) can be obtained; to calculate the spatial distance of the marker (or of each feature point) relative to the sensor module from the pixel sizes of the feature points in the target image; and finally, to obtain the relative orientation information of the marker relative to the sensor module from the spatial attitude and the spatial distance.
Step S626: determining the calibration result based on the relative orientation information.
In the embodiment of the present application, the calibration result is represented by a calibration matrix. As an embodiment, once the relative orientation information is determined, the controller stores it internally in matrix form. The calibration matrix comprises a position matrix, representing the relative position information, and a rotation matrix, representing the relative rotation information.
It should be noted that once the controller has determined the calibration matrix corresponding to the sensor module, in subsequent operation the detection information obtained by the sensor module in the first coordinate system can be converted, based on the calibration matrix, into the corresponding information in the second coordinate system, i.e., the display information. In other words, the controller implements the coordinate conversion between the first coordinate system and the second coordinate system through the calibration matrix.
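One common way to hold a position matrix and a rotation matrix in a single object, offered here only as a hypothetical sketch (the patent does not specify a storage layout), is a 4x4 homogeneous transform:

```python
import numpy as np

def calibration_matrix(rotation, position):
    """Pack a 3x3 rotation matrix and a length-3 position vector into
    one 4x4 homogeneous calibration matrix; applying it to a point in
    homogeneous form rotates and then translates the point."""
    m = np.eye(4)
    m[:3, :3] = rotation
    m[:3, 3] = position
    return m

# Example: identity rotation with a pure translation of (1, 2, 3).
m = calibration_matrix(np.eye(3), np.array([1.0, 2.0, 3.0]))
```

A design note: packing both pieces of orientation information into one matrix lets the coordinate conversion be a single matrix product, and calibration matrices for chained coordinate systems compose by multiplication.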
Step S630: acquiring the detection information collected by the sensor module.
The detection information is information in the first coordinate system. In one embodiment, upon determining the calibration result of the sensor module, the controller sends a detection information acquisition signal to the sensor module; upon receiving this signal, the sensor module collects the detection information in real time and sends it to the controller. As another embodiment, after sending the target image containing the marker to the controller, the sensor module automatically starts its detection information acquisition function and collects and sends the detection information to the controller. For example, when the user wears the augmented reality glasses at night, the real-world scene seen through the glasses may be pitch black. In that case, if the augmented reality glasses are in communication connection with an infrared thermal imager, the infrared image acquired by the imager can be obtained; if they are connected with a night vision device, a night vision image can be obtained; and if they are connected with a distance measuring sensor module, the distance between that module and a target object in the real world can be obtained.
Step S640: determining the display information corresponding to the detection information based on the calibration result.
The display information is information in the second coordinate system to be displayed by the augmented reality glasses. In the embodiment of the present application, the detection information includes first position data in the first coordinate system; multiplying the first position data by the calibration matrix corresponding to the calibration result yields second position data in the second coordinate system, that is, the display information corresponding to the detection information. For example, taking the detection information to be an infrared image acquired by an infrared thermal imager: the detection information includes the first position data of the infrared image in the first coordinate system, and the matrix product of the first position data with the calibration matrix corresponding to the infrared thermal imager yields the second position data in the second coordinate system, i.e., the infrared image expressed in the second coordinate system. In the subsequent step, the augmented reality glasses may display this infrared image on their information display portion, and the user may simultaneously see, through the information display portion, the real-world scene (for example, a pitch-black night scene) and the infrared image corresponding to it.
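The matrix-product step above can be sketched as follows, assuming (hypothetically) that the calibration result is stored as a 4x4 homogeneous matrix and that the first position data are Nx3 point coordinates; the concrete numbers are made up for illustration.

```python
import numpy as np

# Hypothetical calibration matrix: identity rotation plus a small
# translation between the sensor and glasses coordinate systems.
calib = np.array([
    [1.0, 0.0, 0.0, 0.5],
    [0.0, 1.0, 0.0, -0.2],
    [0.0, 0.0, 1.0, 0.0],
    [0.0, 0.0, 0.0, 1.0],
])

def to_second_coords(points_first, calib):
    """Map Nx3 points from the first (sensor) coordinate system into
    the second (glasses) coordinate system via a homogeneous product."""
    homog = np.hstack([points_first, np.ones((len(points_first), 1))])
    return (calib @ homog.T).T[:, :3]

points = np.array([[0.0, 0.0, 0.0], [1.0, 1.0, 1.0]])
display_points = to_second_coords(points, calib)
```

Each transformed point is the position at which the corresponding piece of detection information would be drawn so that it overlays the same object in the user's view.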
In some embodiments, referring to fig. 2 again, a user wearing the augmented reality glasses sees, through the glasses body, a real-world scene of a child playing, and the sensor module corresponding to the augmented reality glasses (i.e., an image capturing component) can capture detection information corresponding to that scene. The controller processes the detection information to obtain additional annotation information corresponding to it (i.e., the type and gender of the person). If, once the detection information and the additional annotation information are acquired, the controller were to display them directly on the glasses body without correcting their display positions, the positions at which they appear for the child might deviate from the position of the same child as seen through the glasses body. The controller therefore transforms the first position data in the first coordinate system corresponding to the detection information and the additional annotation information into the second coordinate system through the calibration matrix, obtaining the second position data, that is, the display data. When the controller displays the display data on the glasses body, the detection information and the additional annotation information for the child are accurately superimposed on the same child in the view image.
The embodiment of the application provides an information display method. The method is applied to the augmented reality glasses in fig. 1, which comprise a glasses body and a marker disposed on the glasses body. In the method, the sensor module and the augmented reality glasses are calibrated by identifying the marker, ensuring that subsequent display information coincides with the corresponding elements of the real world seen by the user and improving the user experience.
Referring to fig. 7, fig. 7 schematically illustrates an information display method according to a second embodiment of the present application. The method is applied to the augmented reality glasses in fig. 1, which comprise a glasses body and markers disposed on the glasses body. There are a plurality of markers, including a first marker corresponding to a first sensor and a second marker corresponding to a second sensor. Specifically, the method may include the following steps S710 to S780.
Step S710: a first target image including a first marker acquired by a first sensor is acquired.
Step S720: a second target image including a second marker acquired by a second sensor is acquired.
The specific implementation of step S710 to step S720 may refer to the specific description in step S610, and is not described herein again.
Step S730: acquiring a first calibration result based on the first target image.
The first calibration result represents a first mapping relation between a first sub-coordinate system where the first sensor is located and a second coordinate system where the glasses body is located.
Step S740: acquiring a second calibration result based on the second target image.
The second calibration result represents a second mapping relation between a second sub-coordinate system where the second sensor is located and a second coordinate system where the glasses body is located.
The specific implementation of step S730 to step S740 may refer to the specific description in step S620, and is not described herein again.
It should be noted here that, since the augmented reality glasses are provided with a plurality of markers, one target image acquired by the controller may capture several markers at once; for example, the first target image may contain both the first marker and the second marker. Thus, the controller may detect multiple markers when determining the position of the first marker in the first target image or the position of the second marker in the second target image through the target recognition algorithm.
Therefore, when the controller determines that a plurality of markers exist in a target image, it first determines, from the parameter information of the target image, which sensor acquired it; it then determines the marker corresponding to that sensor by looking up a first mapping table, and thereby identifies, among the several markers, the one corresponding to the target image. The first mapping table represents a one-to-one correspondence between different types of sensors and different markers; for example, the first sensor corresponds to the first marker, and the second sensor corresponds to the second marker. For instance, upon determining that a plurality of markers exist in the first target image, the controller determines from the parameter information of the first target image that it was acquired by the first sensor, looks up the first marker corresponding to the first sensor in the first mapping table, and identifies the first marker among the detected markers as the marker corresponding to the first target image. Specifically, a second pattern matching algorithm may be provided in the controller, based on which the controller matches the first marker from among the several markers as the marker corresponding to the first target image; this algorithm may be a template matching algorithm, a feature matching algorithm, a deep-learning-based matching algorithm, or the like.
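A minimal sketch of the first-mapping-table lookup described above — a dictionary from sensor identifiers to marker identifiers — might look as follows in Python; all names are illustrative assumptions, not the patent's data structures.

```python
# Hypothetical "first mapping table": a one-to-one map from sensor
# identifiers to the marker bound to each sensor.
FIRST_MAPPING_TABLE = {
    "first_sensor": "first_marker",
    "second_sensor": "second_marker",
}

def marker_for_image(sensor_id, markers_in_image):
    """Pick, from the markers detected in one target image, the marker
    bound to the sensor that captured that image; None if absent."""
    expected = FIRST_MAPPING_TABLE.get(sensor_id)
    return expected if expected in markers_in_image else None
```

Even when a first target image contains both markers, the lookup keyed by the capturing sensor selects only the marker bound to that sensor.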
Step S750: first detection information acquired by a first sensor is acquired.
The first detection information is information in a first sub-coordinate system.
Step S760: second detection information acquired by the second sensor is acquired.
The second detection information is information in a second sub-coordinate system.
The specific implementation of step S750 to step S760 may refer to the specific description in step S630, and is not described herein again.
Step S770: determining first display information corresponding to the first detection information based on the first calibration result.
The first display information is information in a second coordinate system for the augmented reality glasses to display.
Step S780: determining second display information corresponding to the second detection information based on the second calibration result.
The second display information is information in a second coordinate system for the augmented reality glasses to display.
The specific implementation of steps S770 to S780 may refer to the specific description in step S640, and is not described herein again.
In some embodiments, after determining the first display information corresponding to the first detection information and the second display information corresponding to the second detection information, the controller further displays both simultaneously on the information display portion of the augmented reality glasses. For example, taking the first sensor to be an infrared thermal imager and the second sensor to be a distance measuring sensor, the first display information is the infrared image in the second coordinate system, and the second display information is the distance from the glasses body to a target object in the real-world scene. In the embodiment of the present application, the controller may display both pieces of display information on the information display portion at once, so that the user can view the infrared image corresponding to the real-world scene (for example, a dark night scene); if the infrared image includes an image of a target person, the user can also view the distance between the user and that person through the information display portion.
The embodiment of the application provides an information display method. The method is applied to the augmented reality glasses in fig. 1, which comprise a glasses body and markers disposed on the glasses body; there are a plurality of markers, including a first marker corresponding to the first sensor and a second marker corresponding to the second sensor. In this method, identifying the plurality of markers realizes the coordinate system calibration between the plurality of sensors and the augmented reality glasses, ensuring that the multiple pieces of display information derived from the sensors coincide with the corresponding elements of the real world seen by the user, thereby further enriching the real-world scene the user observes.
Referring to fig. 8, fig. 8 schematically illustrates an information display method according to a third embodiment of the present application. The method is applied to the augmented reality device in fig. 1, where the augmented reality device comprises augmented reality glasses and a sensor module in communication connection with them; the sensor module and the augmented reality glasses are independent of each other, and the augmented reality glasses include a glasses body and a marker disposed on the glasses body. Specifically, the method may include the following steps S810 to S840.
Step S810: the sensor module acquires an image of a target including a marker.
Step S820: the augmented reality glasses acquire a calibration result based on the target image.
The calibration result represents the mapping relation between the first coordinate system where the sensor module is located and the second coordinate system where the glasses body is located.
Step S830: the sensor module acquires detection information.
The detection information is information in a first coordinate system.
Step S840: the augmented reality glasses determine display information corresponding to the detection information based on the calibration result.
The display information is information in the second coordinate system for the augmented reality glasses to display.
The specific implementation of steps S810 to S840 may refer to the specific description of steps S610 to S640, and is not repeated here.
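Taken together, steps S810 to S840 reduce to: capture a marker image, derive a transform between the first coordinate system (sensor module) and the second coordinate system (glasses body), then map each piece of detection information through that transform. A minimal pure-Python sketch follows; the 4x4 homogeneous-matrix representation of the calibration result and the concrete matrix values are assumptions for illustration, not the patent's specified implementation.

```python
# Sketch of the S810-S840 pipeline. The calibration result (S820) is modeled
# as a 4x4 homogeneous transform T mapping points from the first coordinate
# system (sensor module) into the second coordinate system (glasses body).

def apply_calibration(T, point):
    """Map a 3D point from the sensor frame into the glasses frame."""
    x, y, z = point
    p = (x, y, z, 1.0)  # homogeneous coordinates
    return tuple(sum(T[r][c] * p[c] for c in range(4)) for r in range(3))

# Hypothetical calibration result: the sensor frame is offset from the
# glasses frame by (0.1, 0.0, -0.05) meters, with no rotation.
T = [
    [1.0, 0.0, 0.0, 0.10],
    [0.0, 1.0, 0.0, 0.00],
    [0.0, 0.0, 1.0, -0.05],
    [0.0, 0.0, 0.0, 1.00],
]

# S830/S840: detection information (a point in the first coordinate system)
# becomes display information (the same point in the second coordinate system).
detection = (1.0, 2.0, 3.0)
display = apply_calibration(T, detection)
print(display)  # approximately (1.1, 2.0, 2.95)
```

Because the calibration result is fixed once S820 completes, every subsequent detection can be mapped this way before display, which is what keeps the rendered information aligned with the real-world elements the user sees.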
The embodiment of the application provides an information display method, which is applied to the augmented reality device in fig. 1. In the method, the augmented reality glasses in the augmented reality device acquire, via the sensor module, a target image containing the marker, so as to calibrate the coordinate systems of the sensor module and the augmented reality glasses; this ensures that subsequent display information coincides with the corresponding elements in the real world seen by the user, improving the user experience.
Referring to fig. 9, fig. 9 schematically illustrates an information display device 900 according to an embodiment of the present disclosure. The device 900 is applied to augmented reality glasses, the augmented reality glasses are in communication connection with a sensor module, the augmented reality glasses and the sensor module are independent of each other, and the augmented reality glasses include a glasses body and a marker arranged on the glasses body. The device 900 comprises: a target image obtaining module 910, a calibration result obtaining module 920, a detection information obtaining module 930, and a display information determining module 940. The target image obtaining module 910 is configured to acquire a target image containing the marker acquired by the sensor module. The calibration result obtaining module 920 is configured to obtain a calibration result based on the target image, where the calibration result represents a mapping relationship between a first coordinate system where the sensor module is located and a second coordinate system where the glasses body is located. The detection information obtaining module 930 is configured to acquire detection information acquired by the sensor module, where the detection information is information in the first coordinate system. The display information determining module 940 is configured to determine display information corresponding to the detection information based on the calibration result, where the display information is information in the second coordinate system for the augmented reality glasses to display.
In some embodiments, the calibration result obtaining module 920 is further configured to: analyze image features of the marker in the target image; acquire relative orientation information between the sensor module and the augmented reality glasses according to the image features; and determine a calibration result based on the relative orientation information.
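A toy illustration of those three sub-steps (feature analysis, relative orientation, calibration result): in 2D, observing where two known marker feature points appear is enough to recover the rotation and translation between the sensor and glasses frames. A production system would instead solve the full 3D pose, e.g. with a PnP algorithm; the `relative_orientation` helper and its two-point input below are illustrative assumptions only.

```python
import math

# Simplified 2D sketch of deriving a calibration result from marker image
# features: recover the rotation angle and translation that align the known
# marker model points with where they are observed.

def relative_orientation(model_pts, observed_pts):
    """Estimate (angle, tx, ty) aligning two marker model points to their
    observed positions."""
    (ax, ay), (bx, by) = model_pts
    (px, py), (qx, qy) = observed_pts
    # Rotation: difference between the directions of the two segments.
    angle = math.atan2(qy - py, qx - px) - math.atan2(by - ay, bx - ax)
    # Translation: where the first model point lands after rotation.
    c, s = math.cos(angle), math.sin(angle)
    tx = px - (c * ax - s * ay)
    ty = py - (s * ax + c * ay)
    return angle, tx, ty

# Marker corners at (0,0) and (1,0) observed rotated 90 degrees and shifted.
angle, tx, ty = relative_orientation([(0, 0), (1, 0)], [(2, 3), (2, 4)])
print(round(math.degrees(angle)), round(tx, 6), round(ty, 6))  # -> 90 2.0 3.0
```

The returned (angle, tx, ty) plays the role of the calibration result: it fixes the mapping from the sensor's coordinate system into the glasses' coordinate system.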
In some embodiments, the number of the markers is plural, the plurality of markers including a first marker corresponding to the first sensor and a second marker corresponding to the second sensor. The target image obtaining module 910 is further configured to acquire a first target image containing the first marker acquired by the first sensor, the first sensor being in communication connection with the augmented reality glasses. The target image obtaining module 910 is further configured to acquire a second target image containing the second marker acquired by the second sensor, the second sensor being in communication connection with the augmented reality glasses. The calibration result obtaining module 920 is further configured to obtain a first calibration result based on the first target image, where the first calibration result represents a first mapping relationship between a first sub-coordinate system where the first sensor is located and the second coordinate system where the glasses body is located. The calibration result obtaining module 920 is further configured to obtain a second calibration result based on the second target image, where the second calibration result represents a second mapping relationship between a second sub-coordinate system where the second sensor is located and the second coordinate system where the glasses body is located. The detection information obtaining module 930 is further configured to acquire first detection information acquired by the first sensor, where the first detection information is information in the first sub-coordinate system. The detection information obtaining module 930 is further configured to acquire second detection information acquired by the second sensor, where the second detection information is information in the second sub-coordinate system.
The display information determining module 940 is further configured to determine first display information corresponding to the first detection information based on the first calibration result, where the first display information is information in the second coordinate system for the augmented reality glasses to display. The display information determining module 940 is further configured to determine second display information corresponding to the second detection information based on the second calibration result, where the second display information is information in the second coordinate system for the augmented reality glasses to display.
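One way to read the two-sensor flow above: each sensor carries its own calibration result, and the display information determining module simply selects the transform matching the sensor that produced the detection. The sketch below is hypothetical throughout (the sensor names, the 2D-offset model of a calibration result, and the `determine_display_info` helper are all invented for illustration):

```python
# Sketch: per-sensor calibration results stored in a lookup table. Each
# calibration result is modeled here as a simple 2D offset between that
# sensor's sub-coordinate system and the glasses' second coordinate system.

def determine_display_info(calibrations, sensor_id, detection_point):
    """Map a detection from a sensor's sub-coordinate system into the
    glasses' second coordinate system using that sensor's calibration."""
    dx, dy = calibrations[sensor_id]
    x, y = detection_point
    return (x + dx, y + dy)

calibrations = {
    "thermal": (0.02, -0.01),      # first calibration result (sub-system 1)
    "rangefinder": (-0.03, 0.04),  # second calibration result (sub-system 2)
}

# The same detection point maps differently depending on which sensor saw it.
print(determine_display_info(calibrations, "thermal", (1.0, 1.0)))
print(determine_display_info(calibrations, "rangefinder", (1.0, 1.0)))
```

Keeping one calibration result per sensor is what lets the first and second display information both land correctly on the shared display, even though the sensors sit at different positions on (or apart from) the glasses body.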
It can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working processes of the above-described apparatuses and modules may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the several embodiments provided in the present application, the coupling between the modules may be electrical, mechanical or other type of coupling.
In addition, the functional modules in the embodiments of the present application may be integrated into one processing module, or each module may exist alone physically, or two or more modules may be integrated into one module. The integrated module may be implemented in the form of hardware or in the form of a software functional module.
The embodiment of the application provides an information display device, which is applied to the augmented reality glasses in fig. 1. In this device, the augmented reality glasses identify the marker to calibrate the coordinate systems of the sensor module and the augmented reality glasses, ensuring that subsequent display information coincides with the corresponding elements in the real world seen by the user and improving the user experience.
Referring to fig. 10, a computer-readable storage medium 1000 is further provided according to an embodiment of the present application, where the computer-readable storage medium 1000 stores computer program instructions 1010, and the computer program instructions 1010 can be called by a processor to execute the method described in the foregoing embodiment.
The computer-readable storage medium may be an electronic memory such as a flash memory, an EEPROM (electrically erasable programmable read-only memory), an EPROM, a hard disk, or a ROM. Alternatively, the computer-readable storage medium includes a non-volatile computer-readable storage medium. The computer-readable storage medium 1000 has storage space for the computer program instructions 1010 that perform any of the method steps described above. The computer program instructions 1010 may be read from or written into one or more computer program products.
Although the present application has been described with reference to the preferred embodiments, it is to be understood that the present application is not limited to the disclosed embodiments, but rather, the present application is intended to cover various modifications, equivalents and alternatives falling within the spirit and scope of the present application.

Claims (10)

1. An augmented reality device is characterized by comprising augmented reality glasses and a sensor module which is in communication connection with the augmented reality glasses, wherein the sensor module and the augmented reality glasses are independent from each other;
the augmented reality glasses include:
a spectacle body;
a marker disposed on the glasses body and exposed through an outer surface of the glasses body;
the communication module is arranged on the glasses body and is used for being in communication connection with the sensor module; and
the controller is arranged on the glasses body and is electrically connected with the communication module;
wherein the sensor module is configured to: acquire a target image containing the marker; and the controller is configured to: calibrate a first coordinate system where the sensor module is located and a second coordinate system where the glasses body is located according to the target image to obtain a calibration result.
2. The augmented reality device of claim 1, wherein the glasses body is provided with an information display portion, and the sensor module is further configured to: acquire detection information, wherein the detection information is information in the first coordinate system;
the controller is further configured to: determine display information corresponding to the detection information according to the calibration result, wherein the display information is information in the second coordinate system;
and display the display information on the information display portion.
3. The augmented reality device of claim 1, wherein the sensor module comprises a plurality of sensors, the number of the markers is plural, the markers are arranged on the glasses body at intervals, and the markers correspond to the sensors one to one.
4. The augmented reality device of claim 3, wherein the plurality of sensors includes a first sensor and a second sensor, and the plurality of markers includes a first marker corresponding to the first sensor and a second marker corresponding to the second sensor; the first sensor is configured to: acquire a first target image comprising the first marker; the second sensor is configured to: acquire a second target image comprising the second marker; and the controller is further configured to:
calibrate a first sub-coordinate system where the first sensor is located and the second coordinate system where the glasses body is located according to the first target image to obtain a first calibration result;
and calibrate a second sub-coordinate system where the second sensor is located and the second coordinate system where the glasses body is located according to the second target image to obtain a second calibration result.
5. The augmented reality device according to any one of claims 1-4, wherein the sensor module comprises at least one of the following sensors:
an image sensor, an infrared thermal imager, a distance measuring sensor, a night vision device, and a spectrometer.
6. An information display method, applied to augmented reality glasses, wherein the augmented reality glasses are in communication connection with a sensor module, and the augmented reality glasses and the sensor module are independent of each other; the augmented reality glasses comprise a glasses body and a marker arranged on the glasses body, and the method comprises the following steps:
acquiring a target image containing the marker acquired by the sensor module;
acquiring a calibration result based on the target image, wherein the calibration result represents a mapping relation between a first coordinate system where the sensor module is located and a second coordinate system where the glasses body is located;
acquiring detection information acquired by the sensor module, wherein the detection information is information in the first coordinate system;
and determining display information corresponding to the detection information based on the calibration result, wherein the display information is information in the second coordinate system for the augmented reality glasses to display.
7. The method of claim 6, wherein the obtaining a calibration result based on the target image comprises:
analyzing the image characteristics of the marker in the target image;
acquiring relative orientation information between the sensor module and the augmented reality glasses according to the image characteristics;
and determining the calibration result based on the relative orientation information.
8. The method of any one of claims 6 to 7, wherein the number of the markers is plural, the plurality of markers including a first marker corresponding to a first sensor and a second marker corresponding to a second sensor; the acquiring a target image containing the marker acquired by the sensor module comprises:
acquiring a first target image acquired by the first sensor comprising the first marker;
acquiring a second target image including the second marker acquired by the second sensor;
the obtaining of the calibration result based on the target image includes:
acquiring a first calibration result based on the first target image, wherein the first calibration result represents a first mapping relation between a first sub-coordinate system where the first sensor is located and a second coordinate system where the glasses body is located;
acquiring a second calibration result based on the second target image, wherein the second calibration result represents a second mapping relation between a second sub-coordinate system where the second sensor is located and a second coordinate system where the glasses body is located;
the acquiring of the detection information acquired by the sensor module includes:
acquiring first detection information acquired by the first sensor, wherein the first detection information is information in the first sub-coordinate system;
acquiring second detection information acquired by the second sensor, wherein the second detection information is information in the second sub-coordinate system;
the determining, based on the calibration result, display information corresponding to the detection information includes:
determining first display information corresponding to the first detection information based on the first calibration result, wherein the first display information is information in the second coordinate system for the augmented reality glasses to display;
and determining second display information corresponding to the second detection information based on the second calibration result, wherein the second display information is information in the second coordinate system for the augmented reality glasses to display.
9. An information display method, applied to an augmented reality device, wherein the augmented reality device comprises augmented reality glasses and a sensor module in communication connection with the augmented reality glasses, and the sensor module and the augmented reality glasses are independent of each other; the augmented reality glasses comprise a glasses body and a marker arranged on the glasses body, and the method comprises the following steps:
the sensor module acquires a target image containing the marker;
the augmented reality glasses obtain a calibration result based on the target image, wherein the calibration result represents a mapping relation between a first coordinate system where the sensor module is located and a second coordinate system where the glasses body is located;
the sensor module acquires detection information, wherein the detection information is information in the first coordinate system;
and the augmented reality glasses determine display information corresponding to the detection information based on the calibration result, wherein the display information is information in the second coordinate system for the augmented reality glasses to display.
10. An information display device, applied to augmented reality glasses, wherein the augmented reality glasses are in communication connection with a sensor module, and the augmented reality glasses and the sensor module are independent of each other; the augmented reality glasses comprise a glasses body and a marker arranged on the glasses body, and the device comprises:
the target image acquisition module is used for acquiring a target image which is acquired by the sensor module and contains the marker;
the calibration result acquisition module is used for acquiring a calibration result based on the target image, and the calibration result represents a mapping relation between a first coordinate system where the sensor module is located and a second coordinate system where the glasses body is located;
the detection information acquisition module is used for acquiring detection information acquired by the sensor module, and the detection information is information in the first coordinate system;
and the display information determining module is used for determining display information corresponding to the detection information based on the calibration result, wherein the display information is information in the second coordinate system for the augmented reality glasses to display.
CN202210072682.1A 2022-01-21 2022-01-21 Augmented reality device, information display method and device Pending CN114545629A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210072682.1A CN114545629A (en) 2022-01-21 2022-01-21 Augmented reality device, information display method and device

Publications (1)

Publication Number Publication Date
CN114545629A true CN114545629A (en) 2022-05-27

Family

ID=81672221

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210072682.1A Pending CN114545629A (en) 2022-01-21 2022-01-21 Augmented reality device, information display method and device

Country Status (1)

Country Link
CN (1) CN114545629A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115396653A (en) * 2022-08-24 2022-11-25 歌尔科技有限公司 Calibration method, system, device and medium for AR glasses

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106210253A (en) * 2015-05-26 2016-12-07 Lg电子株式会社 Glasses type terminal and control method thereof
CN108139799A (en) * 2016-04-22 2018-06-08 深圳市大疆创新科技有限公司 The system and method for region of interest (ROI) processing image data based on user
CN108664037A (en) * 2017-03-28 2018-10-16 精工爱普生株式会社 The method of operating of head-mount type display unit and unmanned plane
CN109814719A (en) * 2018-07-26 2019-05-28 亮风台(上海)信息科技有限公司 A kind of method and apparatus of the display information based on wearing glasses
CN112019807A (en) * 2020-08-05 2020-12-01 何学谦 Augmented reality system based on unmanned aerial vehicle
CN112631431A (en) * 2021-01-04 2021-04-09 杭州光粒科技有限公司 AR (augmented reality) glasses pose determination method, device and equipment and storage medium


Similar Documents

Publication Publication Date Title
US9401050B2 (en) Recalibration of a flexible mixed reality device
US10643390B2 (en) Head mounted display, method for controlling head mounted display, and computer program
WO2017170148A1 (en) Flight device, electronic device and program
US10482668B2 (en) Miniature vision-inertial navigation system with extended dynamic range
CN107015638B (en) Method and apparatus for alerting a head mounted display user
US20160041388A1 (en) Head mounted display, information system, control method for head mounted display, and computer program
US20140160170A1 (en) Provision of an Image Element on a Display Worn by a User
EP3422153A1 (en) System and method for selective scanning on a binocular augmented reality device
US10169880B2 (en) Information processing apparatus, information processing method, and program
JP7012163B2 (en) Head-mounted display device and its method
US20190156511A1 (en) Region of interest image generating device
CN112525185B (en) AR navigation method based on positioning and AR head-mounted display device
US11335090B2 (en) Electronic device and method for providing function by using corneal image in electronic device
KR101682705B1 (en) Method for Providing Augmented Reality by using RF Reader
JP2019215688A (en) Visual line measuring device, visual line measurement method and visual line measurement program for performing automatic calibration
CN114545629A (en) Augmented reality device, information display method and device
US10559132B2 (en) Display apparatus, display system, and control method for display apparatus
CN110120100B (en) Image processing method, device and identification tracking system
WO2022240933A1 (en) Depth-from- stereo bending correction
CN112213753B (en) Method for planning parachuting training path by combining Beidou navigation and positioning function and augmented reality technology
CN110120062B (en) Image processing method and device
US11854227B2 (en) Depth-from-stereo bending correction using visual inertial odometry features
CN117873159B (en) Indoor target visual positioning method of multi-rotor unmanned aerial vehicle
CN215642483U (en) Wearable device
US20230298345A1 (en) Wearable device, information processing system, and information processing method

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination