CN215642483U - Wearable device

Wearable device

Info

Publication number
CN215642483U
CN215642483U (Application CN202121981198.8U)
Authority
CN
China
Prior art keywords: wearable device, image data, data, image, control device
Prior art date
Legal status
Active
Application number
CN202121981198.8U
Other languages
Chinese (zh)
Inventor
王博文
Current Assignee
Shenzhen Leapsy Technology Co ltd
Original Assignee
Shenzhen Leapsy Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Leapsy Technology Co., Ltd.
Priority to CN202121981198.8U
Application granted
Publication of CN215642483U
Legal status: Active

Landscapes

  • Traffic Control Systems (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The wearable device provided by this specification comprises a laser radar, a vision sensor, a control device, and a display device. In operation, the laser radar measures position data of an object in a target field of view relative to the laser radar, the position data comprising three-dimensional coordinate information; the vision sensor collects image data in the target field of view; the control device is communicatively connected to the laser radar and the vision sensor, acquires the position data and the image data in real time, and superimposes the position data and the image data in real time to generate a stereoscopic image; the display device is communicatively connected to the control device and receives and displays the stereoscopic image. The wearable device can generate a stereoscopic image of a scene in real time, allowing it to be applied to a wider range of scenes.

Description

Wearable device
Technical Field
This specification relates to the technical field of augmented reality, and in particular to a wearable device.
Background
With the rapid development of computer technology, the concept of Augmented Reality (AR) has become widely recognized by consumers and is now applied in many fields, especially wearable devices such as smart helmets, smart glasses, and smart watches. However, the camera in a prior-art wearable device can only capture a planar image within its field of view; it cannot acquire depth information and therefore cannot help the user judge the position of an object.
Therefore, it is desirable to provide a wearable device capable of acquiring object position information to generate and display a stereoscopic image, thereby being applied to a wider range of scenes.
SUMMARY OF THE UTILITY MODEL
The present specification provides a wearable device capable of acquiring position information of an object to generate and display a stereoscopic image, thereby being applied to a wider range of scenes.
This specification provides a wearable device comprising a housing, a laser radar, a vision sensor, a control device, and a display device. The laser radar is mounted on the housing and, in operation, measures position data of an object in a target field of view relative to the laser radar, the position data comprising three-dimensional coordinate information; the vision sensor is mounted on the housing and, in operation, acquires image data within the target field of view; the control device is mounted on the housing, is communicatively connected to the laser radar and the vision sensor, acquires the position data and the image data in real time, and superimposes the position data and the image data in real time to generate a stereoscopic image; and the display device is mounted on the housing, is communicatively connected to the control device, and receives and displays the stereoscopic image.
In some embodiments, the lidar is operable to transmit electromagnetic wave signals at a plurality of different angles to the target field of view and to receive a plurality of reflected electromagnetic wave signals reflected back by objects within the target field of view, thereby obtaining three-dimensional coordinate information of objects within the target field of view relative to the lidar.
In some embodiments, the wearable device is a head-mounted device for wearing on the head of a target subject, and the display device is positioned in front of the eyes of the target subject.
In some embodiments, the stereoscopic image includes image data having depth information.
In some embodiments, the stereoscopic image includes three-dimensional model data of the target field of view.
In some embodiments, the wearable device further comprises a thermal imaging device mounted on the housing, communicatively coupled to the control device, and operable to acquire thermal image data within the field of view of the target.
In some embodiments, the control device receives the thermal image data in real time, and superimposes the thermal image data, the position data, and the image data in real time to obtain the stereoscopic image, where the stereoscopic image includes image data obtained by superimposing the thermal image data, the position data, and the image data.
In some embodiments, the laser radar, the vision sensor, and the thermal imaging device are mounted in a predetermined positional relationship and at a distance from each other not exceeding a predetermined value.
In some embodiments, the housing comprises a first housing and a second housing; the wearable device is worn on a target object through the first housing when in operation, and the first housing comprises an accommodating cavity in which the display device and the control device are mounted; the laser radar, the vision sensor, and the thermal imaging device are mounted on the second housing, and the second housing is fixedly connected to the first housing to seal the display device and the control device in the accommodating cavity.
In some embodiments, the control device includes a communication module and, in operation, is communicatively connected to an external electronic device to obtain virtual model data from the external electronic device or to send the stereoscopic image to the external electronic device; the control device is capable of receiving the virtual model data, superimposing the virtual model data on the stereoscopic image to generate a display image, and displaying the display image through the display device.
According to the above technical solution, the wearable device provided by this specification is equipped with a laser radar that can transmit groups of electromagnetic wave signals at different angles into the target field of view and receive the signals reflected back by objects in the target field of view, so that the position data of those objects relative to the laser radar, such as three-dimensional coordinate information, can be calculated in real time. The wearable device can also acquire image data of the target field of view in real time through the vision sensor. The control device can superimpose the image data and the position data in real time, thereby acquiring a stereoscopic image of the target field of view in real time and displaying it through the display device to help the target object (the wearer) judge the position of objects. In some embodiments, the wearable device may further include a thermal imaging device to generate thermal image data. The control device can superimpose the stereoscopic image and the thermal image data and control the display device to display the result, helping the target object acquire the temperature distribution within the target field of view, so that the wearable device can be applied to more scenes.
Other functions of the wearable device provided by this specification will be set forth in part in the description that follows. The inventive aspects of the wearable device can be fully understood from the practice or use of the methods, apparatus, and combinations described in the following detailed examples.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of this specification, the drawings needed in the description of the embodiments are briefly introduced below. The drawings in the following description are only some embodiments of this specification; other drawings can be obtained from them by those of ordinary skill in the art without creative effort.
Fig. 1 is a schematic view of the overall structure of a wearable device provided according to an embodiment of the present specification;
Fig. 2 is an exploded view of a wearable device provided according to an embodiment of the present specification;
Fig. 3 is a hardware structure diagram of a wearable device provided according to an embodiment of the present specification;
Fig. 4 is a flow diagram of a method of data overlay processing provided according to an embodiment of the present specification;
Fig. 5 is a schematic view of a screen of a display device when a wearable device provided according to an embodiment of the present specification is applied to disaster relief;
Fig. 6 is a schematic view of a screen of a display device when a wearable device provided according to an embodiment of the present specification is applied to disaster relief;
Fig. 7 is a schematic view of a screen of a display device when a wearable device provided according to an embodiment of the present specification is applied to disaster relief;
Fig. 8 is a schematic view of a scene in which a wearable device provided according to an embodiment of the present specification is applied to real-time scanning of a three-dimensional object;
Fig. 9 is a schematic view of a screen of a display device when a wearable device provided according to an embodiment of the present specification is applied to real-time scanning of a three-dimensional object;
Fig. 10 is a schematic diagram of a remote transmission scenario of a wearable device provided according to an embodiment of the present specification;
Fig. 11 is a schematic view of a screen of a display device when a wearable device provided according to an embodiment of the present specification is applied to gesture recognition;
Fig. 12 is a schematic view of a screen of a display device when another wearable device provided according to an embodiment of the present specification is applied to gesture recognition; and
Fig. 13 is a schematic view of a screen of a display device when a wearable device provided according to an embodiment of the present specification is applied to skiing.
Detailed Description
The following description is presented to enable any person skilled in the art to make and use the present description, and is provided in the context of a particular application and its requirements. Various modifications to the disclosed embodiments will be readily apparent to those skilled in the art, and the general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the present description. Thus, the present description is not limited to the embodiments shown, but is to be accorded the widest scope consistent with the claims.
The terminology used herein is for the purpose of describing particular example embodiments only and is not intended to be limiting. For example, as used herein, the singular forms "a", "an" and "the" may include the plural forms as well, unless the context clearly indicates otherwise. The terms "comprises," "comprising," "includes," and/or "including," when used in this specification, are intended to specify the presence of stated integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
These and other features of the present specification, as well as the operation and function of the elements of the structure related thereto, and the combination of parts and economies of manufacture, may be particularly improved upon in view of the following description. Reference is made to the accompanying drawings, all of which form a part of this specification. It is to be expressly understood, however, that the drawings are for the purpose of illustration and description only and are not intended as a definition of the limits of the specification. It should also be understood that the drawings are not drawn to scale.
The present specification provides a wearable device. The wearable device can be any form of intelligent electronic device that can be worn on a person's body. In some embodiments, the wearable device may be a garment, or an electronic device worn on the torso or a limb of the target subject, such as a smart bracelet or smart watch. In some embodiments, the wearable device may be a head-mounted device worn on the head of the target subject, such as a smart helmet or smart glasses. For convenience of illustration, this specification uses a head-mounted device as an example of the wearable device. The target object may be the wearer, which may be a human being.
Fig. 1 shows an overall structural schematic diagram of a wearable device 001 provided according to an embodiment of the present description; fig. 2 shows an exploded structure diagram of a wearable device 001 provided according to an embodiment of the present disclosure. As shown in fig. 1 and 2, the wearable device 001 may include a housing 100, a laser radar 200, a vision sensor 400, a control device 600, and a display device 800. In some embodiments, wearable device 001 may also include thermal imaging apparatus 900.
The housing 100 may be a mounting base or frame for the various components of the wearable device 001. The laser radar 200, the vision sensor 400, the control device 600, the display device 800, and the thermal imaging device 900 may be mounted on the housing 100 or inside the housing 100. The housing 100 may also be the wearable base of the wearable device 001. Wearable device 001 may be worn on a target site of a target subject, such as the head, through the housing 100. The material of the housing 100 may be any material, such as a metal material, a non-metal material, or a polymer material. The housing 100 may include rigid components or flexible components. The appearance of the housing 100 may be adapted to different wearing manners and wearing positions, which is not limited in this specification. Adjustable means may be included on the housing 100 to adjust its size and enhance the fit and comfort for different target subjects.
In some embodiments, the housing 100 may include a first housing 120 and a second housing 140. The first casing 120 may be a mounting base for the control device 600 and the display device 800, or may be a mounting base for the wearable device 001. Wearable device 001 may be worn on a target object through first housing 120 in operation. The first housing 120 may include a receiving cavity 122. The display device 800 and the control device 600 may be installed in the receiving cavity 122. In some embodiments, the first housing 120 may further have a data transmission interface 124 for connecting with the external electronic device 6 for data transmission. The data transmission interface 124 may be any form of data interface, such as a USB interface, a Type-C interface, and the like.
The second housing 140 may be fixedly connected with the first housing 120 to enclose the display device 800 and the control device 600 in the accommodating cavity 122. The fixed connection mode can be at least one of a threaded connection mode, a buckling connection mode, an adhesion mode, a welding mode, a riveting mode and the like. After the second housing 140 is connected to the first housing 120, the display device 800 and the control device 600 can be protected by sealing, so that the display device 800 and the control device 600 can be prevented from being damaged by the outside. The second housing 140 may include a mounting groove 142, a sensor protective lens 144, and a display device protective lens 146. The laser radar 200, the vision sensor 400, and the thermal imaging device 900 are mounted in the mounting groove 142 in the second housing 140. The sensor protection lens 144 may be located outside the mounting groove 142 to protect the laser radar 200, the vision sensor 400, and the thermal imaging device 900. The position of the display device protective lens 146 corresponds to the installation position of the display device 800 to protect the display device 800.
Fig. 3 shows a hardware structure diagram of a wearable device 001 provided according to an embodiment of the present disclosure.
As shown in fig. 1 to 3, the laser radar 200 may be mounted on the housing 100, specifically in the mounting groove 142. In operation, lidar 200 may measure position data of objects within the target field of view relative to lidar 200. The position data may be three-dimensional coordinate information. That is, lidar 200 may measure coordinate information of objects within the target field of view in the reference coordinate system of lidar 200. Lidar 200 may measure distance and angle by transmitting an electromagnetic wave signal, receiving the signal reflected back by an object, and calculating the time difference between transmission and reception, thereby determining the distance and angle of the object relative to lidar 200. In particular, lidar 200 may include a transmitting sensor and a receiving sensor. The transmitting sensor may transmit electromagnetic wave signals outward, specifically at a plurality of different emission angles. The receiving sensor may receive the electromagnetic wave signals reflected back by objects within the target field of view, specifically a plurality of reflected electromagnetic wave signals. Lidar 200 may determine the distance between lidar 200 and the object at the emission angle of the current electromagnetic wave signal based on the time difference between each electromagnetic wave signal and its corresponding reflected signal, and determine the three-dimensional coordinate information of that object relative to lidar 200 based on the emission angle of the current electromagnetic wave signal.
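For illustration only, the following minimal sketch (in Python, with hypothetical names; it is not part of the utility model) shows how a round-trip time difference and a known emission angle could be converted into three-dimensional coordinates relative to the lidar:

```python
import math

C = 299_792_458.0  # propagation speed of the electromagnetic wave, m/s

def point_from_echo(t_emit, t_receive, azimuth_rad, elevation_rad):
    """Convert one echo into (x, y, z) coordinates relative to the lidar origin.

    t_emit / t_receive : emission and reception timestamps in seconds.
    azimuth_rad / elevation_rad : emission angles of the beam in radians.
    """
    distance = C * (t_receive - t_emit) / 2.0  # round trip, so halve the path
    x = distance * math.cos(elevation_rad) * math.cos(azimuth_rad)
    y = distance * math.cos(elevation_rad) * math.sin(azimuth_rad)
    z = distance * math.sin(elevation_rad)
    return (x, y, z)

# Example: an echo received 200 ns after emission, beam at 10 degrees azimuth, 5 degrees elevation
print(point_from_echo(0.0, 200e-9, math.radians(10.0), math.radians(5.0)))
```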
The target field of view may be the sensing range of lidar 200 and may be set and changed according to usage requirements.
In some embodiments, lidar 200 may be used for three-dimensional spatial scanning to create a three-dimensional model of the target field of view. Specifically, lidar 200 may acquire three-dimensional coordinate information of objects at different locations within the target field of view relative to lidar 200, thereby acquiring the relative coordinate information between those objects. When the position and attitude of lidar 200 change, the computing device (for example, the control device 600) may select a fixed point in the target field of view as a reference point and calculate the three-dimensional coordinate information of objects at different positions relative to that reference point from the coordinate information acquired by lidar 200 at different times, so as to establish a three-dimensional model of the objects in the target field of view. Lidar 200 may acquire position data in real time, and the computing device may perform three-dimensional modeling in real time.
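Purely as an illustrative sketch of this idea (the pose values and helper names below are assumptions, not taken from this specification), points measured at different times could be expressed in a single fixed reference frame so that a consistent model of the target field of view accumulates:

```python
import numpy as np

def to_reference_frame(points_lidar, rotation, translation):
    """Transform an (N, 3) array of points from the lidar frame at one instant
    into a fixed reference frame anchored at the chosen reference point.

    rotation    : (3, 3) rotation matrix of the lidar pose in the reference frame.
    translation : (3,) position of the lidar origin in the reference frame.
    """
    return points_lidar @ rotation.T + translation

# Two hypothetical scans taken while the lidar pose changed slightly
scan_a = np.array([[1.0, 0.0, 0.0], [2.0, 1.0, 0.5]])
scan_b = np.array([[0.9, 0.1, 0.0], [1.8, 1.1, 0.5]])
theta = np.radians(5.0)  # lidar rotated 5 degrees about z and shifted 5 cm along x
rot_b = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])
poses = [(np.eye(3), np.zeros(3)), (rot_b, np.array([0.05, 0.0, 0.0]))]

model = np.vstack([to_reference_frame(scan, R, t)
                   for scan, (R, t) in zip([scan_a, scan_b], poses)])
print(model)  # points from both instants expressed in the same reference frame
```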
As shown in fig. 1 to 3, the vision sensor 400 may be mounted on the housing 100, specifically in the mounting groove 142. In operation, vision sensor 400 may acquire image data within the target field of view. The vision sensor 400 may be a camera, such as an RGB camera or an infrared camera. The vision sensor 400 may capture images within the target field of view. The sensing range of vision sensor 400 may be close to that of lidar 200. In some embodiments, the sensing range of the vision sensor 400 may include the target field of view. In some embodiments, the target field of view may include the sensing range of the vision sensor 400.
As shown in fig. 1 to 3, the thermal imaging device 900 may be mounted on the housing 100, specifically in the mounting groove 142. In operation, thermal imaging device 900 may acquire thermal image data within the target field of view. The thermal imaging device 900 may include an infrared detector and an optical imaging objective; the objective receives the infrared radiation energy distribution pattern of the measured target and projects it onto the photosensitive element of the infrared detector to obtain thermal image data. The thermal image data may be an image that shows the different temperatures of the object by different colors. The sensing range of thermal imaging device 900 may be close to that of lidar 200. In some embodiments, the sensing range of thermal imaging device 900 may include the target field of view. In some embodiments, the target field of view may include the sensing range of the thermal imaging device 900.
The laser radar 200, the vision sensor 400, and the thermal imaging device 900 are mounted in a preset positional relationship. To keep their sensing ranges as consistent as possible and to reduce the parallax error among the three, thereby keeping the error rate as low as possible, the distance between any two of them should be as small as possible and should not exceed a preset value. The preset value can be obtained experimentally or through machine learning.
The laser radar 200, the vision sensor 400, and the thermal imaging device 900 may be arranged in a predetermined manner. For example, the laser radar 200, the vision sensor 400, and the thermal imaging device 900 may be linearly arranged, that is, arranged in a column, may be horizontally linearly arranged, may be vertically linearly arranged, may be linearly arranged in any direction, and the like. In some embodiments, lidar 200 may be located in the middle, with vision sensor 400 and thermal imaging device 900 located on either side of lidar 200. In some embodiments, vision sensor 400 may be located in the middle, with lidar 200 and thermal imaging device 900 located on either side of vision sensor 400. In some embodiments, thermal imaging device 900 may be located in the middle, with laser radar 200 and vision sensor 400 located on either side of thermal imaging device 900. In some embodiments, the laser radar 200, the vision sensor 400, and the thermal imaging device 900 may be arranged in a rectangular shape, for example, the three may be divided into two rows, wherein one row includes one of the three, and the other row includes the other two of the three. In some embodiments, lidar 200 is in the first row and vision sensor 400 and thermal imaging device 900 are juxtaposed below lidar 200. In some embodiments, vision sensor 400 is located in the first row and lidar 200 and thermal imaging device 900 are located side-by-side below vision sensor 400.
As shown in fig. 1 to 3, the control device 600 may be mounted on the housing 100, specifically in the accommodating cavity 122 of the first housing 120. The control device 600 may be communicatively connected to the laser radar 200, the vision sensor 400, and the thermal imaging device 900 to acquire position data, image data, and thermal image data in real time. In some embodiments, the control device 600 may superimpose the position data and the image data in real time to generate the stereoscopic image. The stereoscopic image may include the data obtained after the position data and the image data are superimposed. In some embodiments, the control device 600 may further superimpose the thermal image data, the position data, and the image data in real time to obtain a stereoscopic image in real time. In this case, the stereoscopic image may include image data obtained by superimposing the thermal image data, the position data, and the image data. In some embodiments, the control device 600 may further be communicatively connected to the display device 800 and control the display device 800 to display the stereoscopic image. That is, the control device 600 may superimpose the position data, the image data, and the thermal image data in real time to acquire a stereoscopic image or a stereoscopic temperature distribution image.
The control apparatus 600 may include a hardware device having a data information processing function and a program required to drive the hardware device to operate. Of course, the control device 600 may be only a hardware device having a data processing capability, or only a program running in a hardware device. In some embodiments, the control device 600 may include a circuit board. The control apparatus 600 may perform the method of the data superposition processing. The method of data overlay processing will be described elsewhere in this specification. As shown in fig. 3, the control apparatus 600 may include at least one storage medium 630 and at least one central processor 620. In some embodiments, the control device 600 may also include a communication module 650 and an internal communication bus 610. Meanwhile, the control device 600 may further include a peripheral control module 640, a power control module 680, and a display control module 690.
The internal communication bus 610 may connect various system components including the storage medium 630, the central processor 620, the communication module 650, and the peripheral control module 640, the power control module 680, and the display control module 690. The internal communication bus 610 may also connect the laser radar 200, the vision sensor 400, the thermal imaging device 900, and the display device 800.
The peripheral control module 640 supports input/output between the control device 600 and other components.
The communication module 650 is used for data communication between the control device 600 and the outside; for example, the communication module 650 may be used for data transmission between the control device 600 and the external electronic device 6. The communication module 650 may be a wired communication module or a wireless communication module. The wired communication module may be electrically connected to the data transmission interface 124. Communication methods adopted by the wired communication module include, but are not limited to, Mobile High-Definition Link (MHL), Universal Serial Bus (USB), High Definition Multimedia Interface (HDMI), and the like. Communication methods adopted by the wireless communication module include, but are not limited to, wireless fidelity (WiFi), Bluetooth, Bluetooth Low Energy, communication technologies based on IEEE 802.11s, and the like. In some embodiments, a positioning module, such as a GPS module, may also be included in communication module 650 to locate wearable device 001. In operation, the communication module 650 may be communicatively connected to the external electronic device 6 to acquire virtual model data from the external electronic device 6 or to transmit stereoscopic images to the external electronic device 6. The control device 600 can receive the virtual model data, superimpose the virtual model data on the stereoscopic image, and display the result of the superimposition on the display device 800. The external electronic device 6 may be any intelligent electronic device, such as a smartphone, computer, tablet, or smart band.
The storage medium 630 may include a data storage device. The data storage device may be a non-transitory storage medium or a transitory storage medium. For example, the data storage device may include one or more of a disk 632, a read-only memory medium (ROM) 634, or a random access memory medium (RAM) 636. The storage medium 630 also includes at least one instruction set stored in the data storage device. The instructions are computer program code, which may include programs, routines, objects, components, data structures, procedures, modules, and the like that perform the method of data overlay processing provided herein.
The at least one central processor 620 may be communicatively coupled to the at least one storage medium 630 and the communication module 650 via an internal communication bus 610. The at least one central processing unit 620 is configured to execute the at least one instruction set. When the control apparatus 600 operates, the at least one central processor 620 reads the at least one instruction set and executes the method of data overlay processing provided herein according to the instruction of the at least one instruction set. The central processor 620 may perform all the steps included in the data superposition processing method. The central processor 620 may be in the form of one or more processors, and in some embodiments, the central processor 620 may include one or more hardware processors, such as microcontrollers, microprocessors, Reduced Instruction Set Computers (RISC), Application Specific Integrated Circuits (ASICs), application specific instruction set processors (ASIPs), Central Processing Units (CPUs), Graphics Processing Units (GPUs), Physical Processing Units (PPUs), microcontroller units, Digital Signal Processors (DSPs), Field Programmable Gate Arrays (FPGAs), Advanced RISC Machines (ARMs), Programmable Logic Devices (PLDs), any circuit or processor capable of executing one or more functions, and the like, or any combination thereof. For illustrative purposes only, only one central processor 620 is depicted in the control device 600 in this specification. It should be noted, however, that the control device 600 may also include multiple processors, and thus, the operations and/or method steps disclosed in this specification may be performed by one processor as described in this specification, or may be performed by a combination of multiple processors. For example, if the central processor 620 of the control apparatus 600 performs steps a and B in this specification, it should be understood that steps a and B may also be performed by two different central processors 620 in combination or separately (e.g., a first processor performs step a, a second processor performs step B, or both a first and second processor perform steps a and B together).
Power control module 680 may be used to control battery management of wearable device 001.
The display control module 690 may be used to control the display device 800 to display.
The display device 800 may be mounted on the housing 100, and particularly, may be mounted in the receiving cavity 122 of the first housing 120. The display device 800 may be communicatively connected to the control device 600, and receive and display a stereoscopic image or a stereoscopic temperature distribution image. When the wearable device 001 is worn on the head of the target subject, the display apparatus 800 may be positioned directly in front of the eyes of the target subject. The display device 800 may include one or more displays.
Fig. 4 shows a flowchart of a method P100 for data overlay processing provided according to an embodiment of the present description. Control apparatus 600 may perform method P100. Specifically, the central processor 620 may perform the method P100. As shown in fig. 4, method P100 may include:
s110: the control device 600 receives image data and position data.
Specifically, the control device 600 may acquire image data from the vision sensor 400 in real time. Control device 600 may obtain position data from lidar 200 in real time.
In some embodiments, method P100 may further include:
s120: the control device 600 receives thermal image data.
The control device 600 may acquire thermal image data from the thermal imaging device 900 in real time.
Method P100 may further include:
s140: the control device 600 performs superimposition processing on the image data and the position data in real time, or performs superimposition processing on the thermal image data, the image data, and the position data in real time, to generate a stereoscopic image.
In some embodiments, the control device 600 may use time stamps to select image data and position data acquired at the same moment for superimposition. As previously described, the position data includes three-dimensional coordinate information of objects within the target field of view relative to lidar 200. The control device 600 may store in advance the relative positional relationship among the laser radar 200, the vision sensor 400, and the thermal imaging device 900. The control device 600 may calculate the three-dimensional coordinate information of an object in the target field of view relative to the vision sensor 400 based on the relative positional relationship between the laser radar 200 and the vision sensor 400 and the position data. The control device 600 may further calculate the depth information corresponding to each pixel point, that is, the distance between the object at each pixel point and the vision sensor 400, based on the three-dimensional coordinate information of the object relative to the vision sensor 400 and the coordinates of each pixel point in the image data, so as to obtain a stereoscopic image. The stereoscopic image includes at least the image data obtained by superimposing the image data and the position data.
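One possible way to realize this superimposition, sketched here with assumed pinhole camera intrinsics and an assumed lidar-to-camera transform (none of these values come from this specification), is to project the lidar points into the image and record a depth value for the pixels they hit:

```python
import numpy as np

def depth_image_from_lidar(points_lidar, R_cl, t_cl, fx, fy, cx, cy, width, height):
    """Project lidar points into the camera and build a sparse depth image.

    points_lidar : (N, 3) points in the lidar frame.
    R_cl, t_cl   : rotation (3, 3) and translation (3,) mapping lidar frame to camera frame.
    fx, fy, cx, cy : pinhole camera intrinsics in pixels.
    """
    depth = np.zeros((height, width), dtype=np.float32)   # 0 means "no measurement"
    pts_cam = points_lidar @ R_cl.T + t_cl                 # lidar -> camera frame
    for X, Y, Z in pts_cam:
        if Z <= 0:                                         # behind the camera
            continue
        u = int(round(fx * X / Z + cx))
        v = int(round(fy * Y / Z + cy))
        if 0 <= u < width and 0 <= v < height:
            # keep the nearest point if several fall on the same pixel
            if depth[v, u] == 0 or Z < depth[v, u]:
                depth[v, u] = Z
    return depth

# Toy example: identity extrinsics, a 640x480 camera, and two lidar points
pts = np.array([[0.2, 0.1, 2.0], [-0.3, 0.0, 4.0]])
d = depth_image_from_lidar(pts, np.eye(3), np.zeros(3), 500, 500, 320, 240, 640, 480)
print(np.argwhere(d > 0), d[d > 0])
```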
In some embodiments, the control device 600 may identify the temperature of objects in the thermal image based on the thermal image data. The temperature indication may be a numerical value or different colors representing different temperatures. Based on the time stamps, the control device 600 may select thermal image data, image data, and position data captured at the same moment, perform superimposition processing, and generate a stereoscopic image. In this case, the stereoscopic image may include image data obtained by superimposing the thermal image data, the position data, and the image data.
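As a rough illustration of the timestamp-based selection and a color-coded temperature overlay (the threshold temperatures, color scheme, and function names are assumptions for this sketch only):

```python
import numpy as np

def nearest_by_timestamp(target_ts, frames):
    """frames: list of (timestamp, array) pairs; return the frame closest in time."""
    return min(frames, key=lambda f: abs(f[0] - target_ts))[1]

def overlay_thermal(rgb, thermal, alpha=0.4, t_min=20.0, t_max=40.0):
    """Blend a temperature map (same height/width as rgb, degrees C) onto an RGB image.
    Hotter pixels are pushed toward red; this is only one possible color coding."""
    norm = np.clip((thermal - t_min) / (t_max - t_min), 0.0, 1.0)
    heat = np.zeros_like(rgb, dtype=np.float32)
    heat[..., 0] = norm * 255.0          # red channel encodes temperature
    heat[..., 2] = (1.0 - norm) * 255.0  # blue for cooler areas
    return ((1 - alpha) * rgb + alpha * heat).astype(np.uint8)

# Toy frames keyed by timestamps in seconds
rgb_frames = [(0.00, np.zeros((4, 4, 3), dtype=np.uint8)),
              (0.03, np.full((4, 4, 3), 128, dtype=np.uint8))]
thermal_frames = [(0.01, np.full((4, 4), 36.5)), (0.04, np.full((4, 4), 22.0))]
rgb = nearest_by_timestamp(0.03, rgb_frames)
thermal = nearest_by_timestamp(0.03, thermal_frames)
print(overlay_thermal(rgb, thermal)[0, 0])
```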
In some embodiments, the stereoscopic image may include image data with depth information, i.e., a depth image. In some embodiments, the stereoscopic image may include three-dimensional model data of the target field of view. As previously described, lidar 200 may be used for three-dimensional space scanning. The control device 600 may determine the relative positional relationship of objects within the target field of view based on the position data to construct a three-dimensional model of the target field of view. The control device 600 may also superimpose the image data on the three-dimensional model to obtain a stereoscopic image, i.e., an image containing the three-dimensional model of the target field of view.
In some embodiments, step S140 may further include: the control device 600 receives the virtual model transmitted from the external electronic device 6, and superimposes the virtual model on the stereoscopic image. The control apparatus 600 may receive the virtual model transmitted from the external electronic device 6 through the communication module. The control device 600 may compare and analyze the virtual model and the stereoscopic image to obtain a reference point, perform an operation on the virtual model to match the virtual model with the stereoscopic image, and superimpose the virtual model and the stereoscopic image with the reference point as a reference to obtain a virtual-real combined stereoscopic image. At this time, the stereoscopic image may include a virtual-real combined stereoscopic image obtained by superimposing the virtual model and the stereoscopic image.
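A much-simplified illustration of aligning a virtual model to the stereoscopic image using a single reference point (real alignment would typically also need rotation; the names and values below are hypothetical):

```python
import numpy as np

def align_virtual_model(model_points, model_ref, scene_ref, scale=1.0):
    """Shift (and optionally scale) the virtual model so that its reference point
    coincides with the corresponding reference point found in the stereoscopic image."""
    return (model_points - model_ref) * scale + scene_ref

# Toy example: corner points of a 1 m cube, aligned to a reference point in the scene
cube = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
aligned = align_virtual_model(cube, model_ref=np.zeros(3), scene_ref=np.array([2.0, 0.5, 1.2]))
print(aligned)
```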
In some embodiments, method P100 may further include:
s160: control device 600 determines whether the calculation result of step S140 is correct.
In some embodiments, method P100 may further include:
s170: the control device 600 determines that the calculation result is correct, and controls the display device 800 to display the stereoscopic image.
The control device 600 may control the display device 800 to display the stereoscopic image through the display control module 690.
In some embodiments, method P100 may further include:
s180: the control device 600 determines that the calculation result is correct, and transmits the stereoscopic image to the external electronic device 6 through the communication module 650.
In some embodiments, method P100 may further include:
s190: the control device 600 determines that the operation result is incorrect and re-executes step S140.
The wearable device 001 provided by this specification can be applied to disaster relief scenarios. The environment of a disaster site is typically chaotic, making it very difficult for rescuers to search for and rescue survivors, and rescuers can hardly judge whether people at the disaster site are still alive. In addition, if a building stood at the site before the disaster, the actual building at the disaster site may now be damaged, which further increases the rescue difficulty. In this case, rescuers can wear the wearable device 001 provided by this specification to obtain the position data of objects at the disaster site in real time through the laser radar 200 and thus calculate the relative distances between those objects. The wearable device 001 can also perform real-time multi-directional measurement through the laser radar 200, construct a three-dimensional model of the disaster site from the measurement results, and combine the three-dimensional model with the image data acquired by the vision sensor 400 to generate a stereoscopic image of the disaster site, improving the matching degree between the three-dimensional model and the actual scene. The wearable device 001 can also superimpose and compare a pre-disaster three-dimensional virtual model of the building with the scanned stereoscopic image of the disaster site, helping rescuers understand the differences in the building structure before and after the disaster so that they can find a better rescue route to carry out the disaster relief task. Since survivors at the disaster site may be buried in buildings, it is difficult for rescuers to identify them with the naked eye. The wearable device 001 can also superimpose the thermal image generated by the thermal imaging device 900 onto the stereoscopic image, obtain the temperature distribution of all objects within the visible range of the disaster site, and help rescuers quickly and effectively identify survivors and their positions. When the wearable device 001 provided by this specification is applied to a disaster rescue scenario, it can help rescuers quickly establish a stereoscopic image of the disaster site, effectively and quickly plan a rescue route, quickly identify the positions of survivors, and effectively improve rescue efficiency. The wearable device 001 can also transmit the stereoscopic image of the disaster site to an external command center or control center through a wireless transmission module, cooperating with the remote command center or control center and with multiple personnel at different positions on the disaster site to assist rescuers in identifying and rescuing survivors, improving the survival rate of survivors and avoiding rescuers getting into difficulty at the disaster site.
Fig. 5 is a schematic diagram of a screen 801 of the display device 800 when the wearable device 001 provided in an embodiment of this specification is applied to disaster relief. The rescuer learns about the disaster site by viewing the screen 801 displayed on the display device 800. The image displayed on the screen 801 is a stereoscopic image of the actual disaster site. Operation buttons of the operating system are displayed at the bottom of the screen 801. The rescuer can operate the system through the operation buttons.
Fig. 6 is a schematic diagram of a screen 802 of the display device 800 when the wearable device 001 provided in an embodiment of this specification is applied to disaster relief. The rescuer learns about the disaster site by viewing the screen 802 displayed on the display device 800. The image displayed on the screen 802 is a virtual-real superimposed image obtained by superimposing a stereoscopic image of the actual disaster site and the three-dimensional model 1 of the original building at the site, which effectively helps rescuers understand the differences before and after the disaster and find the optimal rescue route. Operation buttons of the operating system are displayed at the bottom of the screen 802. The rescuer can operate the system through the operation buttons.
Fig. 7 is a schematic diagram of a screen 803 of the display device 800 when the wearable device 001 provided in an embodiment of this specification is applied to disaster relief. The rescuer learns about the disaster site by viewing the screen 803 displayed on the display device 800. The image displayed on the screen 803 is obtained by superimposing a stereoscopic image of the actual disaster site with thermal image data; it effectively helps rescuers understand the temperature distribution of the disaster site and quickly recognize, from that distribution, that there may be a survivor 2 at the position indicated by the human-shaped pattern. Operation buttons of the operating system are displayed at the bottom of the screen 803. The rescuer can operate the system through the operation buttons.
The wearable device 001 provided by this specification can be applied to real-time three-dimensional modeling scenarios. In a conventional wearable device, a three-dimensional model must be built in advance, either by scanning the object with a scanner beforehand or on site, or by modeling it on a computer according to the object's proportions. When the actual object or scene changes, the pre-built three-dimensional model must be modified again; this approach is time-consuming and cannot achieve synchronous, real-time updates. The wearable device 001 provided by this specification can acquire the position data of on-site objects in real time through the laser radar 200 and the vision sensor 400, generate a stereoscopic image, and establish a three-dimensional model of the on-site objects in real time. In addition, thermal imaging device 900 can acquire the temperature distribution of on-site objects. By superimposing the data acquired in real time by the laser radar 200, the vision sensor 400, and the thermal imaging device 900, a three-dimensional model of the on-site objects can be generated in real time and updated synchronously as the scene changes, eliminating the need to build the three-dimensional model in advance and removing the limitation imposed by changes in the actual object or scene. Wearable device 001 can also superimpose and compare the three-dimensional model established in real time with an earlier three-dimensional model to help the wearer quickly identify the differences and make reasonable judgments.
Fig. 8 shows a schematic view of a scene in which the wearable device 001 provided according to an embodiment of this specification is applied to real-time scanning of a three-dimensional object. Wearable device 001 can scan the object 3 through the laser radar 200, perform calculations through the control device 600 to establish a three-dimensional model of the object 3, and then display it through the display device 800.
Fig. 9 shows a schematic diagram of a screen 804 of the display device 800 when the wearable device 001 provided according to the embodiment of the present disclosure is applied to real-time scanning of a three-dimensional object. The wearer can compare the actual image 4 of the object 3 with the three-dimensional model 5 of the object 3 by viewing the screen 804 displayed on the display device 800. The wearable device 001 may also display the actual image 4 of the object 3 superimposed with the three-dimensional model 5 of the object 3 for comparison. Operation buttons of the operating system are displayed on the bottom of the screen 804. The wearer can operate the system by operating the buttons.
Fig. 10 shows a schematic view of a remote transmission scenario of a wearable device 001 provided in accordance with an embodiment of the present specification. As previously described, the wearable device 001 may transmit the generated stereoscopic image to the external electronic device 6 through the communication module 650 for instant communication. The communication module 650 may be a wireless communication module such as a Wi-Fi module, a bluetooth module, and the like. In some embodiments, a positioning module, such as a GPS module, may also be included in the communication module 650 to generate location information for the wearable device 001 and transmit the location information to the external electronic device 6.
The wearable device 001 provided by this specification can be applied to gesture recognition scenarios. Gesture recognition in conventional wearable devices mostly uses an RGB camera as the acquisition component. However, an RGB camera is easily affected by the light intensity of the scene when performing gesture recognition, which can lead to misjudgment or failure. When the scene light is insufficient for the RGB camera to image properly, the captured image may be too dark for the gesture to be recognized correctly. The wearable device 001 provided by this specification overcomes this lighting limitation and improves the accuracy of gesture recognition through the laser radar 200: the laser radar 200 acquires position data of the hand, from which the dynamic three-dimensional posture of the gesture is determined and the dynamic gesture of the hand is acquired; the thermal imaging device 900 measures the hand temperature and draws the contour curve of the hand from its temperature distribution, thereby identifying the hand in the picture; and superimposing the position data, the thermal image data, and the image data makes it possible to quickly recognize the real-time hand gesture as well as fine finger movements, while improving gesture recognition precision. The wearable device 001 provided by this specification can therefore be applied to gesture recognition of fine hand movements, such as tapping a virtual keyboard with the fingers, adjusting the volume with fine finger motions, or controlling related function settings in the system with fine gestures.
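Purely as an illustration of isolating the hand by its temperature (the skin-temperature thresholds and the array-based bounding box below are assumptions for this sketch, not part of the utility model):

```python
import numpy as np

def hand_mask(thermal, t_low=30.0, t_high=37.5):
    """Mark pixels whose temperature falls in a typical skin range (degrees C)."""
    return (thermal >= t_low) & (thermal <= t_high)

def bounding_box(mask):
    """Return (top, left, bottom, right) of the masked region, or None if empty."""
    rows = np.any(mask, axis=1)
    cols = np.any(mask, axis=0)
    if not rows.any():
        return None
    top, bottom = np.where(rows)[0][[0, -1]]
    left, right = np.where(cols)[0][[0, -1]]
    return int(top), int(left), int(bottom), int(right)

# Toy thermal frame: a warm "hand" patch on a cool background
frame = np.full((6, 8), 22.0)
frame[2:5, 3:6] = 34.0
print(bounding_box(hand_mask(frame)))  # -> (2, 3, 4, 5)
```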
Fig. 11 is a schematic diagram illustrating a screen 805 of a display device 800 when a wearable device 001 provided in an embodiment of the present specification is applied to gesture recognition. Fig. 12 is a schematic diagram illustrating a screen 806 of the display device 800 when another wearable device 001 provided in an embodiment of the present specification is applied to gesture recognition. The wearer can view images of the virtual scene superimposed with the real scene through screen 805 and screen 806. The wearable device 001 can accurately acquire detailed motions and dynamic postures of the hand through the laser radar 200, the vision sensor 400 and the thermal imaging device 900, so as to perform accurate gesture recognition. Operation buttons of the operating system are displayed on the bottom of the screens 805 and 806. The wearer can operate the system by operating the buttons.
The wearable device 001 provided by this specification can be applied to skiing scenarios. Skiing carries a considerable degree of danger. Especially at night, the lighting is not strong enough to illuminate the entire area, so the ski field has dimly lit blind spots that limit the human eye's ability to recognize and judge the scene, which increases the danger. Furthermore, since the ski slope is white, a descending skier may fail to notice another skier wearing white clothing, which can cause accidents. When a skier wears the wearable device 001 provided by this specification (taking smart glasses as an example), the wearable device 001 can acquire a stereoscopic image of the area in front of the wearer in real time through the laser radar 200, the vision sensor 400, and the thermal imaging device 900, detect objects or people ahead in real time, and display them to the wearer through the display device 800, helping the wearer better judge the scene ahead, make correct judgments in time, and ski more safely.
Fig. 13 is a schematic diagram of a screen 807 of the display device 800 when the wearable device 001 provided in an embodiment of this specification is applied to skiing. A stereoscopic image of the area in front of the wearer may be displayed on the screen 807 to determine whether there is an obstacle ahead and the distance between the wearer and the obstacle. The screen 807 may also display thermal image data of the area in front of the wearer to show the temperature distribution, so as to determine whether a person, such as another skier, is present ahead and the distance between the wearer and that skier. By viewing the screen 807, the wearer can spot obstacles or other skiers in advance and take reasonable action, avoiding accidents and improving skiing safety. Operation buttons of the operating system are displayed at the bottom of the screen 807. The wearer can operate the system through the operation buttons.
To sum up, the wearable device 001 provided in the present specification measures position data of an object in the target field of view in real time by the laser radar 200 provided thereon, thereby acquiring a relative position and a distance of the object in the target field of view to generate three-dimensional model data of the target field of view. Meanwhile, the wearable device 001 may further superimpose the image data acquired by the vision sensor 400 and the position data acquired by the laser radar 200 to generate a stereoscopic image and display the stereoscopic image through the display device 800, so as to help the target object (wearer) to more rapidly identify the position of the object and the distance to the wearer. The wearable device 001 may further superimpose the thermal image data collected by the thermal imaging apparatus 900 with the stereoscopic image to help the wearer identify whether or not the human body is present and the position of the human body. The wearable device 001 provided by the specification can establish a three-dimensional model of a scene in real time and can overlap the three-dimensional model with an actual scene, so that the wearable device 001 can be applied to more scenes, such as a disaster relief scene, a three-dimensional modeling scene, a gesture recognition scene, a skiing scene and the like.
The foregoing description has been directed to specific embodiments of this disclosure. Other embodiments are within the scope of the following claims. In some cases, the actions or steps recited in the claims may be performed in a different order than in the embodiments and still achieve desirable results. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
In conclusion, upon reading the present detailed disclosure, those skilled in the art will appreciate that the foregoing detailed disclosure is presented by way of example only, and not limitation. Although not explicitly illustrated herein, those skilled in the art will appreciate that the present description encompasses numerous reasonable variations, improvements, and modifications to the embodiments. Such alterations, improvements, and modifications are intended to be suggested by this specification, and are within the spirit and scope of the exemplary embodiments of this specification.
Furthermore, certain terminology has been used in this specification to describe embodiments of the specification. For example, "one embodiment," "an embodiment," and/or "some embodiments" means that a particular feature, structure, or characteristic described in connection with the embodiment can be included in at least one embodiment of the specification. Therefore, it is emphasized and should be appreciated that two or more references to "an embodiment" or "one embodiment" or "an alternative embodiment" in various portions of this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined as suitable in one or more embodiments of the specification.
It should be appreciated that in the foregoing description of embodiments of this specification, various features are sometimes grouped together in a single embodiment, figure, or description thereof for the purpose of streamlining the specification and aiding in the understanding of one feature. This is not to be taken as meaning that those features must be used in combination; one skilled in the art, upon reading this specification, may well extract some of the features as separate embodiments. That is, embodiments in this specification may also be understood as an integration of a plurality of sub-embodiments, and each sub-embodiment remains valid with fewer than all features of a single foregoing disclosed embodiment.
Each patent, patent application, publication of a patent application, and other material, such as articles, books, descriptions, publications, documents, and the like, cited herein is hereby incorporated by reference, except for any prosecution history associated therewith, any such material that is inconsistent with or in conflict with this document, and any such material that may have a limiting effect on the broadest scope of the claims now or later associated with this document. For example, if there is any inconsistency or conflict between the description, definition, and/or use of a term associated with any of the incorporated material and that associated with this document, the term in this document controls.
Finally, it should be understood that the embodiments disclosed herein are illustrative of the principles of the embodiments of this specification, and other modified embodiments are also within its scope. Accordingly, the disclosed embodiments are to be considered in all respects as illustrative and not restrictive. Those skilled in the art may implement the application in this specification in alternative configurations according to the embodiments herein; therefore, the embodiments of this specification are not limited to those precisely described in the application.

Claims (10)

1. A wearable device, comprising:
a housing;
a laser radar mounted on the housing and configured to measure, during operation, position data of an object within a target field of view relative to the laser radar, the position data comprising three-dimensional coordinate information;
a vision sensor mounted on the housing and configured to collect image data within the target field of view during operation;
a control device mounted on the housing and in communication connection with the laser radar and the vision sensor, wherein the control device acquires the position data and the image data in real time and superimposes the position data and the image data in real time to generate a stereoscopic image; and
a display device mounted on the housing and in communication connection with the control device, wherein the display device receives and displays the stereoscopic image.
2. The wearable device of claim 1, wherein the laser radar is operable to transmit electromagnetic wave signals to the target field of view at a plurality of different angles and to receive a plurality of reflected electromagnetic wave signals reflected back by objects within the target field of view, so as to obtain three-dimensional coordinate information of the objects within the target field of view relative to the laser radar.
3. The wearable device of claim 1, wherein the wearable device is a head-mounted device configured to be worn on a head of a target object, and the display device is positioned in front of the eyes of the target object.
4. The wearable device of claim 1, wherein the stereoscopic image comprises image data having depth information.
5. The wearable device of claim 1, wherein the stereoscopic image comprises three-dimensional model data of the target field of view.
6. The wearable device of claim 1, further comprising:
a thermal imaging device mounted on the housing and in communication connection with the control device, wherein the thermal imaging device collects thermal image data within the target field of view during operation.
7. The wearable device of claim 6, wherein the control device receives the thermal image data in real time and superimposes the thermal image data, the position data, and the image data in real time to obtain the stereoscopic image, the stereoscopic image comprising the superimposed thermal image data, position data, and image data.
8. The wearable device of claim 6, wherein the laser radar, the vision sensor, and the thermal imaging device are mounted in a predetermined positional relationship and at a distance from each other that does not exceed a predetermined value.
9. The wearable device of claim 6, wherein the housing comprises:
a first housing, wherein the wearable device is worn on a target object through the first housing during operation, the first housing comprises an accommodating cavity, and the display device and the control device are mounted in the accommodating cavity; and
a second housing, wherein the laser radar, the vision sensor, and the thermal imaging device are mounted on the second housing, and the second housing is fixedly connected with the first housing to seal the display device and the control device in the accommodating cavity.
10. The wearable device of claim 1, wherein the control device is further configured to receive virtual model data, superimpose the virtual model data with the stereoscopic image to generate a display image, and display the display image through the display device.
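Purely as an illustration of the superposition recited in claims 7 and 10 above, and not as part of the claimed subject matter, the sketch below composes the camera image, an already-registered thermal layer, and an optional virtual-model layer into a single display frame. The registration step, the variable names, and the alpha-blending scheme are assumptions of this example.

```python
import numpy as np

def compose_display_frame(image, depth, thermal=None, virtual_layer=None, alpha=0.4):
    """Blend the camera image with optional thermal and virtual-model layers.

    image         : (H, W, 3) uint8 RGB frame from the vision sensor.
    depth         : (H, W) per-pixel distance derived from the laser-radar position data.
    thermal       : optional (H, W) thermal intensity map, assumed registered to the image.
    virtual_layer : optional (H, W, 4) RGBA rendering of virtual model data.
    alpha         : blending weight for the thermal overlay.
    """
    frame = image.astype(np.float32)

    if thermal is not None:
        # Map thermal intensity to a red tint and alpha-blend it over the image,
        # so warmer regions (e.g. a human body) stand out.
        t = (thermal - thermal.min()) / (np.ptp(thermal) + 1e-6)
        tint = np.zeros_like(frame)
        tint[..., 0] = 255.0 * t
        frame = (1 - alpha) * frame + alpha * tint

    if virtual_layer is not None:
        # Composite the virtual model layer using its own alpha channel.
        a = virtual_layer[..., 3:4].astype(np.float32) / 255.0
        frame = (1 - a) * frame + a * virtual_layer[..., :3].astype(np.float32)

    # Return the composed frame together with the depth layer so the display
    # device can annotate objects with their measured distances.
    return frame.astype(np.uint8), depth
```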
CN202121981198.8U 2021-08-20 2021-08-20 Wearable device Active CN215642483U (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202121981198.8U CN215642483U (en) 2021-08-20 2021-08-20 Wearable device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202121981198.8U CN215642483U (en) 2021-08-20 2021-08-20 Wearable device

Publications (1)

Publication Number Publication Date
CN215642483U true CN215642483U (en) 2022-01-25

Family

ID=79900541

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202121981198.8U Active CN215642483U (en) 2021-08-20 2021-08-20 Wearable device

Country Status (1)

Country Link
CN (1) CN215642483U (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2024166580A1 (en) * 2023-02-08 2024-08-15 Necプラットフォームズ株式会社 Situation ascertaining assistance device, situation ascertaining assistance method, and computer-readable storage medium


Similar Documents

Publication Publication Date Title
KR102541812B1 (en) Augmented reality within a field of view that includes a mirror image
CN106643699B (en) Space positioning device and positioning method in virtual reality system
US9563981B2 (en) Information processing apparatus, information processing method, and program
US11277597B1 (en) Marker-based guided AR experience
EP3422153A1 (en) System and method for selective scanning on a binocular augmented reality device
CN102591449B (en) The fusion of the low latency of virtual content and real content
CN107635129B (en) Three-dimensional trinocular camera device and depth fusion method
EP3371779B1 (en) Systems and methods for forming models of three dimensional objects
EP4182889B1 (en) Using 6dof pose information to align images from separated cameras
EP2879098A1 (en) Three-dimensional environment sharing system, and three-dimensional environment sharing method
US20180053055A1 (en) Integrating augmented reality content and thermal imagery
CN110895676B (en) dynamic object tracking
CN107509043B (en) Image processing method, image processing apparatus, electronic apparatus, and computer-readable storage medium
CN104089606A (en) Free space eye tracking measurement method
CN105809654A (en) Target object tracking method and device, and stereo display equipment and method
CN215642483U (en) Wearable device
US20150120461A1 (en) Information processing system
WO2021066970A1 (en) Multi-dimensional rendering
CN114545629A (en) Augmented reality device, information display method and device
CN204258990U (en) Intelligence head-wearing display device
CN115482359A (en) Method for measuring size of object, electronic device and medium thereof
US20180122145A1 (en) Display apparatus, display system, and control method for display apparatus
CN105138130A (en) Information communication instructing method and system in same scene at different places
CA3138269A1 (en) A system and method for localisation using footprints
EP4372527A1 (en) Electronic device and method for anchoring augmented reality object

Legal Events

Date Code Title Description
GR01 Patent grant