CN108427479B - Wearable device, environment image data processing system, method and readable medium - Google Patents

Wearable device, environment image data processing system, method and readable medium

Info

Publication number
CN108427479B
CN108427479B (application CN201810149313.1A)
Authority
CN
China
Prior art keywords
data
wearable device
image data
interface
environment image
Prior art date
Legal status
Active
Application number
CN201810149313.1A
Other languages
Chinese (zh)
Other versions
CN108427479A (en)
Inventor
顾照鹏
肖泽东
郑远力
陈宗豪
Current Assignee
Tencent Technology Shenzhen Co Ltd
Original Assignee
Tencent Technology Shenzhen Co Ltd
Priority date
Filing date
Publication date
Application filed by Tencent Technology Shenzhen Co Ltd
Priority to CN201810149313.1A
Publication of CN108427479A
Application granted
Publication of CN108427479B
Legal status: Active
Anticipated expiration

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 1/00 - Details not covered by groups G06F 3/00 - G06F 13/00 and G06F 21/00
    • G06F 1/16 - Constructional details or arrangements
    • G06F 1/1613 - Constructional details or arrangements for portable computers
    • G06F 1/163 - Wearable computers, e.g. on a belt
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 13/00 - Interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 13/10 - Program control for peripheral devices
    • G06F 13/102 - Program control for peripheral devices where the programme performs an interfacing function, e.g. device driver
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 - Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2203/00 - Indexing scheme relating to G06F 3/00 - G06F 3/048
    • G06F 2203/01 - Indexing scheme relating to G06F 3/01
    • G06F 2203/012 - Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 2213/00 - Indexing scheme relating to interconnection of, or transfer of information or other signals between, memories, input/output devices or central processing units
    • G06F 2213/0024 - Peripheral component interconnect [PCI]

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The embodiment of the invention discloses a wearable device, a system, a method and a readable medium for processing environment image data. The wearable device comprises a first interface, a second interface and a processor. The processor is connected with an image sensor through the first interface and is used for receiving environment image data acquired by the image sensor; the processor is connected with an inertial sensor through the second interface and is further used for receiving the motion data of the wearable device sensed by the inertial sensor; the processor is connected with a control end and is further used for calculating attitude data of the wearable device according to the environment image data and the motion data and generating a map processing message according to the environment image data. The wearable device can send the related data asynchronously to the connected control terminals, which effectively avoids frame loss and allows scenes such as virtual reality or augmented reality scenes to be presented quickly and in a timely manner.

Description

Wearable device, environment image data processing system, method and readable medium
Technical Field
The invention relates to the technical field of electronics, in particular to wearable equipment, an environment image data processing system, an environment image data processing method and a readable medium.
Background
With the development of AR (Augmented Reality) and VR (Virtual Reality) technologies, wearable devices (such as AR/VR glasses and helmets) have gradually entered people's daily lives. AR technology calculates the position and angle of the camera image in real time and overlays corresponding images, videos and 3D models; its goal is to fit the virtual world on the screen over the real world and allow interaction between the two. VR technology is a computer simulation technique for creating and experiencing virtual worlds: a computer generates a simulated environment, and the user is immersed in it through an interactive three-dimensional dynamic view that fuses multi-source information and simulates physical behaviors. Such wearable devices can bring users an extraordinary visual experience.
When current wearable devices perform environmental image processing for AR or VR, image processing and scene presentation are constrained by the devices' volume and weight. A high-performance processor can be used for image data processing, but it makes the wearable device costly and expensive; if a low-cost processor is used instead, problems such as image frame loss arise and product performance is poor.
Disclosure of Invention
The embodiment of the invention provides a wearable device, a system and a method for processing environmental image data and a readable medium, which can quickly and efficiently process the image data on the wearable device.
In one aspect, an embodiment of the present invention provides a wearable device, including: a first interface, a second interface and a processor;
the processor is connected with the image sensor through the first interface and is used for receiving environmental image data acquired by the image sensor;
the processor is connected with the inertial sensor through the second interface, and is further used for receiving the motion data of the wearable device sensed by the inertial sensor;
the processor is further used for calculating attitude data of the wearable device according to the environment image data and the motion data, and sending the attitude data to the first control end;
the processor is further configured to generate a map processing message according to the environment image data, and send the map processing message to a second control end, where the map processing message includes the environment image data, and the map processing message is used to instruct the second control end to perform environment map processing according to the environment image data, so as to generate map data.
In another aspect, an embodiment of the present invention provides a system for processing environmental image data, including: the wearable device comprises a wearable device, a first control end and a second control end;
the wearable device is connected with the first control end and used for acquiring environment image data and motion data of the wearable device, calculating attitude data of the wearable device according to the environment image data and the motion data, and sending the attitude data to the first control end;
the wearable device is connected with the second control end, and is further used for generating a map processing message according to the environment image data, and sending the map processing message to the second control end, wherein the map processing message comprises the environment image data;
the first control end is used for determining virtual information according to the attitude data and sending the virtual information to the wearable equipment;
the second control end is used for carrying out environment map processing according to the environment image data to generate map data.
In another aspect, an embodiment of the present invention provides a method for processing environmental image data, including:
acquiring environment image data, and generating a map processing message according to the environment image data;
acquiring motion data, and calculating to obtain attitude data of the wearable device according to the environment image data and the motion data;
and sending the attitude data to a first control end, and sending the map processing message to a second control end, wherein the map processing message comprises environment image data, and the map processing message is used for indicating the second control end to perform environment map processing according to the environment image data to generate map data.
In yet another aspect, an embodiment of the present invention provides a computer storage medium storing one or more instructions adapted to be loaded by a processor and perform the following steps:
acquiring environment image data, and generating a map processing message according to the environment image data;
acquiring motion data, and calculating to obtain attitude data of the wearable device according to the environment image data and the motion data;
and sending the attitude data to a first control end, and sending the map processing message to a second control end, wherein the map processing message comprises environment image data, and the map processing message is used for indicating the second control end to perform environment map processing according to the environment image data to generate map data.
The wearable device in the embodiment of the invention comprises two different interfaces and a processor. Through these two interfaces, the processor can asynchronously receive the environment image data sent by the image sensor and the motion data of the wearable device sent by the inertial sensor. After processing the environment image data and the motion data to obtain the related data, the processor can also send the related data asynchronously to the connected control terminals, which effectively avoids frame loss and allows scenes such as virtual reality or augmented reality scenes to be presented quickly and in a timely manner.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings needed in the description of the embodiments are briefly introduced below. The drawings described below show only some embodiments of the present invention; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic structural diagram of a wearable device according to an embodiment of the present invention;
fig. 2 is an application scenario diagram of a wearable device according to an embodiment of the present invention;
fig. 3 is a system architecture diagram of a wearable device according to an embodiment of the present invention;
fig. 4 is a schematic structural diagram of a wearable device provided in an embodiment of the present invention;
fig. 5 is a schematic structural diagram of a wearable device provided in an embodiment of the present invention;
FIG. 6 is a schematic structural diagram of a system for processing environmental image data according to an embodiment of the present invention;
FIG. 7 is a schematic diagram of another system for processing environmental image data according to an embodiment of the present invention;
fig. 8 is a flowchart illustrating a method for processing environment image data according to an embodiment of the present invention.
Detailed Description
In an embodiment of the present invention, on one hand, the wearable device can be connected to an image sensor through an interface dedicated to receiving data from the image sensor; for example, this interface may be a USB 3.0 interface or an MIPI (Mobile Industry Processor Interface) interface that can receive image data quickly. A dedicated interface is also provided to receive the motion data sensed by the inertial sensor accurately and quickly; for example, this interface may be a Serial Peripheral Interface (SPI) or the like. On the other hand, the wearable device can calculate the posture of the user and the environment map separately, in a heterogeneous manner, so as to locate and confirm the posture of the wearable device more accurately. In one embodiment, the processor calculates posture data of the wearable device and determines the posture of the user so that suitable interactive content can be determined subsequently; for the environment map, the processor only needs to send relevant data such as images to a connected control terminal, and the control terminal completes the map data processing based on algorithms such as SLAM (Simultaneous Localization And Mapping) to generate a map of the current environment. Referring to fig. 1, the wearable device provided by the embodiment of the present invention includes: a first interface 101, a second interface 102 and a processor 103.
In one embodiment, the processor 103 is connected to an image sensor through the first interface 101, and the processor 103 is configured to receive environmental image data collected by the image sensor. The image sensor is used for acquiring an environment image. In one embodiment, the image sensor may be camera equipment that is externally connected to the wearable device, such as a high-definition camera that the user purchases at their own discretion. Alternatively, the image sensor may be a part of a wearable device, and may be fixed in the wearable device at the time of factory shipment. The processor 103 is connected with an inertial sensor through the second interface 102, and the processor 103 is configured to receive motion data of the wearable device sensed by the inertial sensor. The inertial sensor may be used to sense motion data of the wearable device. In one embodiment, the inertial sensor may be a detection device that is external to the wearable device. Alternatively, the inertial sensor may be a part of a wearable device, and may be fixed in the wearable device at the time of factory shipment.
The wearable device in the embodiment of the invention comprises a first interface, a second interface and a processor. The environment image data collected by the image sensor has a large data volume but a low frame rate (20-60 fps), while the motion data collected by the inertial sensor has a small data volume but a high sampling rate (100-1000 Hz). The embodiment of the invention therefore uses two interfaces with different transmission rates to receive the environment image data and the motion data asynchronously, which avoids frame loss caused by transmission delay of the environment image data relative to the motion data and by channel contention, allows the related data to be acquired and processed accurately, and enables scenes such as virtual reality or augmented reality scenes to be presented to the user in a timely and accurate manner.
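The asynchronous reception described above can be illustrated with a small sketch (an assumption about one possible realisation, not the patented firmware; the reader threads, queue names and rates are hypothetical): two independent queues stand in for the first and second interfaces, so the low-rate image stream and the high-rate motion stream never contend for a single channel.

```python
import queue
import threading
import time

# Hypothetical sketch only: two independent queues emulate the first (image)
# and second (motion) interfaces, so the low-rate image stream and the
# high-rate IMU stream are received asynchronously without sharing a channel.
image_queue = queue.Queue()   # ~20-60 fps environment image frames
imu_queue = queue.Queue()     # ~100-1000 Hz motion samples

def camera_reader(stop: threading.Event) -> None:
    """Stands in for the first interface (e.g. MIPI / USB 3.0) delivering frames."""
    frame_id = 0
    while not stop.is_set():
        image_queue.put(("frame", frame_id, time.time()))
        frame_id += 1
        time.sleep(1 / 30)        # assumed ~30 fps

def imu_reader(stop: threading.Event) -> None:
    """Stands in for the second interface (e.g. SPI / I2C) delivering IMU samples."""
    sample_id = 0
    while not stop.is_set():
        imu_queue.put(("imu", sample_id, time.time()))
        sample_id += 1
        time.sleep(1 / 500)       # assumed ~500 Hz

stop = threading.Event()
threads = [threading.Thread(target=camera_reader, args=(stop,)),
           threading.Thread(target=imu_reader, args=(stop,))]
for t in threads:
    t.start()
time.sleep(0.2)                   # let both streams run briefly
stop.set()
for t in threads:
    t.join()
print(f"received {image_queue.qsize()} frames and {imu_queue.qsize()} IMU samples")
```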
The processor 103 may calculate posture data of the wearable device according to the environment image data and the motion data, and then send the posture data to the first control end; that is, the processor 103 may locally complete the fusion of the environment image data and the motion data and comprehensively determine the current posture data of the wearable device. The processor 103 may further generate a map processing message according to the environment image data and send the map processing message to the second control end. The map processing message may contain only the environment image data and is mainly used to instruct the second control end to perform environment map processing according to the environment image data to generate map data; that is, the processor 103 does not itself perform environment map processing based on a SLAM algorithm or the like, but only triggers the second control end to do so. In one embodiment, the first control end and the second control end may be implemented on the same device, which, acting as a host device, implements the functions of the first control end and the second control end respectively.
The processor 103 can promptly calculate posture data of the wearable device from the received data, generate a map processing message containing the environment image data, and send it to the second control terminal; the second control terminal, dedicated to map processing, then performs map processing on the asynchronously received environment image data. Map processing and attitude calculation are therefore also completed asynchronously, which ensures timely and accurate processing of the data acquired by the image sensor and the inertial sensor, allows the data processing to be completed efficiently, quickly and at low cost, and enables virtual reality or augmented reality scenes to be better presented to the user.
Referring to fig. 2, fig. 2 is an application scenario diagram of a wearable device according to an embodiment of the present invention. As shown in fig. 2, the user may wear the wearable device on the head. When the user uses the wearable device, the processor in the wearable device can receive the environment image data of the environment where the user is located, which is acquired by the image sensor, through the first interface, and can also receive the motion data sensed by the inertial sensor through the second interface, wherein the motion data can be the six-degree-of-freedom motion data of the user sensed by the inertial sensor. The six degrees of freedom include a degree of freedom of movement in the directions of three orthogonal axes x, y, and z, and a degree of freedom of rotation about the three orthogonal axes x, y, and z, respectively, as shown in fig. 2. After the processor calculates posture data of the wearable device, namely the user, according to the environment image data and the motion data, the posture data can be sent to the first control end. The first control end can use AR technology/VR technology to obtain virtual information according to the attitude data and send the virtual information to a processor in the wearable device. After receiving the virtual information, the processor in the wearable device may display the virtual information in front of the user, presenting a corresponding AR scene or VR scene.
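For illustration only, the six-degree-of-freedom posture referred to above can be represented as three translations and three rotations; the field names and units below are assumptions, not a format defined by this document.

```python
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    """Hypothetical six-degree-of-freedom posture: translation along and rotation about x, y, z."""
    x: float = 0.0      # translation along x (units assumed)
    y: float = 0.0      # translation along y
    z: float = 0.0      # translation along z
    roll: float = 0.0   # rotation about x (radians, assumed)
    pitch: float = 0.0  # rotation about y
    yaw: float = 0.0    # rotation about z

print(Pose6DoF(x=0.1, yaw=0.05))
```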
The wearable device in the embodiment of the invention is not only a portable device that can be worn directly by a user or integrated into the user's clothes or accessories, but also a device that can realize a corresponding scene using AR/VR technology through interaction with a host device (such as a smart terminal). For example, when a user plays a game, the game may be opened on the smart terminal and the wearable device worn. The smart terminal may determine a corresponding game interface image in real time according to the user's posture data (such as a left turn, a right turn or walking straight ahead) and transmit the virtual game interface image to the wearable device. After the wearable device receives the virtual game interface image, the virtual game interface can be displayed to the user through the wearable device, giving the user an immersive experience.
Referring to fig. 3, which is a system architecture diagram based on a wearable device according to an embodiment of the present invention, the wearable device is connected to at least one control end; the connection is shown in fig. 3 and may be wired or wireless. After acquiring the environment image data and the motion data of the wearable device, the wearable device can fuse the environment image data and the motion data, calculate the posture data of the user, and then send the posture data to the first control terminal. This avoids delay and frame loss in data transmission. Because the posture data is sent directly to the first control end, the first control end does not need to calculate it and can obtain the virtual information in time according to the posture data. In addition, the wearable device can generate a map processing message according to the environment image data and send the map processing message to the second control terminal.
In one embodiment, the wearable device is connected to the first control terminal through the third interface and connected to the second control terminal through the fourth interface, the wearable device may send the related data to each control terminal through different interfaces, and both the third interface and the fourth interface may be configured as a wired transmission interface or a wireless transmission interface as needed. In other embodiments, the wearable device and the respective control terminals may also be connected to a server. The wearable device can upload the gesture data and the map processing message to the server, and then the server correspondingly sends the gesture data and the map processing message to each control terminal.
In the embodiment of the present invention, the structure of the wearable device can be seen in fig. 4. As shown in fig. 4, the wearable device in the embodiment of the present invention includes: a first interface 201, a second interface 202 and a processor 203. It should be noted that, in the embodiment of the present invention, the first interface 201, the second interface 202, and the processor 203 may refer to the first interface 101, the second interface 102, and the processor 103 in the above embodiment of the present invention, and no further description is given in the embodiment of the present invention. In one embodiment, the wearable device in the present embodiment further includes an image sensor 204 and an inertial sensor 205. In one embodiment, the connection relationship between the image sensor 204, the inertial sensor 205, and the processor 203 may be described with reference to the following.
The image sensor 204 is connected to the processor 203 through the first interface 201. The image sensor 204 is configured to acquire environment image data and send it to the processor 203 through the first interface 201. In an embodiment, the first interface 201 may be an MIPI interface.
The inertial sensor 205 is connected to the processor 203 through the second interface 202. The inertial sensor 205 is used for sensing the motion data of the wearable device and sending the motion data to the processor 203 through the second interface 202. The inertial sensor 205 is a sensor that can detect and output, in real time, motion information such as the acceleration and angular velocity of the wearable device. In one embodiment, the inertial sensor 205 may be an IMU (Inertial Measurement Unit) sensor. An IMU sensor integrates multiple sensors such as an acceleration sensor and an angular velocity sensor (gyroscope); using an IMU sensor to sense the motion data of the wearable device reduces the lag in motion sensing and therefore improves the posture tracking function of the wearable device. In one embodiment, the second interface may be an SPI interface or an I2C (Inter-Integrated Circuit) interface.
The processor 203 is configured to calculate posture data of the wearable device according to the environment image data and the motion data, and send the posture data to the first control end. In one embodiment, when calculating the posture data of the wearable device according to the environment image data and the motion data, the processor 203 may employ an EKF (Extended Kalman Filter) algorithm to perform the data fusion calculation, thereby obtaining high-precision posture data for the wearable device. In an embodiment, the processor 203 may be a DSP (Digital Signal Processor), a general embedded ARM processor, an FPGA (Field Programmable Gate Array) processor, or another processor that can perform the same function, which is not limited in the embodiments of the present invention. In one embodiment, the data fusion calculation also fuses positioning information, which refers to the wearable device's position in the environment map determined based on the SLAM algorithm.
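As a hedged illustration of the fusion step, the sketch below uses a simplified one-dimensional constant-velocity Kalman filter in place of the full EKF named above: high-rate IMU acceleration samples drive the prediction step, and a lower-rate visual position estimate corrects it. All noise values and rates are assumed.

```python
import numpy as np

# Simplified 1-D constant-velocity Kalman filter standing in for the EKF
# fusion described above (illustrative only): IMU acceleration drives the
# high-rate prediction, the lower-rate visual pose corrects it.
dt = 1 / 500                                   # IMU sample period (assumed 500 Hz)
F = np.array([[1, dt], [0, 1]])                # state transition for [position, velocity]
B = np.array([[0.5 * dt**2], [dt]])            # how acceleration enters the state
H = np.array([[1.0, 0.0]])                     # visual measurement observes position
Q = np.eye(2) * 1e-4                           # process noise covariance (assumed)
R = np.array([[1e-2]])                         # visual measurement noise (assumed)

x = np.zeros((2, 1))                           # state: [position, velocity]
P = np.eye(2)                                  # state covariance

def predict(accel: float) -> None:
    """High-rate prediction step from one IMU acceleration sample."""
    global x, P
    x = F @ x + B * accel
    P = F @ P @ F.T + Q

def update(visual_position: float) -> None:
    """Low-rate correction step from one visual position estimate."""
    global x, P
    y = np.array([[visual_position]]) - H @ x  # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)             # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P

for _ in range(16):                            # a burst of IMU samples...
    predict(accel=0.2)
update(visual_position=0.01)                   # ...then one visual correction
print("fused position/velocity:", x.ravel())
```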
Therefore, after acquiring the environment image data and the motion data, the wearable device in the embodiment of the invention can compute the posture data from them with its own processor; the environment image data and the motion data do not need to be sent to the intelligent terminal together, which avoids data transmission delay. In addition, the wearable device only needs to send the calculated posture data to the first control end, which avoids channel contention during data transmission and therefore avoids frame loss.
The processor 203 is further configured to generate a map processing message according to the environment image data and send the map processing message to the second control end, where the map processing message includes the environment image data and is used to instruct the second control end to perform environment map processing according to the environment image data to generate map data. In an embodiment, when generating the map processing message according to the environment image data, the processor 203 is configured to invoke a feature point matching algorithm to perform feature point matching calculation on the environment image corresponding to the environment image data to obtain a matching result, and to generate the map processing message according to the matching result. In one embodiment, before invoking the feature point matching algorithm, the processor 203 may invoke a feature extraction algorithm (e.g., the BRIEF (Binary Robust Independent Elementary Features) algorithm, the ORB (Oriented FAST and Rotated BRIEF) algorithm, etc.) to extract feature points from the environment image data. In one embodiment, the map processing message may include a loop detection message and/or a map update message.
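A minimal sketch of this feature-extraction and matching step is shown below, using ORB and a brute-force Hamming matcher from OpenCV; the document names the algorithm families only, so the library, the message layout and the synthetic test frames are assumptions.

```python
import cv2
import numpy as np

# Minimal sketch of the feature-extraction / feature-matching step using ORB
# from OpenCV (an assumption: the document names the algorithm family, not a
# specific library), packaged as a hypothetical "map processing message".
def build_map_processing_message(prev_img, curr_img):
    """Match ORB features between two environment images and package the result."""
    orb = cv2.ORB_create(nfeatures=500)
    kp1, des1 = orb.detectAndCompute(prev_img, None)
    kp2, des2 = orb.detectAndCompute(curr_img, None)
    if des1 is None or des2 is None:
        return {"type": "map_update", "matches": []}

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

    # Keep feature points and their image positions, since the map update
    # message in this document is said to carry them.
    payload = [{"prev_xy": kp1[m.queryIdx].pt, "curr_xy": kp2[m.trainIdx].pt}
               for m in matches[:100]]
    return {"type": "map_update", "matches": payload}

# Two synthetic grayscale frames, the second shifted slightly to the right.
prev_frame = np.random.randint(0, 255, (240, 320), dtype=np.uint8)
curr_frame = np.roll(prev_frame, 3, axis=1)
message = build_map_processing_message(prev_frame, curr_frame)
print(f"map processing message with {len(message['matches'])} matches")
```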
In one embodiment, the processor 203 is further configured to send a relocation message to the second control end if the environment image data sent by the image sensor 204 is not successfully received within a preset time period. The relocation message is used to instruct the second control end to perform relocation processing on the wearable device according to the key frame information included in the generated map data. If the environment image data sent by the image sensor 204 is not successfully received within the preset time period, the processor 203 may determine that the wearable device is in a tracking failure state and therefore send a relocation message to the second control end. After receiving, from the second control end, the relocation data obtained by relocation processing, the processor 203 may recover the tracking posture of the wearable device according to the relocation data. In an embodiment, if the environment image data received by the processor 203 contains no characteristic region, the processor 203 may also consider the wearable device to be in a tracking failure state and send a relocation message to the second control end. In general, in map processing based on the SLAM algorithm, the relative position of the wearable device is determined by comparing feature points between two frames of environment image data collected by the image sensor, and the wearable device is then positioned within the environment map. If environment image data is not collected by the image sensor (for example, the lens is blocked) or the second control end does not successfully receive the environment image data, relocation processing may be performed. Relocation processing refers to the following: when environment image data is received again after a period in which it was not successfully received, the environment image corresponding to the newly received data is compared with the key images recorded as key frames during environment map processing; the key image with the maximum similarity to the newly received image is found, and the key frame containing that key image is taken as the most similar key frame; the current position and posture corresponding to the newly received environment image are then calculated from the feature matching result between the newly received image and the most similar key frame in the environment map, after which localization and mapping can continue based on the newly received environment image data. A key frame is recorded by the second control end during map processing and comprises an image and the position of that image in the processed map; generally, when the image changes greatly between two consecutive frames of environment image data, the later frame and the position determined from it are recorded as a key frame.
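The relocation idea can be sketched as follows; the similarity measure (normalised correlation of downsampled images) is a deliberately simple stand-in for the feature-based keyframe matching implied above, and all data structures are hypothetical.

```python
import numpy as np

# Hedged sketch of relocation: when tracking is lost, compare the newly
# received environment image against stored keyframe images and pick the
# most similar keyframe as the starting point for re-localisation.
def downsample(img, factor=8):
    return img[::factor, ::factor].astype(np.float64)

def similarity(a, b):
    """Normalised correlation between two equally sized images."""
    a = (a - a.mean()) / (a.std() + 1e-9)
    b = (b - b.mean()) / (b.std() + 1e-9)
    return float((a * b).mean())

def relocalise(current_img, keyframes):
    """keyframes: list of (image, pose) pairs recorded during map processing."""
    scores = [similarity(downsample(current_img), downsample(kf_img))
              for kf_img, _ in keyframes]
    best = int(np.argmax(scores))
    # In a full system the pose would be refined by feature matching against
    # this keyframe; here we simply return the stored keyframe pose.
    return keyframes[best][1], scores[best]

rng = np.random.default_rng(0)
keyframes = [(rng.integers(0, 255, (240, 320)), {"keyframe_id": i}) for i in range(5)]
lost_frame = keyframes[3][0] + rng.normal(0, 5, (240, 320))   # noisy re-observation
pose, score = relocalise(lost_frame, keyframes)
print("relocated against keyframe", pose, "score", round(score, 3))
```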
In one embodiment, the processor 203 is further configured to receive relocation data sent by the second control end, and calculate posture data of the wearable device according to the relocation data and the motion data sensed by the inertial sensor, where the relocation data is obtained by the second control end after relocation processing is performed on the wearable device according to keyframe information included in the generated map data. That is, the pose data may include not only the data related to 6 degrees of freedom, but also the position of the user wearing the wearable device in the environment map generated by the SLAM algorithm.
In one embodiment, the processor 203 is further configured to receive virtual information sent by the first control end, where the virtual information is determined by the first control end according to the posture data. After receiving the virtual information sent by the first control terminal, the processor 203 may display the virtual information in a display screen of the wearable device for the user to view the virtual information. The virtual information may be, for example, some virtual cartoon characters, or some textual descriptions of objects, such as the price of a certain item and product introduction.
In an embodiment, the processor 203 may be divided into a visual ranging module 2031 and an IMU data fusion module 2032; a specific structural diagram is shown in fig. 5. As shown in fig. 5, the visual ranging module 2031 is connected to the image sensor 204 via the first interface 201, and the IMU data fusion module 2032 is connected to the inertial sensor 205 via the second interface 202. The functions of the processor 203 described above may be implemented by the visual ranging module 2031 and the IMU data fusion module 2032 respectively. In one embodiment, when the processor 203 is configured to generate a map processing message according to the environment image data and send it to the second control terminal, the visual ranging module 2031 may generate the map processing message and send it to the second control terminal. In one embodiment, the visual ranging module 2031 may further be configured to, after receiving the environment image data from the image sensor 204, match the preceding and following frame images using an image matching algorithm, determine visual pose data of the wearable device from the matching result, and send the visual pose data to the IMU data fusion module 2032. Correspondingly, when the processor 203 is configured to calculate the posture data of the wearable device according to the environment image data and the motion data and send it to the first control end, the IMU data fusion module 2032 may complete the corresponding operations: after receiving the motion data from the inertial sensor 205 and the visual pose data from the visual ranging module 2031, it may fuse them using an EKF algorithm to obtain high-precision, high-frame-rate posture data and send that posture data to the first control end.
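As a sketch of what the visual ranging module might compute from two matched frames, the example below estimates relative camera pose from the essential matrix; the camera intrinsics, the synthetic scene and the use of OpenCV's essential-matrix routines are assumptions, since the document does not prescribe a specific algorithm.

```python
import cv2
import numpy as np

# Illustrative visual-odometry sketch: given matched 2-D points from two
# consecutive environment images, estimate relative camera pose via the
# essential matrix. Intrinsics and scene are assumed values.
K = np.array([[500.0, 0, 160], [0, 500.0, 120], [0, 0, 1]])   # assumed intrinsics

def project(points_3d, R, t):
    """Project world points into an image for a camera with pose (R, t)."""
    cam = (R @ points_3d.T + t.reshape(3, 1)).T
    pix = (K @ cam.T).T
    return pix[:, :2] / pix[:, 2:3]

rng = np.random.default_rng(1)
points_3d = rng.uniform([-1, -1, 4], [1, 1, 8], size=(60, 3))  # synthetic scene points

# Ground-truth motion between frame 1 and frame 2: small translation along x.
t_true = np.array([0.2, 0.0, 0.0])
pts1 = project(points_3d, np.eye(3), np.zeros(3)).astype(np.float64)
pts2 = project(points_3d, np.eye(3), -t_true).astype(np.float64)

E, inliers = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                                  prob=0.999, threshold=1.0)
_, R_est, t_est, _ = cv2.recoverPose(E, pts1, pts2, K)
print("estimated translation direction (up to scale):", t_est.ravel())
```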
The embodiment of the invention uses two interfaces with different transmission rates to receive the environment image data and the motion data asynchronously, which avoids frame loss caused by transmission delay of the environment image data relative to the motion data and by channel contention, and enables virtual reality or augmented reality scenes to be presented to the user in a timely and better manner.
Based on the wearable device described above, the embodiment of the present invention further provides a processing system for environment image data, which is a visual-inertial SLAM system based on heterogeneous computing. The processing system can meet the requirements of high real-time performance, low latency and high-precision posture output needed for AR/VR posture tracking. The processing system in the embodiment of the invention mainly comprises a front end and a back end. The front end may consist of an image sensor, an inertial sensor, a processor responsible for environment image processing and sensor fusion, and peripheral circuits; the back end may consist of a general-purpose CPU system such as a PC or an ARM (Advanced RISC Machines) platform; and the front end and the back end may communicate through USB (Universal Serial Bus) 3.0.
In one embodiment, a schematic diagram of the processing system can be seen in fig. 6. As shown in fig. 6, the processing system in the embodiment of the present invention may include: wearable device 301, first control end 302 and second control end 303. Correspondingly, the wearable device 301 is a front-end part of a processing system, and the first control end 302 and the second control end 303 together form a back-end part of the processing system.
The wearable device 301 is connected to the first control end 302 through a third interface, and the wearable device 301 is configured to obtain environment image data and motion data of the wearable device, calculate gesture data of the wearable device 301 according to the environment image data and the motion data, and send the gesture data to the first control end 302.
The wearable device 301 is connected to the second control terminal 303 through a fourth interface, and the wearable device 301 is further configured to generate a map processing message according to the environment image data and send the map processing message to the second control terminal 303, where the map processing message includes the environment image data.
The first control end 302 is configured to determine virtual information according to the posture data and send the virtual information to the wearable device 301. In one embodiment, the first control end 302 may include a 6DoF posture output module 3021; the specific system structure may be as shown in fig. 7. The 6DoF posture output module 3021 may determine virtual information from an image database stored in the first control terminal 302 according to the posture data transmitted by the wearable device 301, and transmit the determined virtual information to the wearable device 301, so that the wearable device 301 displays the virtual information in front of the user's eyes.
The second control terminal 303 is configured to perform environment map processing according to the environment image data to generate map data. In one embodiment, the second control terminal 303 may include a loop detection module 3031, a map maintenance module 3032 and a relocation module 3033. As shown in fig. 7, the loop detection module 3031, the map maintenance module 3032 and the relocation module 3033 may all be connected with the wearable device 301 through the fourth interface. The loop detection module 3031 is responsible for loop detection, closing loops to ensure the consistency of the map. The map maintenance module 3032 is responsible for updating the three-dimensional positions of map points using point/line-segment matching between multi-frame images, and for maintaining the poses of multiple key frames using a sparse bundle adjustment (SBA) algorithm. The relocation module 3033 is responsible for recovery after the posture tracking of the wearable device 301 fails. The second control terminal 303 may therefore invoke different modules to perform environment map processing according to different map processing messages, so as to generate map data.
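A schematic of how the second control end might dispatch incoming map processing messages to these three modules is sketched below; the message format and the handler bodies are assumptions made for illustration, not the patented implementation.

```python
# Hypothetical back-end dispatch sketch: module names mirror the document,
# but the message format and handler logic are assumed for illustration.
class SecondControlEnd:
    def handle(self, message: dict):
        handlers = {
            "loop_detection": self.loop_detection,
            "map_update": self.map_update,
            "relocation": self.relocation,
        }
        handler = handlers.get(message.get("type"))
        if handler is None:
            raise ValueError(f"unknown map processing message: {message}")
        return handler(message)

    def loop_detection(self, message):
        # Compare the incoming environment image against stored keyframes;
        # on a detected loop, trigger a global bundle adjustment (SBA).
        return "loop checked"

    def map_update(self, message):
        # Update 3-D map point positions from the matched feature points and
        # their image positions carried in the message.
        return f"map updated with {len(message.get('matches', []))} matches"

    def relocation(self, message):
        # Recover the wearable device's pose from keyframe information.
        return "relocated"

backend = SecondControlEnd()
print(backend.handle({"type": "map_update", "matches": [1, 2, 3]}))
```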
In one embodiment, the map processing message may be a loop detection message. Loop detection means that the second control terminal 303 detects whether a closed loop appears in the motion trajectory of the wearable device 301. Correspondingly, when performing the environment map processing according to the environment image data, the second control terminal 303 may be configured to invoke the loop detection module 3031 to match the environment image corresponding to the environment image data against the key frame information included in the generated map data, and to perform environment map loop processing when the matching result indicates that the current environment map processing state is a loop processing state. In one embodiment, the second control terminal 303 may also invoke the loop detection module 3031 to perform loop detection using a loop detection algorithm (e.g., the DBoW2 algorithm). If the second control terminal 303 detects that a loop occurs, it may call the global SBA of the map maintenance module 3032 to perform bundle adjustment.
In one embodiment, the map processing message may be a map update message, and the environment image data includes feature points in an environment image corresponding to the environment image data and image location information of the feature points. Correspondingly, when the second control terminal 303 is configured to perform the environment map processing according to the environment image data, the second control terminal 303 may be configured to call the map maintenance module 3032 to perform the update processing of the environment map according to the feature points in the environment image corresponding to the environment image data and the image position information of the feature points.
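For illustration, the map-update step can be sketched as triangulating three-dimensional map points from feature-point image positions observed in two frames with known projection matrices; the intrinsics, camera motion and pixel coordinates below are assumed values chosen so that the ground truth is known.

```python
import cv2
import numpy as np

# Illustrative sketch of the map-update step: triangulate 3-D map points from
# feature-point image positions seen in two frames with known projection
# matrices. All numeric values are assumptions, not values from the document.
K = np.array([[500.0, 0, 160], [0, 500.0, 120], [0, 0, 1]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])                   # first frame at the origin
P2 = K @ np.hstack([np.eye(3), np.array([[-0.2], [0.0], [0.0]])])   # second frame moved 0.2 along x

# Feature-point image positions (2 x N: row 0 = x pixels, row 1 = y pixels),
# consistent with 3-D points (0, 0.8, 5) and (-0.4, 0.1, 5).
pts1 = np.array([[160.0, 120.0], [200.0, 130.0]])
pts2 = np.array([[140.0, 100.0], [200.0, 130.0]])

points_h = cv2.triangulatePoints(P1, P2, pts1, pts2)   # homogeneous, 4 x N
map_points = (points_h[:3] / points_h[3]).T            # N x 3 Euclidean positions
print("updated map point positions:\n", map_points)    # ~ (0, 0.8, 5) and (-0.4, 0.1, 5)
```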
In one embodiment, the wearable device 301 is further configured to send a relocation message to the second control terminal 303 if the environmental image data is not successfully acquired within a preset time period. Correspondingly, the second control terminal 303 is further configured to, after receiving the relocation message, invoke the relocation module 3033 to perform relocation processing on the wearable device according to the key frame information included in the generated map data to obtain relocation data, and send the relocation data to the wearable device 301. In one embodiment, the relocation module 3033 may invoke a relocation algorithm (e.g., GCBB algorithm) to relocate the wearable device.
In one embodiment, the second control terminal 303 is further configured to perform relocation processing on the wearable device 301 according to the key frame information included in the generated map data to obtain relocation data, and send the relocation data to the wearable device 301. In one embodiment, if the second control end 303 does not successfully receive the environment image data sent by the wearable device 301 within a preset time period, the second control end 303 may determine that the wearable device 301 is in a tracking failure state, so the second control end 303 may perform relocation processing on the wearable device 301 according to the keyframe information included in the generated map data to obtain relocation data, and send the relocation data to the wearable device 301, so that the wearable device 301 may calculate posture data of the wearable device 301 according to the relocation data and the motion data after receiving the relocation data sent by the second control end 303.
In one embodiment, the third interface and the fourth interface may both be USB interfaces, and in an actual hardware device, the third interface and the fourth interface may be the same USB interface or different USB interfaces.
The processing system in the embodiment of the invention can complete image processing of the environment image data, posture data calculation and sensor fusion with the processor in the wearable device, ensuring that the processing system outputs high-frame-rate, low-latency posture data in real time. Meanwhile, the wearable device can communicate with the first control end and the second control end through the third interface and the fourth interface respectively, and the second control end asynchronously handles map consistency operations such as map updating, relocation and loop detection. The processing system can therefore avoid problems such as data transmission delay and frame loss and meet the requirements of high real-time performance, low latency and high-precision posture output needed for AR/VR posture tracking. At the same time, the consistency of the SLAM map and the robustness of the processing system can be ensured, the sense of immersion provided by the wearable device can be effectively improved, and the user's motion sickness can be reduced.
According to another embodiment of the present invention, the units in the wearable device or the processing system shown in fig. 1 or figs. 4 to 7 may be individually or entirely combined into one or several other units, or one (or more) of the units may be further split into multiple functionally smaller units, to form the wearable device or the processing system; this achieves the same operation without affecting the technical effect of the embodiment of the present invention. In practical applications, the functions of one unit may be implemented by multiple units, or the functions of multiple units may be implemented by one unit. In other embodiments of the present invention, the wearable device or the processing system may also include other units; in practical applications, these functions may be implemented with the assistance of other units and through the cooperation of multiple units.
Based on the description of the wearable device embodiment, please refer to fig. 8, which is a flowchart illustrating a method for processing environmental image data according to an embodiment of the present invention. The processing method of the environment image data provided by the embodiment of the present invention may be executed by the processor in the wearable device, and the processing method of the environment image data provided by the embodiment of the present invention may include the following steps.
S101, acquiring environment image data and generating a map processing message according to the environment image data. In one embodiment, a processor in the wearable device may invoke an image sensor to acquire environment image data. After acquiring the environment image data, the processor may generate a map processing message from it. In one embodiment, the processor may generate the map processing message by invoking a feature point matching algorithm to perform feature point matching calculation on the environment image corresponding to the environment image data to obtain a matching result, and then generating the map processing message according to the matching result.
And S102, acquiring motion data, and calculating to obtain the posture data of the wearable device according to the environment image data and the motion data. In one embodiment, a processor in the wearable device may invoke an inertial sensor to sense motion data. After the processor acquires the motion data, the attitude data of the wearable device may be calculated according to the environment image data and the motion data, and the specific calculation manner of the attitude data may refer to the method mentioned in the above embodiment of the wearable device, which is not described herein again.
S103, the attitude data is sent to the first control end, and the map processing message is sent to the second control end. The map processing message includes environment image data, and the map processing message is used for instructing the second control terminal to perform environment map processing according to the environment image data to generate map data.
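A condensed, hypothetical sketch of steps S101-S103 follows; every helper here is a stand-in for the sensor interfaces and control ends described earlier, not an implementation defined by this document.

```python
# Hypothetical end-to-end sketch of S101-S103; all helpers are stand-ins.
def fuse_pose(image, motion):
    """Trivial stand-in for the EKF-style visual/inertial fusion."""
    return {"position": motion["position"], "frame_id": image["frame_id"]}

def process_environment_data(read_image, read_imu, send_to_first, send_to_second):
    image = read_image()                      # S101: acquire environment image data
    map_message = {"type": "map_update", "image": image}
    motion = read_imu()                       # S102: acquire motion data
    pose = fuse_pose(image, motion)           #       compute wearable device posture
    send_to_first(pose)                       # S103: posture data -> first control end
    send_to_second(map_message)               #       map processing message -> second control end
    return pose

pose = process_environment_data(
    read_image=lambda: {"frame_id": 0},
    read_imu=lambda: {"position": [0.0, 0.0, 0.0]},
    send_to_first=print,
    send_to_second=print,
)
```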
In one embodiment, the method may further comprise: and if the environmental image data is not successfully acquired within the preset time period, sending a relocation message to the second control end, wherein the relocation message is used for indicating the second control end to perform relocation processing on the wearable device according to the key frame information included in the generated map data.
In one embodiment, the method may further comprise: and receiving relocation data sent by the second control terminal, and calculating attitude data of the wearable device according to the relocation data and the motion data sensed by the inertial sensor, wherein the relocation data is obtained by the second control terminal after relocation processing is performed on the wearable device according to key frame information included in the generated map data.
In one embodiment, the method may further comprise: and receiving virtual information sent by the first control end, wherein the virtual information is determined by the first control end according to the attitude data.
According to an embodiment of the present invention, steps S101-S103 involved in the processing method of the environment image data shown in fig. 8 may be executed by respective units in the wearable device shown in fig. 4. For example, steps S101-S103 shown in fig. 8 may be performed by the image sensor 204, the inertial sensor 205, and the processor 203 shown in fig. 4, respectively.
The wearable device of the embodiment of the invention acquires environment image data and generates a map processing message according to the environment image data; it also acquires motion data and calculates the posture data of the wearable device according to the environment image data and the motion data. The posture data is sent to the first control end and the map processing message is sent to the second control end. The embodiment of the invention thus avoids data frame loss caused by channel contention between the environment image data and the motion data during transmission to the intelligent terminal. In addition, the wearable device in the embodiment of the invention transmits the image data and the motion data through two interfaces with different transmission rates respectively, which avoids transmission delay and improves the user's sense of immersion.
In an embodiment, the embodiment of the present invention further provides a computer storage medium, where the computer storage medium stores program instructions, and when the program instructions are loaded and executed by a processor, the method for processing environment image data described in fig. 8 is implemented.
It will be understood by those skilled in the art that all or part of the processes of the methods of the embodiments described above can be implemented by a computer program, which can be stored in a computer-readable storage medium, and when executed, can include the processes of the embodiments of the methods described above. The storage medium may be a magnetic disk, an optical disk, a Read-Only Memory (ROM), a Random Access Memory (RAM), or the like.
The above disclosure is intended to be illustrative of only some embodiments of the invention, and is not intended to limit the scope of the invention.

Claims (13)

1. A wearable device, comprising: a first interface, a second interface, a third interface, a fourth interface and a processor, wherein the processor comprises a visual ranging module and an IMU data fusion module;
the visual ranging module is connected with the image sensor through the first interface and is used for receiving environmental image data acquired by the image sensor;
the IMU data fusion module is connected with an inertial sensor through the second interface and is used for receiving motion data of the wearable device sensed by the inertial sensor;
the IMU data fusion module is further used for calculating attitude data of the wearable device according to the environment image data and the motion data, and sending the attitude data to a first control end through the third interface;
the visual ranging module is further configured to generate a map processing message according to the environment image data, and send the map processing message to a second control end through the fourth interface, where the map processing message includes the environment image data, and the map processing message is used to instruct the second control end to perform environment map processing according to the environment image data to generate map data;
the processor is further configured to receive virtual information sent by the first control end, and display the virtual information in a display screen of the wearable device for a user to view the virtual information, where the virtual information is determined by the first control end according to the posture data.
2. The wearable device of claim 1, further comprising: an image sensor and/or an inertial sensor;
the image sensor is used for acquiring environment image data and sending the environment image data to the processor through the first interface;
the inertial sensor is used for sensing the motion data of the wearable device and sending the motion data to the processor through the second interface.
3. The wearable device of claim 1 or 2,
wherein the processor, when used for generating the map processing message according to the environment image data, is used for calling a feature point matching algorithm to perform feature point matching calculation on the environment image corresponding to the environment image data to obtain a matching result, and for generating the map processing message according to the matching result.
4. The wearable device of claim 1 or 2,
the processor is further configured to send a relocation message to the second control end if the environment image data sent by the image sensor is not successfully received within a preset time period, where the relocation message is used to instruct the second control end to perform relocation processing on the wearable device according to key frame information included in the generated map data.
5. The wearable device of claim 1 or 2,
the processor is further configured to receive relocation data sent by the second control end, and calculate posture data of the wearable device according to the relocation data and the motion data sensed by the inertial sensor, where the relocation data is obtained by the second control end after relocation processing is performed on the wearable device according to keyframe information included in the generated map data.
6. The wearable device of claim 1 or 2, wherein the first interface is a mobile industry processor interface;
the second interface is a serial peripheral interface or I2And C, interface.
7. A system for processing environment image data, comprising: a wearable device, a first control end and a second control end; the wearable device comprises a first interface, a second interface, a third interface, a fourth interface and a processor, wherein the processor comprises a visual ranging module and an IMU data fusion module;
the wearable device is connected with the first control end, and the visual ranging module of the wearable device is used for acquiring environmental image data through the first interface;
the IMU data fusion module of the wearable device is used for acquiring motion data of the wearable device through the second interface, calculating attitude data of the wearable device according to the environment image data and the motion data, and sending the attitude data to the first control end through the third interface;
the wearable device is connected with the second control end, the visual ranging module of the wearable device is further configured to generate a map processing message according to the environment image data, and send the map processing message to the second control end through the fourth interface, where the map processing message includes environment image data;
the first control end is used for determining virtual information according to the attitude data and sending the virtual information to the wearable equipment;
and the second control terminal is used for carrying out environment map processing according to the environment image data to generate map data.
8. The processing system of claim 7,
the wearable device is further used for sending a relocation message to the second control end if the environment image data is not successfully acquired within a preset time period;
and the second control end is further configured to, after receiving the relocation message, perform relocation processing on the wearable device according to the key frame information included in the generated map data to obtain relocation data, and send the relocation data to the wearable device.
9. The processing system of claim 7,
the second control end is further configured to perform relocation processing on the wearable device according to key frame information included in the generated map data to obtain relocation data, and send the relocation data to the wearable device;
the wearable device is further used for receiving the relocation data sent by the second control end, and calculating attitude data of the wearable device according to the relocation data and the motion data.
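A hedged sketch of the relocation step performed on the second control end side (claims 8 and 9): match the descriptors of the latest environment image against stored key frames and return the pose of the best match as relocation data. The key frame record layout and the match thresholds are assumptions.

    import cv2

    def relocate(query_descriptors, key_frames):
        """Sketch only. key_frames: list of dicts {'descriptors': ..., 'pose': ...}.
        Returns the pose of the best-matching key frame, or None."""
        if query_descriptors is None:
            return None
        matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
        best_pose, best_score = None, 0
        for kf in key_frames:
            if kf["descriptors"] is None:
                continue
            matches = matcher.match(query_descriptors, kf["descriptors"])
            score = sum(1 for m in matches if m.distance < 40)  # assumed threshold
            if score > best_score:
                best_score, best_pose = score, kf["pose"]
        return best_pose if best_score >= 20 else None  # assumed acceptance count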
10. The processing system of claim 8 or 9,
wherein, when performing environment map processing according to the environment image data, the second control end is configured to match the environment image corresponding to the environment image data against the key frame information included in the generated map data, and to perform environment map loop processing when the current environment map processing state is determined to be a loop processing state according to the matching result.
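The loop check of claim 10 can likewise be illustrated, under assumed thresholds, by counting descriptor matches between the current environment image and older key frames; a sufficiently strong match against an old key frame is treated as a loop and would trigger the loop processing.

    def detect_loop(current_descriptors, key_frames, matcher, min_matches=50):
        """Sketch only: return the index of a key frame that the current image
        appears to close a loop with, or None if no loop is detected."""
        for idx, kf in enumerate(key_frames[:-10]):  # skip the most recent key frames
            matches = matcher.match(current_descriptors, kf["descriptors"])
            good = [m for m in matches if m.distance < 40]  # assumed distance threshold
            if len(good) >= min_matches:
                return idx
        return None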
11. The processing system according to claim 8 or 9, wherein the environment image data includes feature points in an environment image corresponding to the environment image data and image position information of the feature points;
wherein, when performing environment map processing according to the environment image data, the second control end is configured to update the environment map according to the feature points in the environment image corresponding to the environment image data and the image position information of the feature points.
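For claim 11, a minimal sketch of how the second control end might fold the received feature points and their image position information into an environment map; the map structure is assumed, and triangulation into 3D map points is deliberately left out.

    class EnvironmentMap:
        """Toy map for illustration: feature points keyed by an id, storing the
        latest image position and an observation count (structure assumed)."""

        def __init__(self):
            self.points = {}

        def update(self, feature_points):
            # feature_points: iterable of (point_id, (u, v)) image observations.
            for point_id, image_pos in feature_points:
                entry = self.points.setdefault(point_id, {"obs": 0, "image_pos": None})
                entry["image_pos"] = image_pos
                entry["obs"] += 1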
12. A method for processing environment image data, the method being performed by the wearable device according to any one of claims 1 to 6, the method comprising:
acquiring environment image data, and generating a map processing message according to the environment image data;
acquiring motion data, and calculating to obtain attitude data of the wearable device according to the environment image data and the motion data;
and sending the attitude data to a first control end, and sending the map processing message to a second control end, wherein the map processing message comprises environment image data, and the map processing message is used for indicating the second control end to perform environment map processing according to the environment image data to generate map data.
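Read as a per-frame loop on the wearable device, the method of claim 12 could be strung together as in the sketch below; every helper and endpoint name is an assumption used only to show the order of the claimed steps.

    def process_frame(camera, imu, first_control_end, second_control_end,
                      compute_pose, build_map_processing_message):
        """Sketch only: one iteration of acquire, fuse and distribute."""
        env_image_data = camera.read()    # acquire environment image data
        motion_data = imu.read()          # acquire motion data
        pose_data = compute_pose(env_image_data, motion_data)
        first_control_end.send(pose_data)  # attitude data to the first control end
        second_control_end.send(build_map_processing_message(env_image_data))  # to the second control end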
13. A computer storage medium storing program instructions which, when loaded and executed by a processor, implement the method for processing environment image data according to claim 12.
CN201810149313.1A 2018-02-13 2018-02-13 Wearable device, environment image data processing system, method and readable medium Active CN108427479B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810149313.1A CN108427479B (en) 2018-02-13 2018-02-13 Wearable device, environment image data processing system, method and readable medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810149313.1A CN108427479B (en) 2018-02-13 2018-02-13 Wearable device, environment image data processing system, method and readable medium

Publications (2)

Publication Number Publication Date
CN108427479A CN108427479A (en) 2018-08-21
CN108427479B true CN108427479B (en) 2021-01-29

Family

ID=63157007

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810149313.1A Active CN108427479B (en) 2018-02-13 2018-02-13 Wearable device, environment image data processing system, method and readable medium

Country Status (1)

Country Link
CN (1) CN108427479B (en)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109147058B (en) * 2018-08-31 2022-09-20 腾讯科技(深圳)有限公司 Initialization method and device for visual and inertial navigation information fusion and storage medium
CN109255817A (en) * 2018-09-14 2019-01-22 北京猎户星空科技有限公司 A kind of the vision method for relocating and device of smart machine
CN109544615B (en) * 2018-11-23 2021-08-24 深圳市腾讯信息技术有限公司 Image-based repositioning method, device, terminal and storage medium
CN109788194B (en) * 2018-12-27 2020-08-25 北京航空航天大学 Adaptive wearable device subjective visual angle image acquisition method
CN115039015A (en) * 2020-02-19 2022-09-09 Oppo广东移动通信有限公司 Pose tracking method, wearable device, mobile device and storage medium
US12039674B2 (en) * 2020-09-18 2024-07-16 Apple Inc. Inertial data management for extended reality for moving platforms
US12048039B2 (en) 2022-08-26 2024-07-23 Htc Corporation Electronic system, control method and non- transitory computer readable storage medium

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120299962A1 (en) * 2011-05-27 2012-11-29 Nokia Corporation Method and apparatus for collaborative augmented reality displays
US9300880B2 (en) * 2013-12-31 2016-03-29 Google Technology Holdings LLC Methods and systems for providing sensor data and image data to an application processor in a digital image format
US10121063B2 (en) * 2015-01-12 2018-11-06 BMT Business Meets Technology Holding AG Wink gesture based control system
CN107270900A (en) * 2017-07-25 2017-10-20 广州阿路比电子科技有限公司 A kind of 6DOF locus and the detecting system and method for posture

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140240223A1 (en) * 2013-02-22 2014-08-28 Thalmic Labs Inc. Method and apparatus for analyzing capacitive emg and imu sensor signals for gesture control
CN106774844A (en) * 2016-11-23 2017-05-31 上海创米科技有限公司 A kind of method and apparatus for virtual positioning
CN106507092A (en) * 2016-11-29 2017-03-15 歌尔科技有限公司 Camera head and its image processing method, virtual reality device
CN206541019U (en) * 2017-03-03 2017-10-03 北京国承万通信息科技有限公司 Position optical signal launch equipment, optical positioning system and virtual reality system
CN107160395A (en) * 2017-06-07 2017-09-15 中国人民解放军装甲兵工程学院 Map constructing method and robot control system
CN107610175A (en) * 2017-08-04 2018-01-19 华南理工大学 The monocular vision SLAM algorithms optimized based on semi-direct method and sliding window
CN107657640A (en) * 2017-09-30 2018-02-02 南京大典科技有限公司 Intelligent patrol inspection management method based on ORB SLAM

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
"基于部分惯性传感器信息的单目视觉同步定位与地图创建方法";顾照鹏等;《计算机辅助设计与图形学学报》;20120215;第155-160页 *

Also Published As

Publication number Publication date
CN108427479A (en) 2018-08-21

Similar Documents

Publication Publication Date Title
CN108427479B (en) Wearable device, environment image data processing system, method and readable medium
JP6979475B2 (en) Head-mounted display tracking
EP3469458B1 (en) Six dof mixed reality input by fusing inertial handheld controller with hand tracking
CN110140099B (en) System and method for tracking controller
EP3486707B1 (en) Perception based predictive tracking for head mounted displays
US11893702B2 (en) Virtual object processing method and apparatus, and storage medium and electronic device
CN112328078A (en) Method of processing three-dimensional user input
TW201346640A (en) Image processing device, and computer program product
US11792517B2 (en) Pose tracking for rolling shutter camera
US20210041942A1 (en) Sensing and control method based on virtual reality, smart terminal, and device having storage function
EP3508951B1 (en) Electronic device for controlling image display based on scroll input
CN115039015A (en) Pose tracking method, wearable device, mobile device and storage medium
EP3627289A1 (en) Tracking system and tracking method using the same
WO2015093130A1 (en) Information processing device, information processing method, and program
CN110688002B (en) Virtual content adjusting method, device, terminal equipment and storage medium
WO2023250267A1 (en) Robotic learning of tasks using augmented reality
WO2017061890A1 (en) Wireless full body motion control sensor
EP3840371A1 (en) Image display method, device, and system
US11436818B2 (en) Interactive method and interactive system
CN112308981A (en) Image processing method, image processing device, electronic equipment and storage medium
US11216165B2 (en) Content processing method and electronic device for supporting same
KR20210050321A (en) apparatus and method of displaying three dimensional augmented reality
CN111489376A (en) Method and device for tracking interactive equipment, terminal equipment and storage medium
CN114201028B (en) Augmented reality system and method for anchoring display virtual object thereof
CN117321472A (en) Post-warping to minimize delays in moving objects

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant