WO2021253195A1 - Positioning system and method for operating the positioning system - Google Patents

Positioning system and method for operating the positioning system

Info

Publication number
WO2021253195A1
Authority
WO
WIPO (PCT)
Prior art keywords
unit
positioning
positioning system
inertial
visual
Prior art date
Application number
PCT/CN2020/096225
Other languages
French (fr)
Inventor
Marc Patrick ZAPF
Wei Wang
Hao Sun
Erik EINHOM
Original Assignee
Robert Bosch Gmbh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch Gmbh
Priority to PCT/CN2020/096225
Publication of WO2021253195A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1656 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with passive imaging devices, e.g. cameras

Definitions

  • the disclosure mainly relates to a positioning system and a method for operating the positioning system.
  • the existing positioning techniques mainly based on visual-inertial odometers fail to accurately determine the scale of the environment and of the estimated trajectory.
  • the existing positioning techniques mainly based on LIDAR or high-performance cameras have relatively high cost.
  • the size of the existing positioning system is relatively large.
  • the disclosure is aimed at providing a positioning system and a method for operating the positioning system, which could provide accurate positioning information of objects in an area, in particular in real time.
  • the area is, for example, an industrial area, in particular a workshop area.
  • a positioning system for providing positioning information of an object in an area.
  • the positioning system comprises a visual-inertial odometer unit configured to provide pose information of the object in the area, an ultra-wideband positioning unit configured to provide position information of the object in the area and a computing unit configured to fuse the pose information provided by the visual-inertial odometer unit and the position information provided by the ultra-wideband positioning unit into positioning information of the object in the area.
  • the visual-inertial odometer unit comprises a stereo camera, an inertial measurement unit and a video processor.
  • the stereo camera is configured to provide visual data of the object.
  • the inertial measurement unit is configured to provide inertial data of the object.
  • the video processor is configured to calculate the pose information based on the visual data and the inertial data.
  • the pose information provided by the visual-inertial odometer unit and the position information provided by the ultra-wideband positioning unit are fused, so that the accurate real-time positioning information with corrected scale could be obtained.
  • the calculation of the pose information on the video processor could significantly reduce the calculation load of the computing unit and the communication load of the system, so that the total cost of the positioning system could be reduced, and the portability of the positioning system could be improved.
  • the pose information provided by the visual-inertial odometer unit represents a 6-DOF pose of the object in the area.
  • the position information provided by the ultra-wideband positioning unit is two-dimensional or three-dimensional absolute position information, in particular in the world coordinate system.
  • the ultra-wideband positioning unit may comprise at least one tag arranged on the object and at least four anchors arranged at different locations in the area.
  • the ultra-wideband positioning unit may further comprise a control unit connected to each anchor, wherein the control unit is configured to calculate the position information of the object based on signals received by each anchor from the tag by the Time of Flight (ToF) method or the Time Difference of Arrival (TDoA) method, and transmit the calculated position information back to the tag.
  • the position information may then be provided to the computing unit of the positioning system.
  • the tag of the ultra-wideband positioning unit is configured to calculate its own position information, and thus the position information of the object, based on signals received from the anchors (“self-localizing”).
  • the computing unit may be a microcomputer operating on the basis of Linux.
  • the computing unit may be coupled to the visual-inertial odometer unit and/or the ultra-wideband positioning unit through USB.
  • the positioning system may further comprise an output interface.
  • the output interface may transmit the fused positioning information to a main controller or a cloud server or other applications.
  • the fused positioning information may be transmitted, in particular using the UWB (Ultra-Wideband) protocol, from the tag of the ultra-wideband positioning unit (via at least one anchor) (“back-transmission”) to a main controller or a cloud server etc. for further applications or analysis.
  • the stereo camera may have two fisheye camera units.
  • the inertial measurement unit may comprise at least one acceleration sensor and at least one gyroscope.
  • the stereo camera, the inertial measurement unit and the video processor may be integrated in the visual-inertial odometer unit. Therefore, the visual-inertial odometer unit may be constructed as an integrated component with low power consumption and small size.
  • the video processor may be constructed as a system-on-chip (SOC) component.
  • the video processor could perform image processing and computer vision computation with high efficiency.
  • a method for operating the above-described positioning system comprises the following steps: providing, by the visual-inertial odometer unit, pose information of the object in an area; providing, by the ultra-wideband positioning unit, position information of the object in the area; fusing, by the computing unit, the pose information and the position information into positioning information of the object in the area.
  • the pose information is calculated by the video processor based on the visual data provided by the stereo camera and the inertial data provided by the inertial measurement unit. Consequently, the positioning information obtained by the computing unit is stable, scale-accurate, real-time positioning information of the object in the world coordinate system.
  • the position information provided by the ultra-wideband positioning unit may be filtered before being transmitted to the computing unit.
  • the fused positioning information obtained by the computing unit may be transmitted to a main controller or a cloud server or other applications.
  • the fused positioning information may be transmitted, in particular using the UWB (Ultra-Wideband) protocol, from the tag of the ultra-wideband positioning unit (via at least one anchor) (“back-transmission”) to a main controller or a cloud server etc. for further applications or analysis.
  • a personal gear at least comprises an above-described positioning system and a personal clothing.
  • the positioning system may be detachably mounted on the personal clothing.
  • the personal clothing may be personal protective equipment, for example a helmet, a protective suit, a belt, or the like.
  • a mobile device at least comprises an above-described positioning system and a mobile working apparatus.
  • the positioning system may be detachably mounted on the mobile working apparatus.
  • a computer-readable storage medium may have stored thereon a computer program, wherein the computer program, when executed by a processor, carries out the steps of the above-described method.
  • the positioning system or the method for operating the positioning system according to the disclosure has at least the following advantages:
  • the relative positioning information obtained from the visual-inertial odometer unit and the absolute position information obtained from the ultra-wideband positioning unit are fused, so that the scaling problem of the visual-inertial odometer system and the drift problem of relative localization and the inaccuracy of the ultra-wideband positioning (for example, due to non-line-of-sight between tag and anchors) could be avoided;
  • the total cost and size of the system could be reduced by using a video processor, particularly an integrated video processor;
  • the fused real-time positioning information could be transmitted by means of the ultra-wideband system using the UWB-protocol;
  • Fig. 1 schematically shows a block diagram of a positioning system 100 according to an embodiment of the disclosure
  • Fig. 2 schematically shows a block diagram of a visual-inertial odometer unit 101 according to an embodiment of the disclosure
  • Fig. 3 schematically shows a flow chart of a method for operating the positioning system according to an embodiment of the disclosure.
  • Fig. 1 schematically shows a block diagram of a positioning system 100 according to the disclosure.
  • the positioning system 100 is configured to provide positioning information of an object in an area, in particular in real time.
  • the area is, for example, an industrial area, in particular a workshop area.
  • the positioning system 100 is, for example, in particular detachably mounted on a personal clothing, so as to constitute a personal gear.
  • the personal clothing is personal protective equipment, for example a helmet, a protective suit, a belt or the like.
  • the positioning system 100 may also be integrated with a personal clothing, so as to constitute a personal gear.
  • the positioning system 100 may be mounted on a mobile working apparatus that works in a workshop.
  • the object may be, for example, a personal clothing/personal gear, a worker wearing the personal clothing/personal gear, or a mobile working apparatus/mobile device.
  • the positioning system 100 may include a visual-inertial odometer unit 101, an ultra-wideband positioning unit 102 and a computing unit 103.
  • the visual-inertial odometer unit 101 is configured to provide pose information of the object in the area.
  • the pose information represents a 6-degree-of-freedom (6-DOF) pose of the object in the area.
  • the ultra-wideband positioning unit 102 is configured to provide position information of the object in the area.
  • the position information is two-dimensional or three-dimensional absolute position information, in particular in the world coordinate system.
  • the ultra-wideband positioning unit 102 may comprise, for example, at least one tag arranged on the object and at least four anchors arranged at different locations in the area.
  • the ultra-wideband positioning unit 102 may further comprise a control unit connected to each anchor, which may be configured to calculate position information of the object based on signals received by each anchor from the tag by the Time-of-Flight (ToF) method or the Time Difference of Arrival (TDoA) method, and transmit the calculated position information back to the tag.
  • the position information may then be provided to the computing unit 103.
  • the computing unit 103 is configured to fuse the pose information provided by the visual-inertial odometer unit 101 and the position information provided by the ultra-wideband positioning unit 102 into the fused positioning information of the object in the area or to generate the fused positioning information of the object in the area.
  • the computing unit 103 may be a microcomputer operating on the basis of Linux.
  • the computing unit 103 is coupled to the visual-inertial odometer unit 101 and the ultra-wideband positioning unit 102 through wired and/or wireless connection.
  • the computing unit 103 is coupled to the visual-inertial odometer unit 101 and/or the ultra-wideband positioning unit 102 through USB.
  • the positioning information obtained from the computing unit 103 is stable, scale-accurate, real-time positioning information of the object in the world coordinate system.
  • the positioning system 100 may further include an output interface (not shown), which may transmit the resulting fused positioning information to a main controller or a cloud server etc. for further applications or analysis.
  • the fused positioning information may be transmitted, in particular using the UWB (Ultra-Wideband) protocol, from the tag of the ultra-wideband positioning unit (via at least one anchor) to a main controller or a cloud server etc. for further applications or analysis.
  • Fig. 2 schematically shows a block diagram of a visual-inertial odometer unit 101 according to the disclosure.
  • the visual-inertial odometer unit 101 may include, for example, a stereo camera 1011, an inertial measurement unit 1012 and a video processor 1013.
  • the video processor 1013 is configured to calculate the pose information based on the visual data provided by the stereo camera 1011 and the inertial data provided by the inertial measurement unit 1012.
  • the calculated pose information represents a 6-DOF pose of the object in the area and is then provided to the computing unit 103.
  • the stereo camera 1011 may have two fisheye camera elements, wherein each fisheye camera element may have a field of view of at least 150 degrees.
  • the inertial measurement unit 1012 may include at least one acceleration sensor and at least one gyroscope, and could preferably provide three-axis acceleration information as well as three-axis angular velocity information.
  • the stereo camera 1011, the inertial measurement unit 1012 and the video processor 1013 are integrated in the visual-inertial odometer unit 101, so that the visual-inertial odometer unit 101 is constructed as an integrated component with low power consumption and small size.
  • the video processor 1013 is constructed as a system-on-chip (SOC) component that could perform image processing and computer vision calculations with high efficiency.
  • Fig. 3 shows a method for positioning, in particular by means of the positioning system according to the disclosure, or for operating the positioning system according to the disclosure.
  • step 301 the pose information of the object in the area is provided by the visual-inertial odometer unit 101.
  • the pose information is calculated by a video processor 1013 based on visual data provided by the stereo camera 1011 and inertial data provided by an inertial measurement unit 1012.
  • the pose information represents a 6-DOF pose of the object in the area.
  • step 302 the position information of the object in the area is provided by an ultra-wideband positioning unit 102.
  • the position information is two-dimensional or three-dimensional absolute position information, in particular in the world coordinate system.
  • the position information provided by the ultra-wideband positioning unit 102 is filtered before being transmitted to the computing unit 103.
  • step 303 the pose information and the position information are fused by a computing unit into the positioning information of the object in the area or to generate the fused positioning information of the object in the area.
  • the positioning information obtained by the computing unit 103 is stable, scale-accurate, real-time positioning information of the object in the world coordinate system.
  • the fused positioning information obtained from the computing unit 103 may be transmitted to a main controller or a cloud server etc. for further applications or analysis.
  • the fused positioning information may be transmitted, in particular using the UWB (Ultra-Wideband) protocol, from the tag of the ultra-wideband positioning unit (via at least one anchor) to a main controller or a cloud server etc. for further applications or analysis.

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Analysis (AREA)

Abstract

A positioning system (100) for providing positioning information of an object in an area is disclosed. The positioning system (100) comprises a visual-inertial odometer unit (101) configured to provide pose information of the object in the area, an ultra-wideband positioning unit (102) configured to provide position information of the object in the area, and a computing unit (103) configured to fuse the pose information provided by the visual-inertial odometer unit (101) and the position information provided by the ultra-wideband positioning unit (102) into positioning information of the object in the area, wherein the visual-inertial odometer unit (101) comprises a stereo camera (1011), an inertial measurement unit (1012) and a video processor (1013), wherein the stereo camera (1011) is configured to provide visual data of the object, wherein the inertial measurement unit (1012) is configured to provide inertial data of the object, and wherein the video processor (1013) is configured to calculate the pose information based on the visual data and the inertial data. A method for operating a positioning system (100) is also disclosed. A personal gear, a mobile device and a computer-readable storage medium are also disclosed.

Description

Positioning system and method for operating the positioning system
Technical Field
The disclosure mainly relates to a positioning system and a method for operating the positioning system.
Background Art
Nowadays, different solutions have been developed for the real-time localization of workers or mobile working devices in industrial scenarios, in particular in workshop areas. However, the existing positioning techniques mainly based on visual-inertial odometers fail to accurately determine the scale of the environment and of the estimated trajectory. The existing positioning techniques mainly based on LIDAR or high-performance cameras have relatively high cost. In addition, the size of the existing positioning system is relatively large.
Summary of the Disclosure
The disclosure is aimed at providing a positioning system and a method for operating the positioning system, which could provide accurate positioning information of objects in an area, in particular in real time. The area is, for example, an industrial area, in particular a workshop area.
According to one aspect of the disclosure, a positioning system for providing positioning information of an object in an area is disclosed. The positioning system comprises a visual-inertial odometer unit configured to provide pose information of the object in the area, an ultra-wideband positioning unit configured to provide position information of the object in the area and a computing unit configured to fuse the pose information provided by the visual-inertial odometer unit and the position information provided by the ultra-wideband positioning unit into positioning information of the object in the area. The visual-inertial odometer unit comprises a stereo camera, an inertial measurement unit and a video processor. The stereo camera is configured to provide visual data of the object. The inertial measurement unit is configured to provide inertial data of the object. The video processor is configured to calculate the pose information based on the visual data and the inertial data.
According to the disclosure, the pose information provided by the visual-inertial odometer unit and the position information provided by the ultra-wideband positioning unit are fused, so that the accurate real-time positioning information with corrected scale could be obtained. Meanwhile, the calculation of the pose information on the video processor could significantly reduce the calculation load of the computing unit and the communication load of the system, so that the total cost of the positioning system could be reduced, and the portability of the positioning system could be improved.
According to an embodiment of the positioning system, the pose information provided by the visual-inertial odometer unit represents a 6-DOF pose of the object in the area.
According to an embodiment of the positioning system, the position information provided by the ultra-wideband positioning unit is two-dimensional or three-dimensional absolute position information, in particular in the world coordinate system.
According to an embodiment of the positioning system, the ultra-wideband positioning unit may comprise at least one tag arranged on the object and at least four anchors arranged at different locations in the area.
According to an embodiment of the positioning system, the ultra-wideband positioning unit may further comprise a control unit connected to each anchor, wherein the control unit is configured to calculate the position information of the object based on signals received by each anchor from the tag by the Time of Flight (ToF) method or the Time Difference of Arrival (TDoA) method, and transmit the calculated position information back to the tag. The position information may then be provided to the computing unit of the positioning system.
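The disclosure does not fix a particular range-solving algorithm for the ToF case. As an illustrative sketch only (the function names, anchor coordinates, and the small pure-Python solver below are assumptions, not part of the claims), a tag position can be recovered from four anchor ranges by linearizing the sphere equations |x - p_i|² = d_i² against the first anchor:

```python
import math

def solve3(A, b):
    # Gaussian elimination with partial pivoting for a 3x3 system A x = b.
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(3):
        pivot = max(range(col, 3), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        for r in range(col + 1, 3):
            f = M[r][col] / M[col][col]
            for c in range(col, 4):
                M[r][c] -= f * M[col][c]
    x = [0.0] * 3
    for r in (2, 1, 0):
        x[r] = (M[r][3] - sum(M[r][c] * x[c] for c in range(r + 1, 3))) / M[r][r]
    return x

def trilaterate(anchors, distances):
    """Recover a 3-D tag position from four anchors and ToF ranges.

    Subtracting the first sphere equation from the others removes the
    quadratic term and leaves 2(p_i - p_0)·x = d_0² - d_i² + |p_i|² - |p_0|².
    """
    p0, d0 = anchors[0], distances[0]
    A, b = [], []
    for p, d in zip(anchors[1:], distances[1:]):
        A.append([2.0 * (p[k] - p0[k]) for k in range(3)])
        b.append(d0 ** 2 - d ** 2
                 + sum(p[k] ** 2 for k in range(3))
                 - sum(p0[k] ** 2 for k in range(3)))
    return solve3(A, b)
```

With more than four anchors the same linearization yields an overdetermined system that would be solved in a least-squares sense, which also averages out ToF measurement noise.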
According to an embodiment of the positioning system, the tag of the ultra-wideband positioning unit is configured to calculate its own position information, and thus the position information of the object, based on signals received from the anchors (“self-localizing”).
According to an embodiment of the positioning system, the computing unit may be a microcomputer operating on the basis of Linux.
According to an embodiment of the positioning system, the computing unit may be coupled to the visual-inertial odometer unit and/or the ultra-wideband positioning unit through USB.
According to an embodiment of the positioning system, the positioning system may further comprise an output interface. The output interface may transmit the fused positioning information to a main controller or a cloud server or other applications.
According to an embodiment of the positioning system, the fused positioning information may be transmitted, in particular using the UWB (Ultra-Wideband) protocol, from the tag of the ultra-wideband positioning unit (via at least one anchor) (“back-transmission”) to a main controller or a cloud server etc. for further applications or analysis.
According to an embodiment of the positioning system, the stereo camera may have two fisheye camera units.
According to an embodiment of the positioning system, the inertial measurement unit may comprise at least one acceleration sensor and at least one gyroscope.
According to an embodiment of the positioning system, the stereo camera, the inertial measurement unit and the video processor may be integrated in the visual-inertial odometer unit. Therefore, the visual-inertial odometer unit may be constructed as an integrated component with low power consumption and small size.
According to an embodiment of the positioning system, the video processor may be constructed as a system-on-chip (SOC) component. The video processor could perform image processing and computer vision computation with high efficiency.
According to another aspect of the disclosure, a method for operating the above-described positioning system is disclosed. The method comprises the following steps: providing, by the visual-inertial odometer unit, pose information of the object in an area; providing, by the ultra-wideband positioning unit, position information of the object in the area; fusing, by the computing unit, the pose information and the position information into positioning information of the object in the area. The pose information is calculated by the video processor based on the visual data provided by the stereo camera and the inertial data provided by the inertial measurement unit. Consequently, the positioning information obtained by the computing unit is stable, scale-accurate, real-time positioning information of the object in the world coordinate system.
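The claims leave the fusion scheme itself open. A minimal sketch of one fusion iteration, assuming a simple complementary filter over the position component (the name `fuse_step` and the gain `alpha` are illustrative choices, not from the disclosure):

```python
def fuse_step(prev_fused, vio_delta, uwb_pos, alpha=0.9):
    """One fusion iteration: propagate the previously fused position
    with the VIO pose increment, then pull the prediction towards the
    absolute UWB fix with complementary gain alpha.  A high alpha
    trusts the smooth VIO motion; the small UWB share removes drift
    and anchors the estimate to the world coordinate system."""
    predicted = [p + d for p, d in zip(prev_fused, vio_delta)]
    return [alpha * p + (1.0 - alpha) * u for p, u in zip(predicted, uwb_pos)]
```

In practice the same role is often played by a Kalman filter, where `alpha` becomes a gain derived from the VIO and UWB noise covariances rather than a fixed constant.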
According to an embodiment of the method, the position information provided by the ultra-wideband positioning unit may be filtered before being transmitted to the computing unit.
According to an embodiment of the method, the fused positioning information obtained by the computing unit may be transmitted to a main controller or a cloud server or other applications.
According to an embodiment of the method, the fused positioning information may be transmitted, in particular using the UWB (Ultra-Wideband) protocol, from the tag of the ultra-wideband positioning unit (via at least one anchor) (“back-transmission”) to a main controller or a cloud server etc. for further applications or analysis.
According to yet another aspect of the disclosure, a personal gear is disclosed. The personal gear at least comprises an above-described positioning system and a personal clothing. In particular, the positioning system may be detachably mounted on the personal clothing. In particular, the personal clothing may be personal protective equipment, for example a helmet, a protective suit, a belt, or the like.
According to yet another aspect of the disclosure, a mobile device is disclosed. The mobile device at least comprises an above-described positioning system and a mobile working apparatus. In particular, the positioning system may be detachably mounted on the mobile working apparatus.
According to yet another aspect of the disclosure, a computer-readable storage medium is disclosed. The computer-readable storage medium may have stored thereon a computer program, wherein the computer program, when executed by a processor, carries out the steps of the above-described method.
The positioning system or the method for operating the positioning system according to the disclosure has at least the following advantages:
- the relative positioning information obtained from the visual-inertial odometer unit and the absolute position information obtained from the ultra-wideband positioning unit are fused, so that the scaling problem of the visual-inertial odometer system and the drift problem of relative localization and the inaccuracy of the ultra-wideband positioning (for example, due to non-line-of-sight between tag and anchors) could be avoided;
- the total cost and size of the system could be reduced by using a video processor, particularly an integrated video processor;
- the fused real-time positioning information could be transmitted by means of the ultra-wideband system using the UWB-protocol;
- the stability and the accuracy of real-time positioning could be improved.
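To illustrate the scale-correction advantage listed above: if matched VIO and UWB trajectory samples are available and roughly rotationally aligned, the metric scale factor of the VIO trajectory has the closed-form least-squares estimate s = Σ(Δu·Δv)/Σ(Δv·Δv) over displacement pairs. The helper below is a sketch under those assumptions, not the fusion method of the disclosure:

```python
def estimate_scale(vio_pts, uwb_pts):
    """Least-squares scale s minimizing sum |s*dv_i - du_i|^2 over
    matched displacement vectors dv (VIO) and du (UWB), assuming the
    two trajectories are already rotationally aligned.  Setting the
    derivative with respect to s to zero gives
    s = sum(du . dv) / sum(dv . dv)."""
    num = den = 0.0
    for i in range(1, len(vio_pts)):
        dv = [a - b for a, b in zip(vio_pts[i], vio_pts[i - 1])]
        du = [a - b for a, b in zip(uwb_pts[i], uwb_pts[i - 1])]
        num += sum(x * y for x, y in zip(du, dv))
        den += sum(x * x for x in dv)
    return num / den
```

Using displacements rather than raw positions makes the estimate independent of any constant offset between the two coordinate frames.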
Brief Description of the Drawings
Fig. 1 schematically shows a block diagram of a positioning system 100 according to an embodiment of the disclosure,
Fig. 2 schematically shows a block diagram of a visual-inertial odometer unit 101 according to an embodiment of the disclosure,
Fig. 3 schematically shows a flow chart of a method for operating the positioning system according to an embodiment of the disclosure.
Detailed Description of Preferred Embodiments
In the following detailed description, numerous specific details are set forth to provide a thorough understanding of the claimed subject matter. The claimed subject matter, however, may be practiced without these specific details. In some instances, methods, apparatuses, or systems that would be known by one of ordinary skill have not been described in detail so as not to obscure claimed subject matter.
Fig. 1 schematically shows a block diagram of a positioning system 100 according to the disclosure. The positioning system 100 is configured to provide positioning information of an object in an area, in particular in real time. The area is, for example, an industrial area, in particular a workshop area.
The positioning system 100 is, for example, in particular detachably mounted on a personal clothing, so as to constitute a personal gear. In particular, the personal clothing is personal protective equipment, for example a helmet, a protective suit, a belt or the like. Alternatively, the positioning system 100 may also be integrated with a personal clothing, so as to constitute a personal gear. Alternatively, the positioning system 100 may be mounted on a mobile working apparatus that works in a workshop. The object may be, for example, a personal clothing/personal gear, a worker wearing the personal clothing/personal gear, or a mobile working apparatus/mobile device.
According to the disclosure, the positioning system 100 may include a visual-inertial odometer unit 101, an ultra-wideband positioning unit 102 and a computing unit 103.
The visual-inertial odometer unit 101 is configured to provide pose information of the object in the area. Preferably, the pose information represents a 6-degree-of-freedom (6-DOF) pose of the object in the area.
The ultra-wideband positioning unit 102 is configured to provide position information of the object in the area. In particular, the position information is two-dimensional or three-dimensional absolute position information, in particular in the world coordinate system. The ultra-wideband positioning unit 102 may comprise, for example, at least one tag arranged on the object and at least four anchors arranged at different locations in the area. Furthermore, the ultra-wideband positioning unit 102 may further comprise a control unit connected to each anchor, which may be configured to calculate position information of the object based on signals received by each anchor from the tag by the Time-of-Flight (ToF) method or the Time Difference of Arrival (TDoA) method, and transmit the calculated position information back to the tag. The position information may then be provided to the computing unit 103.
The computing unit 103 is configured to fuse the pose information provided by the visual-inertial odometer unit 101 and the position information provided by the ultra-wideband positioning unit 102 into the fused positioning information of the object in the area or to generate the fused positioning information of the object in the area. The computing unit 103 may be a microcomputer operating on the basis of Linux.
The computing unit 103 is coupled to the visual-inertial odometer unit 101 and the ultra-wideband positioning unit 102 through wired and/or wireless connection. Preferably, the computing unit 103 is coupled to the visual-inertial odometer unit 101 and/or the ultra-wideband positioning unit 102 through USB.
Consequently, the positioning information obtained from the computing unit 103 is stable, scale-accurate, real-time positioning information of the object in the world coordinate system.
In addition, the positioning system 100 may further include an output interface (not shown) , which may transmit the resulting fused positioning information to a main controller or a cloud server etc. for further applications or analysis.
In particular, the fused positioning information may be transmitted, in particular using the UWB (Ultra-Wideband) protocol, from the tag of the ultra-wideband positioning unit (via at least one anchor) to a main controller or a cloud server etc. for further applications or analysis.
Fig. 2 schematically shows a block diagram of a visual-inertial odometer unit 101 according to the disclosure. The visual-inertial odometer unit 101 may include, for example, a stereo camera 1011, an inertial measurement unit 1012 and a video processor 1013.
The video processor 1013 is configured to calculate the pose information based on the visual data provided by the stereo camera 1011 and the inertial data provided by the inertial measurement unit 1012. Preferably, the calculated pose information represents a 6-DOF pose of the object in the area and is then provided to the computing unit 103.
In addition, the stereo camera 1011 may have two fisheye camera elements, wherein each fisheye camera element may have a field of view of at least 150 degrees.
In addition, the inertial measurement unit 1012 may include at least one acceleration sensor and at least one gyroscope, and may preferably provide three-axis acceleration information as well as three-axis angular velocity information.
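To illustrate how the three-axis angular velocity contributes to the orientation part of the 6-DOF pose, the sketch below integrates one gyroscope sample into an orientation quaternion. This is a generic strap-down propagation step, not a description of the video processor's actual algorithm; the function name and the [w, x, y, z] convention are assumptions:

```python
import numpy as np

def integrate_gyro(q, omega, dt):
    """Propagate an orientation quaternion [w, x, y, z] by one gyro sample.

    omega is the three-axis angular velocity in rad/s; the incremental
    rotation over dt is applied as a right-side quaternion product
    (body-frame rates).
    """
    rate = np.linalg.norm(omega)
    angle = rate * dt
    if angle < 1e-12:
        return q  # negligible rotation this step
    axis = np.asarray(omega, dtype=float) / rate
    half = angle / 2.0
    dq = np.concatenate(([np.cos(half)], np.sin(half) * axis))
    # Hamilton product q * dq
    w1, x1, y1, z1 = q
    w2, x2, y2, z2 = dq
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])
```

In a visual-inertial odometer this high-rate propagation is periodically corrected by the visual data, which bounds the gyroscope's integration drift.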
In addition, the stereo camera 1011, the inertial measurement unit 1012 and the video processor 1013 are integrated in the visual-inertial odometer unit 101, so that the visual-inertial odometer unit 101 is constructed as an integrated component with low power consumption and small size.
In addition, the video processor 1013 is constructed as a system-on-chip (SoC) component that can perform image processing and computer vision calculations with high efficiency.
Fig. 3 shows a method for positioning, in particular by means of the positioning system according to the disclosure, or for operating the positioning system according to the disclosure.
In step 301, the pose information of the object in the area is provided by the visual-inertial odometer unit 101.
In addition, the pose information is calculated by a video processor 1013 based on visual data provided by the stereo camera 1011 and inertial data provided by an inertial measurement unit 1012. Preferably, the pose information represents a 6-DOF pose of the object in the area.
In step 302, the position information of the object in the area is provided by an ultra-wideband positioning unit 102.
In addition, the position information is two-dimensional or three-dimensional absolute position information, in particular in the world coordinate system.
In addition, the position information provided by the ultra-wideband positioning unit 102 is filtered before being transmitted to the computing unit 103.
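The disclosure does not state which filter is applied to the raw position fixes. A sliding-window median is one plausible choice, since it suppresses the occasional large outlier caused by multipath or non-line-of-sight reception; the sketch below assumes that choice, and the class name and window size are hypothetical:

```python
from collections import deque
import numpy as np

class UwbPrefilter:
    """Sliding-window median filter for raw UWB position fixes.

    Illustrative sketch: a per-axis median over the last few fixes
    rejects isolated outliers before the fix is handed to the
    computing unit for fusion.
    """
    def __init__(self, window=5):
        self.buffer = deque(maxlen=window)

    def filter(self, position):
        self.buffer.append(np.asarray(position, dtype=float))
        return np.median(np.stack(self.buffer), axis=0)
```

Unlike a moving average, the median leaves an isolated outlier with no influence at all on the output, at the cost of a small added latency of roughly half the window length.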
In step 303, the pose information and the position information are fused by the computing unit 103, so as to generate the fused positioning information of the object in the area.
The positioning information obtained by the computing unit 103 is stable, scale-accurate, real-time positioning information of the object in the world coordinate system.
In addition, the fused positioning information obtained from the computing unit 103 may be transmitted to a main controller or a cloud server etc. for further applications or analysis.
In particular, the fused positioning information may be transmitted, in particular using the UWB (Ultra-Wideband) protocol, from the tag of the ultra-wideband positioning unit (via at least one anchor) to a main controller or a cloud server etc. for further applications or analysis.
While certain embodiments have been described, these embodiments have been presented by way of example only, and are not intended to limit the scope of the disclosure. The attached claims and their equivalents are intended to cover all the modifications, substitutions and changes as would fall within the scope and spirit of the disclosure.

Claims (10)

  1. A positioning system for providing positioning information of an object in an area, the positioning system comprising
    a visual-inertial odometer unit configured to provide pose information of the object in the area;
    an ultra-wideband positioning unit configured to provide position information of the object in the area;
    a computing unit configured to fuse the pose information provided by the visual-inertial odometer unit and the position information provided by the ultra-wideband positioning unit into positioning information of the object in the area;
    wherein the visual-inertial odometer unit comprises a stereo camera, an inertial measurement unit and a video processor,
    wherein the stereo camera is configured to provide visual data of the object,
    wherein the inertial measurement unit is configured to provide inertial data of the object, and
    wherein the video processor is configured to calculate the pose information based on the visual data and the inertial data.
  2. The positioning system of claim 1, wherein the stereo camera, the inertial measurement unit and the video processor are integrated in the visual-inertial odometer unit.
  3. The positioning system of claim 1 or 2, wherein the ultra-wideband positioning unit comprises:
    at least one tag arranged on the object;
    at least four anchors arranged at different locations of the area.
  4. The positioning system of any one of claims 1 to 3, wherein the inertial measurement unit comprises at least one acceleration sensor and at least one gyroscope.
  5. The positioning system of any one of claims 1 to 4, wherein the video processor is constructed as a system-on-chip.
  6. A method for operating a positioning system of any one of claims 1 to 5, wherein the positioning system comprises a visual-inertial odometer unit, an ultra-wideband positioning unit and a computing unit, wherein the visual-inertial odometer unit comprises a stereo camera, an inertial measurement unit and a video processor, the method comprising:
    providing, by the visual-inertial odometer unit, pose information of the object in the area;
    providing, by the ultra-wideband positioning unit, position information of the object in the area;
    fusing, by the computing unit, the pose information and the position information into positioning information of the object in the area;
    wherein the pose information is calculated by the video processor based on visual data provided by the stereo camera and inertial data provided by the inertial measurement unit.
  7. The method of claim 6, wherein the position information of the object in the area provided by the ultra-wideband positioning unit is filtered before the fusing.
  8. A personal gear, comprising:
    the positioning system of any one of claims 1 to 5;
    a personal clothing, in particular a personal protective equipment;
    wherein the positioning system is detachably mounted on the personal clothing.
  9. A mobile device, comprising:
    the positioning system of any one of claims 1 to 5;
    a mobile working apparatus;
    wherein the positioning system is mounted on the mobile working apparatus.
  10. A computer-readable storage medium having stored thereon a computer program, wherein the computer program, when executed by a processor, carries out the steps of the method according to any one of claims 6 to 7.
PCT/CN2020/096225 2020-06-15 2020-06-15 Positioning system and method for operating the positioning system WO2021253195A1 (en)


Publications (1)

Publication Number Publication Date
WO2021253195A1 true WO2021253195A1 (en) 2021-12-23

Family

ID=79268958


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106028695A (en) * 2016-07-25 2016-10-12 泉州市云尚三维科技有限公司 Positioning module combination applicable to wearable intelligent clothes
CN107179080A (en) * 2017-06-07 2017-09-19 纳恩博(北京)科技有限公司 The localization method and device of electronic equipment, electronic equipment, electronic positioning system
US20170336220A1 (en) * 2016-05-20 2017-11-23 Daqri, Llc Multi-Sensor Position and Orientation Determination System and Device
CN108609034A (en) * 2016-12-30 2018-10-02 河南辉煌信通软件有限公司 Train Approaching EW system based on ultra wideband location techniques
CN110487267A (en) * 2019-07-10 2019-11-22 湖南交工智能技术有限公司 A kind of UAV Navigation System and method based on VIO&UWB pine combination


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
WANG CHEN; ZHANG HANDUO; NGUYEN THIEN-MINH; XIE LIHUA: "Ultra-wideband aided fast localization and mapping system", 2017 IEEE/RSJ INTERNATIONAL CONFERENCE ON INTELLIGENT ROBOTS AND SYSTEMS (IROS), IEEE, 24 September 2017 (2017-09-24), pages 1602 - 1609, XP033266119, DOI: 10.1109/IROS.2017.8205968 *


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 20940498; Country of ref document: EP; Kind code of ref document: A1)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 20940498; Country of ref document: EP; Kind code of ref document: A1)