CN115290069A - Multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform


Info

Publication number
CN115290069A
Authority
CN
China
Prior art keywords
mobile platform
data fusion
sensor data
handheld mobile
heterogeneous sensor
Prior art date
Legal status
Granted
Application number
CN202210869647.2A
Other languages
Chinese (zh)
Other versions
CN115290069B (en)
Inventor
季向阳
王启铭
连晓聪
王乃棒
沈永进
Current Assignee
Tsinghua University
Original Assignee
Tsinghua University
Priority date
Filing date
Publication date
Application filed by Tsinghua University
Priority to CN202210869647.2A
Publication of CN115290069A
Application granted
Publication of CN115290069B
Legal status: Active
Anticipated expiration

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/005 . with correlation of navigation data from several sources, e.g. map or contour matching
    • G01C21/10 . by using measurements of speed or acceleration
    • G01C21/12 . . executed aboard the object being navigated; Dead reckoning
    • G01C21/16 . . . by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 . . . . combined with non-inertial navigation instruments
    • G01C21/1652 . . . . . with ranging devices, e.g. LIDAR or RADAR
    • G01C21/1656 . . . . . with passive imaging devices, e.g. cameras
    • G01C21/18 . . . . Stabilised platforms, e.g. by gyroscope
    • G01C21/26 . specially adapted for navigation in a road network
    • G01C21/28 . . with correlation of data from several navigational instruments
    • G01C21/30 . . . Map- or contour-matching
    • G01C21/32 . . . . Structuring or formatting of map data
    • G01C21/38 . Electronic maps specially adapted for navigation; Updating thereof
    • G01C21/3804 . . Creation or updating of map data
    • G01C21/3833 . . . characterised by the source of data
    • G01C21/3841 . . . . Data obtained from two or more sources, e.g. probe vehicles
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/02 . Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
    • G01S13/06 . . Systems determining position data of a target
    • G01S13/50 . . Systems of measurement based on relative movement of target
    • G01S13/58 . . . Velocity or trajectory determination systems; Sense-of-movement determination systems
    • G01S13/589 . . . . measuring the velocity vector
    • G01S13/86 . Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/865 . . Combination of radar systems with lidar systems
    • G01S13/867 . . Combination of radar systems with cameras
    • G01S17/00 Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
    • G01S17/02 . Systems using the reflection of electromagnetic waves other than radio waves
    • G01S17/06 . . Systems determining position data of a target
    • G01S17/08 . . . for measuring distance only
    • G01S17/88 . Lidar systems specially adapted for specific applications
    • G01S17/93 . . for anti-collision purposes
    • G01S17/931 . . . of land vehicles
    • G01S19/00 Satellite radio beacon positioning systems; Determining position, velocity or attitude using signals transmitted by such systems
    • G01S19/38 . Determining a navigation solution using signals transmitted by a satellite radio beacon positioning system
    • G01S19/39 . . the satellite radio beacon positioning system transmitting time-stamped messages, e.g. GPS [Global Positioning System], GLONASS [Global Orbiting Navigation Satellite System] or GALILEO
    • G01S19/42 . . . Determining position
    • G01S19/45 . . . . by combining measurements of signals from the satellite radio beacon positioning system with a supplementary measurement
    • G01S19/47 . . . . . the supplementary measurement being an inertial measurement, e.g. tightly coupled inertial

Landscapes

  • Engineering & Computer Science (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Automation & Control Theory (AREA)
  • Electromagnetism (AREA)
  • Optical Radar Systems And Details Thereof (AREA)
  • Radar Systems Or Details Thereof (AREA)

Abstract

The invention discloses a multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform, comprising: a housing; a laser radar; two visible light cameras; an infrared camera; a touch display screen; an upper circuit board; a millimeter wave radar; an inertial measurement unit; a time synchronization device; a computing unit; a switch; an industrial personal computer; a lower circuit board; a power conversion device; and an external power supply device. The multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to the embodiment of the invention has the advantages of small volume, easy handheld use, high integration level, many sensor types, high space utilization, high precision, and the like.

Description

Multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform
Technical Field
The present invention relates to the technical field of autonomous driving, and in particular to a multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform.
Background
With breakthrough progress in technologies such as artificial intelligence and autonomous driving, the autonomous, intelligent development of unmanned systems faces unprecedented opportunities. For an unmanned system, accurate and reliable perception data is a prerequisite for the normal operation of the whole system. Although many individual sensors can now meet basic requirements for accuracy and real-time performance, a perception scheme based on a single sensor still has serious shortcomings; in unmanned driving, for example, even basic tasks such as lane line detection are difficult to perform robustly with a single sensor.
A single sensor working alone is subject to certain limitations. For example, a camera has high resolution and can clearly extract the features of a target object from a captured image, but it is susceptible to lighting conditions and is unsuitable for night work. A laser radar can measure the distance to a target object and handles obstacle avoidance, mapping, and positioning well, but its resolution is lower than that of a camera, and the laser beams it emits are affected by heavy fog and raindrops. A millimeter wave radar can obtain the velocity of a moving object from the Doppler effect, but its resolution is low.
Multi-source heterogeneous sensor data fusion platforms in the related art integrate multiple sensors and fuse their data, but they provide few integrated functions and a low level of integration.
Disclosure of Invention
The present invention is directed to solving at least one of the problems in the prior art. Therefore, the invention provides a multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform that has the advantages of small volume, easy handheld use, high integration level, many sensor types, high space utilization, and high precision.
To achieve the above object, an embodiment of the present invention provides a multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform, including: a housing comprising an upper layer portion, a lower layer portion, and two connecting rods, where an upper mounting cavity is provided in the upper layer portion, a lower mounting cavity is provided in the lower layer portion, the lower layer portion and the upper layer portion are spaced apart in the vertical direction, the two ends of each connecting rod are connected to the upper layer portion and the lower layer portion respectively, a grip is formed on the outer surface of each connecting rod, a wire-passing channel communicating the upper layer portion with the lower layer portion is formed inside each connecting rod, the two connecting rods are located on the two sides of the upper layer portion and of the lower layer portion respectively, and the upper surface of the lower layer portion is provided with an observation portion having an observation port communicating with the lower mounting cavity; a laser radar mounted on the upper surface of the upper layer portion; two visible light cameras mounted on the front surface of the upper layer portion and spaced apart in the horizontal direction; an infrared camera mounted on the front surface of the upper layer portion between the two visible light cameras; a satellite positioning device mounted in the upper mounting cavity; a touch display screen pivotably mounted on the lower surface of the upper layer portion and having at least a stowed position in which it lies flat against that surface; an upper circuit board mounted in the upper mounting cavity and electrically connected to the laser radar, the two visible light cameras, the infrared camera, the satellite positioning device, and the touch display screen respectively; a millimeter wave radar mounted on the front surface of the lower layer portion; an inertial measurement unit mounted in the lower mounting cavity; a time synchronization device adapted to generate a pulse per second and electrically connected to the laser radar, the two visible light cameras, the infrared camera, the satellite positioning device, the millimeter wave radar, and the inertial measurement unit respectively; a computing unit mounted in the lower mounting cavity; a switch mounted in the lower mounting cavity; an industrial personal computer mounted in the lower mounting cavity and adapted to be observed through the observation port; a lower circuit board mounted in the lower mounting cavity and electrically connected to the millimeter wave radar, the inertial measurement unit, the computing unit, the switch, and the industrial personal computer respectively, the lower circuit board being electrically connected to the upper circuit board through a wire harness passing through the wire-passing channels; a power conversion device mounted in the lower mounting cavity and electrically connected to the lower circuit board; and an external power supply device located outside the housing and electrically connected to the power conversion device through a power supply line.
The multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to the embodiment of the invention has the advantages of small volume, easy handheld use, high integration level, many sensor types, high space utilization, high precision, and the like.
In addition, the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to the above embodiment of the present invention may further have the following additional technical features:
According to one embodiment of the invention, the upper surface of the upper layer portion is provided with a laser radar mounting groove for mounting the laser radar, the front surface of the upper layer portion is provided with an infrared camera mounting port for mounting the infrared camera and two visible light camera mounting ports for mounting the visible light cameras, and the front surface of the lower layer portion is provided with a millimeter wave radar mounting port for mounting the millimeter wave radar.
According to one embodiment of the invention, the observation port is covered with a transparent partition.
According to an embodiment of the present invention, the housing further includes an upper bottom cover and a lower rear cover, the lower surface of the upper layer portion is provided with an upper opening, the rear surface of the lower layer portion is provided with a lower opening, the upper bottom cover is openably and closably mounted on the upper layer portion to open or close the upper opening, and the lower rear cover is openably and closably mounted on the lower layer portion to open or close the lower opening.
According to one embodiment of the invention, the height of the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform is less than or equal to 300 mm, and its length and width in the horizontal direction are less than or equal to 250 mm.
According to one embodiment of the invention, the weight of the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform is less than or equal to 5 kilograms.
According to one embodiment of the invention, the housing is made of a resin material.
According to one embodiment of the invention, the wall thickness of the housing is 3-5 mm.
According to one embodiment of the invention, the housing is formed as one piece.
According to one embodiment of the invention, the housing is a 3D-printed part.
Additional aspects and advantages of the invention will be set forth in part in the description which follows and, in part, will be obvious from the description, or may be learned by practice of the invention.
Drawings
The above and/or additional aspects and advantages of the present invention will become apparent and readily appreciated from the following description of the embodiments, taken in conjunction with the accompanying drawings of which:
fig. 1 is a schematic structural diagram of a housing of a multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to an embodiment of the invention.
Fig. 2 is another schematic structural diagram of the housing of the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to an embodiment of the invention.
Reference numerals: housing 10; upper layer portion 100; laser radar mounting groove 110; visible light camera mounting port 120; infrared camera mounting port 130; upper opening 140; touch screen mounting hole 150; lower layer portion 200; observation portion 210; observation port 211; millimeter wave radar mounting port 220; lower opening 230; connecting rod 300; grip 310.
Detailed Description
Reference will now be made in detail to embodiments of the present invention, examples of which are illustrated in the accompanying drawings, wherein like or similar reference numerals refer to the same or similar elements or elements having the same or similar function throughout. The embodiments described below with reference to the accompanying drawings are illustrative only for the purpose of explaining the present invention and are not to be construed as limiting the present invention.
In the description of the present invention, it is to be understood that the terms "central," "longitudinal," "lateral," "length," "width," "thickness," "upper," "lower," "front," "rear," "left," "right," "vertical," "horizontal," "top," "bottom," "inner," "outer," "clockwise," "counterclockwise," "axial," "radial," "circumferential," and the like are used in the orientations and positional relationships indicated in the drawings for convenience in describing the invention and to simplify the description, and are not intended to indicate or imply that the referenced devices or elements must have a particular orientation, be constructed and operated in a particular orientation, and are therefore not to be considered limiting of the invention. Furthermore, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless otherwise specified.
In the description of the present invention, it should be noted that, unless otherwise explicitly specified or limited, the terms "mounted," "connected," and "connected" are to be construed broadly and may be, for example, fixedly connected, detachably connected, or integrally connected; can be mechanically or electrically connected; they may be connected directly or indirectly through intervening media, or they may be interconnected between two elements. The specific meanings of the above terms in the present invention can be understood in a specific case to those of ordinary skill in the art.
The following describes a multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to an embodiment of the invention with reference to the accompanying drawings.
As shown in fig. 1 and 2, the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to the embodiment of the invention includes a housing 10, a laser radar, two visible light cameras, an infrared camera, a satellite positioning device, a touch display screen, an upper circuit board, a millimeter wave radar, an inertial measurement unit, a time synchronization device, a computing unit, a switch, an industrial personal computer, a lower circuit board, a power conversion device, and an external power supply device.
The housing 10 includes an upper layer portion 100, a lower layer portion 200, and two connecting rods 300. An upper mounting cavity is provided in the upper layer portion 100 and a lower mounting cavity in the lower layer portion 200, with the lower layer portion 200 and the upper layer portion 100 spaced apart in the vertical direction. The two ends of each connecting rod 300 are connected to the upper layer portion 100 and the lower layer portion 200 respectively, a grip 310 is formed on the outer surface of each connecting rod 300, and a wire-passing channel communicating the upper layer portion 100 with the lower layer portion 200 is formed inside each connecting rod 300. The two connecting rods 300 are located on the two sides of the upper layer portion 100 and of the lower layer portion 200 respectively. The upper surface of the lower layer portion 200 is provided with an observation portion 210 having an observation port 211 communicating with the lower mounting cavity.
The laser radar is mounted on the upper surface of the upper layer portion 100.
The two visible light cameras are mounted on the front surface of the upper layer portion 100 and are spaced apart in the horizontal direction.
The infrared camera is mounted on the front surface of the upper portion 100 and is located between the two visible light cameras.
The satellite positioning device is installed in the upper installation cavity.
The touch display screen is pivotably mounted on the lower surface of the upper layer portion 100 and has at least a stowed position in which it lies flat against the lower surface of the upper layer portion 100.
The upper circuit board is installed in the upper installation cavity and is electrically connected with the laser radar, the two visible light cameras, the infrared camera, the satellite positioning device and the touch display screen respectively.
The millimeter wave radar is mounted on the front surface of the lower layer portion 200.
The inertial measurement unit is mounted in the lower mounting cavity.
The time synchronization device is adapted to generate a pulse per second (PPS) and is electrically connected to the laser radar, the two visible light cameras, the infrared camera, the satellite positioning device, the millimeter wave radar, and the inertial measurement unit respectively.
The computing unit is mounted within the lower mounting cavity.
The switch is mounted within the lower mounting cavity.
The industrial personal computer is mounted in the lower mounting cavity and is adapted to be observed through the observation port 211.
The lower circuit board is installed in the lower installation cavity and is respectively electrically connected with the millimeter wave radar, the inertia measurement unit, the computing unit, the switch and the industrial personal computer, and the lower circuit board is electrically connected with the upper circuit board through a wire harness which penetrates through the wire passing channel.
The power conversion device is mounted in the lower mounting cavity and is electrically connected to the lower circuit board.
The external power supply device is located outside the housing 10 and is electrically connected to the power conversion device through a power supply line.
Specifically, the visible light cameras can clearly extract the features of a target object when acquiring images. The infrared camera can be used for image acquisition at night. The laser radar can measure the distance to a target object and supports obstacle avoidance, mapping, and positioning. The millimeter wave radar can obtain the velocity of a moving object from the Doppler effect. The inertial measurement unit obtains the velocity and position of the body through an accelerometer and a gyroscope and, together with the satellite positioning device, the laser radar, and other sensors, enables accurate positioning of the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform.
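To illustrate the dead-reckoning idea behind the inertial measurement unit, the following is a minimal integration sketch with illustrative names; it is not a procedure from the patent, and pure integration drifts quickly, which is precisely why the platform fuses the IMU with the satellite positioning device and laser radar.

```python
import numpy as np

def dead_reckon(accel, gyro, dt, p0, v0, R0):
    """Integrate IMU samples into position, velocity, and orientation.

    accel: (N, 3) array of body-frame specific force in m/s^2
    gyro:  (N, 3) array of body-frame angular rate in rad/s
    dt:    sample period in seconds
    p0, v0: initial position/velocity arrays (3,); R0: initial rotation (3, 3)
    """
    g = np.array([0.0, 0.0, -9.81])           # world-frame gravity
    p, v, R = p0.astype(float), v0.astype(float), R0.astype(float)
    for f_b, w_b in zip(accel, gyro):
        ang = np.linalg.norm(w_b) * dt         # rotation angle for this step
        if ang > 1e-12:
            k = w_b / np.linalg.norm(w_b)      # rotation axis
            K = np.array([[0, -k[2], k[1]],
                          [k[2], 0, -k[0]],
                          [-k[1], k[0], 0]])
            # Rodrigues formula: incremental rotation from the gyroscope
            R = R @ (np.eye(3) + np.sin(ang) * K + (1 - np.cos(ang)) * (K @ K))
        a_w = R @ f_b + g                      # world-frame acceleration
        p = p + v * dt + 0.5 * a_w * dt ** 2   # integrate position ...
        v = v + a_w * dt                       # ... then velocity
    return p, v, R
```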
The multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform can perform data fusion and collaborative perception on the data of these six sensors to obtain comprehensive multi-source data, compensating for the shortcomings of any single sensor and improving the accuracy of target detection and tracking as well as of image-based recognition and positioning. A common form of such fusion, sketched below, is to project the laser radar point cloud into the camera image using the calibrated extrinsics.
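This sketch is illustrative rather than taken from the patent; points_lidar, T_cam_lidar, and K are assumed names, with T_cam_lidar obtained from the lidar-camera calibration described later in this description.

```python
import numpy as np

def project_lidar_to_image(points_lidar, T_cam_lidar, K):
    """Project 3-D lidar points into the camera image plane.

    points_lidar: (N, 3) points in the lidar frame
    T_cam_lidar:  (4, 4) lidar-to-camera extrinsic matrix (from calibration)
    K:            (3, 3) camera intrinsic matrix
    Returns pixel coordinates and depths of the points in front of the camera.
    """
    pts_h = np.hstack([points_lidar, np.ones((len(points_lidar), 1))])
    pts_cam = (T_cam_lidar @ pts_h.T).T[:, :3]   # transform into camera frame
    in_front = pts_cam[:, 2] > 0.1               # drop points behind the camera
    uv = (K @ pts_cam[in_front].T).T
    uv = uv[:, :2] / uv[:, 2:3]                  # perspective divide to pixels
    return uv, pts_cam[in_front, 2]
```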
According to the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform of the embodiment of the invention, the housing 10 includes the upper layer portion 100, the lower layer portion 200, and the two connecting rods 300, so that some of the sensors can be installed in the upper layer portion 100 and the rest in the lower layer portion 200.
The connecting rods 300 not only connect the upper layer portion 100 and the lower layer portion 200; a grip 310 can also be provided on each connecting rod 300 so that the user can hold the rods to move the platform. Because the two ends of each connecting rod 300 are connected to the upper layer portion 100 and the lower layer portion 200 respectively, and the two connecting rods 300 are located on the two sides of both portions, the center of gravity of the platform stays steady when the connecting rods 300 are held, making the platform more stable while the user carries it. In addition, the wire-passing channel in each connecting rod 300 allows a wire harness to pass through, conveniently connecting the sensors of the upper layer portion 100 and the lower layer portion 200; this not only facilitates the electrical connection between the two portions but also improves the space utilization inside the connecting rod 300.
Because the laser radar needs to scan through 360 degrees and must not be blocked around its emitting surface, mounting it on the upper surface of the upper layer portion 100 facilitates its scanning, prevents other structures from occluding it, and ensures its reliable operation.
Mounting the infrared camera and the two visible light cameras on the front surface of the upper layer portion 100 facilitates capturing images of the scene ahead and avoids occluding the cameras; moreover, when the user holds the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform, the viewing angles of the infrared camera and the visible light cameras are close to the user's own viewing angle, which further facilitates operation.
Installing the satellite positioning device in the upper layer portion 100 reduces the occlusion of and interference with satellite signals by other structures, improving the accuracy of the satellite positioning device.
The touch display screen can display the operating and data acquisition state of the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform, and allows the user to control the platform by operating the screen. In addition, because the touch display screen is pivotably mounted on the lower surface of the upper layer portion and has at least a stowed position in which it lies flat against that surface, it can be flipped to a position suitable for viewing when it needs to be operated or observed, and flipped back to the stowed position otherwise; this prevents the screen from occupying extra space, interfering with other operations, or being scratched.
Mounting the millimeter wave radar on the front surface of the lower layer portion 200 keeps it unobstructed: since the millimeter wave radar detects the position and moving speed of objects ahead through the Doppler effect, this placement ensures its detection accuracy.
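For reference, the standard radar relation behind this (not stated in the patent) recovers the radial velocity of a target from the measured Doppler shift and the radar wavelength:

```latex
% Radial velocity v_r from the Doppler shift f_d and radar wavelength \lambda;
% the factor of 2 accounts for the two-way propagation of the reflected wave.
v_r = \frac{f_d \, \lambda}{2}
```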
The time synchronization device sends a pulse per second to each sensor so that all sensors are triggered synchronously. This ensures that detected targets do not suffer delays or mismatches caused by unsynchronized timestamps, and avoids positioning failures such as incorrect motion compensation of the inertial measurement unit during data acquisition.
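Even with a shared clock, the sensors report at different rates, so measurements must still be paired by timestamp before fusion. The following is a minimal nearest-timestamp matching sketch; it is illustrative only, as the patent does not specify the pairing algorithm:

```python
def match_by_timestamp(ts_a, ts_b, tol=0.005):
    """Pair measurements from two sensor streams by nearest timestamp.

    ts_a, ts_b: sorted sequences of timestamps in seconds
    tol:        maximum accepted offset between paired samples (here 5 ms)
    Returns a list of index pairs (i, j) with |ts_a[i] - ts_b[j]| <= tol.
    """
    pairs, j = [], 0
    for i, t in enumerate(ts_a):
        # Advance j while the next timestamp in ts_b is closer to t.
        while j + 1 < len(ts_b) and abs(ts_b[j + 1] - t) < abs(ts_b[j] - t):
            j += 1
        if len(ts_b) and abs(ts_b[j] - t) <= tol:
            pairs.append((i, j))
    return pairs
```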
The observation port 211 allows the industrial personal computer to be observed, so the user can check its running state through the observation port 211 and thereby conveniently monitor the running state of the whole multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform.
Using an external power supply device not only reduces the weight of the housing 10 and keeps a battery from occupying space inside it, reducing the volume of the housing 10, miniaturizing the housed part, and making it easy to hold; it also allows a larger-capacity power source to be connected and makes replacement and replenishment of the external power supply convenient, extending the endurance of the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform.
By integrating sensors of six different modalities on the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform as perception devices, fused perception data can be provided to complete tasks such as pedestrian and vehicle tracking, target recognition, and obstacle and lane line detection, and to generate corresponding decision instructions; redundant fused perception data can be provided to different types of robots for tasks in specific scenarios; and the collection of images, point clouds, and other data in different scenes is supported.
Therefore, the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform has the advantages of small volume, easy handheld use, high integration level, many sensor types, high space utilization, high precision, and the like.
The following further describes specific examples of the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to embodiments of the invention with reference to the accompanying drawings.
In some embodiments of the present invention, as shown in fig. 1 and fig. 2, a multi-source heterogeneous sensor data fusion and collaborative sensing handheld mobile platform according to an embodiment of the present invention includes a housing 10, a laser radar, two visible light cameras, an infrared camera, a satellite positioning device, a touch display screen, an upper circuit board, a millimeter wave radar, an inertial measurement unit, a time synchronization device, a computing unit, a switch, an industrial personal computer, a lower circuit board, a power conversion device, and an external power supply device.
Specifically, as shown in fig. 1 and 2, the upper surface of the upper layer portion 100 is provided with a laser radar mounting groove 110 for mounting the laser radar, the front surface of the upper layer portion 100 is provided with an infrared camera mounting port 130 for mounting the infrared camera and two visible light camera mounting ports 120 for mounting the visible light cameras, and the front surface of the lower layer portion 200 is provided with a millimeter wave radar mounting port 220 for mounting the millimeter wave radar. This facilitates the installation of the laser radar, the infrared camera, the visible light cameras, and the millimeter wave radar, avoids occluding them, and ensures the accuracy and reliability of their detection.
Specifically, the laser radar may protrude out of the laser radar mounting groove 110 to avoid being occluded.
Advantageously, the observation port 211 is covered with a transparent partition. This prevents foreign matter such as dust from entering the lower layer portion 200 while preserving the observation effect of the observation port 211.
More specifically, the housing 10 further includes an upper bottom cover and a lower rear cover; the lower surface of the upper layer portion 100 is provided with an upper opening 140, and the rear surface of the lower layer portion 200 is provided with a lower opening 230. The upper bottom cover is openably and closably mounted on the upper layer portion 100 to open or close the upper opening 140, and the lower rear cover is openably and closably mounted on the lower layer portion 200 to open or close the lower opening 230. This facilitates the installation of the various structures within the upper layer portion 100 and the lower layer portion 200.
Optionally, the height of the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform is less than or equal to 300 mm, and its length and width in the horizontal direction are less than or equal to 250 mm. This contributes to the miniaturization of the platform and makes it convenient to hold.
Further, the weight of the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform is less than or equal to 5 kilograms. This contributes to the light weight of the platform and makes it convenient to hold.
Fig. 1 and 2 illustrate a multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to some examples of the invention. As shown in fig. 1 and 2, the housing 10 is formed as one piece. This ensures the structural strength of the housing 10, saves assembly steps, and eliminates connecting pieces on the housing 10, further reducing its weight.
Specifically, the housing 10 is a 3D-printed part. This facilitates the shaping of the housing 10.
Alternatively, the housing 10 is made of a resin material. This facilitates the manufacture of the housing 10 while ensuring its structural strength.
Further, the wall thickness of the housing 10 is 3-5 mm. This makes it possible to reduce the weight of the housing 10 while ensuring the structural strength of the housing 10.
Specifically, the observation portion 210 protrudes upward above the upper surface of the lower portion 200. This may facilitate viewing of structures within the lower portion 200.
A touch screen mounting hole 150 is formed at the rear edge of the lower surface of the upper layer portion 100, and the touch display screen is mounted on the upper layer portion 100 by a fastener fitted into the touch screen mounting hole 150. This facilitates the mounting of the touch display screen.
The front surface of the grip 310 may be provided with a non-slip protrusion and the rear surface may be provided with a depression. This may make the grip 310 more ergonomic for the user to hold.
The housing 10 is adapted to be mounted on an unmanned vehicle to facilitate the collection of multi-source fusion data with the unmanned vehicle.
As those skilled in the art will understand, the positions of the sensors need to be calibrated before the multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform formally acquires data, so as to facilitate data fusion. The laser radar and the camera are calibrated with a PnP (Perspective-n-Point) method: corner features of a calibration board are extracted from the color image and matched with manually selected line and plane features in the laser radar point cloud, the correspondences are optimized by the least squares method, and the transfer matrix from the laser radar to the camera is finally obtained. The calibration result is verified by reprojection.
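A minimal sketch of this PnP step using OpenCV is given below. It assumes the correspondences have already been picked, and the names (obj_pts, img_pts, K, dist) are illustrative rather than from the patent:

```python
import cv2
import numpy as np

def calibrate_lidar_camera(obj_pts, img_pts, K, dist):
    """PnP calibration: estimate the lidar-to-camera transform.

    obj_pts: (N, 3) lidar-frame 3-D points picked on the calibration board
    img_pts: (N, 2) matching board corner pixels from the color image
    K, dist: camera intrinsic matrix and distortion coefficients
    Returns the 4x4 lidar->camera transform and the reprojection RMSE (px).
    """
    obj_pts = np.asarray(obj_pts, dtype=np.float64)
    img_pts = np.asarray(img_pts, dtype=np.float64)
    ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, K, dist)  # least-squares PnP
    R, _ = cv2.Rodrigues(rvec)                                # axis-angle -> matrix
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()                     # lidar -> camera
    # Verify by reprojection: small pixel residuals indicate a good calibration.
    reproj, _ = cv2.projectPoints(obj_pts, rvec, tvec, K, dist)
    err = reproj.reshape(-1, 2) - img_pts
    rmse = np.sqrt(np.mean(np.sum(err ** 2, axis=1)))
    return T, rmse
```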
The laser radar and the inertial measurement unit are calibrated following the principle of hand-eye calibration. First, figure-eight trajectory data is collected outdoors with the handheld device; a map of the surrounding environment is then built with the FAST-LIO SLAM algorithm while the laser radar odometry is recorded, yielding the motion trajectory of the laser radar. The inertial measurement unit data is recorded and solved for position at the same time. The distance between the two trajectories is then minimized by the least squares method, giving the optimal rotation and translation matrix from the laser radar to the inertial measurement unit, i.e. the calibration result.
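A minimal sketch of the least-squares trajectory alignment follows. It is a rigid Kabsch/Umeyama alignment under the assumption of time-matched position pairs; a full hand-eye formulation would instead solve AX = XB over relative motions, which the patent does not detail:

```python
import numpy as np

def align_trajectories(P, Q):
    """Least-squares rigid alignment: find R, t minimizing sum ||R p_i + t - q_i||^2.

    P: (N, 3) lidar positions from the FAST-LIO odometry
    Q: (N, 3) time-matched positions solved from the inertial measurement unit
    """
    P, Q = np.asarray(P, float), np.asarray(Q, float)
    mu_p, mu_q = P.mean(axis=0), Q.mean(axis=0)
    H = (P - mu_p).T @ (Q - mu_q)                  # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = mu_q - R @ mu_p
    return R, t
```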
After calibration is completed, various algorithms can be deployed on the industrial personal computer to verify metrics such as target detection range and positioning accuracy.
The multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to the embodiment of the invention can be held by hand to complete mapping tasks indoors and outdoors, and can also be placed on an unmanned vehicle to assist with unmanned driving tasks. It can carry out work such as three-dimensional reconstruction, and the three-dimensional maps it builds can be used in practical tasks such as building construction, factory inspection, and AR/VR. The platform can, of course, also serve as an experimental platform for research on machine vision, SLAM, three-dimensional reconstruction, and other algorithms in universities and research institutes.
Other configurations and operations of the multi-source heterogeneous sensor data fusion and collaborative awareness handheld mobile platform according to embodiments of the present invention are known to those of ordinary skill in the art and will not be described in detail herein.
In the description herein, references to the description of the term "one embodiment," "some embodiments," "an illustrative embodiment," "an example," "a specific example," or "some examples" or the like mean that a particular feature, structure, material, or characteristic described in connection with the embodiment or example is included in at least one embodiment or example of the invention. In this specification, the schematic representations of the terms used above do not necessarily refer to the same embodiment or example. Furthermore, the particular features, structures, materials, or characteristics described may be combined in any suitable manner in any one or more embodiments or examples.
While embodiments of the invention have been shown and described, it will be understood by those of ordinary skill in the art that: various changes, modifications, substitutions and alterations can be made to the embodiments without departing from the principles and spirit of the invention, the scope of which is defined by the claims and their equivalents.

Claims (10)

1. A multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform is characterized by comprising:
a housing comprising an upper layer portion, a lower layer portion, and two connecting rods, wherein an upper mounting cavity is provided in the upper layer portion, a lower mounting cavity is provided in the lower layer portion, the lower layer portion and the upper layer portion are spaced apart in the vertical direction, the two ends of each connecting rod are connected to the upper layer portion and the lower layer portion respectively, a grip is formed on the outer surface of each connecting rod, a wire-passing channel communicating the upper layer portion with the lower layer portion is formed inside each connecting rod, the two connecting rods are located on the two sides of the upper layer portion and of the lower layer portion respectively, and the upper surface of the lower layer portion is provided with an observation portion having an observation port communicating with the lower mounting cavity;
a laser radar mounted on the upper surface of the upper layer portion;
the two visible light cameras are arranged on the front surface of the upper layer part and are arranged at intervals along the horizontal direction;
the infrared camera is arranged on the front surface of the upper layer part and is positioned between the two visible light cameras;
a satellite positioning device mounted within the upper mounting cavity;
a touch display screen pivotably mounted on the lower surface of the upper layer portion and having at least a stowed position in which it lies flat against the lower surface of the upper layer portion;
the upper circuit board is arranged in the upper mounting cavity and is electrically connected with the laser radar, the two visible light cameras, the infrared camera, the satellite positioning device and the touch display screen respectively;
a millimeter wave radar mounted on a front surface of the lower layer portion;
an inertial measurement unit mounted within the lower mounting cavity;
a time synchronization device adapted to generate a pulse per second and electrically connected to the laser radar, the two visible light cameras, the infrared camera, the satellite positioning device, the millimeter wave radar, and the inertial measurement unit, respectively;
a computing unit mounted within the lower mounting cavity;
a switch mounted within the lower mounting cavity;
the industrial personal computer is installed in the lower installation cavity and is suitable for observing through the observation port;
the lower-layer circuit board is installed in the lower installation cavity and is respectively and electrically connected with the millimeter-wave radar, the inertia measurement unit, the calculation unit, the switch and the industrial personal computer, and the lower-layer circuit board is electrically connected with the upper-layer circuit board through a wire harness penetrating through the wire passing channel;
the power conversion device is arranged in the lower mounting cavity and is electrically connected with the lower-layer circuit board;
and the external power supply device is positioned outside the shell and is electrically connected with the power supply conversion device through a power supply line.
2. The multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to claim 1, wherein the upper surface of the upper layer portion is provided with a laser radar mounting groove for mounting the laser radar, the front surface of the upper layer portion is provided with an infrared camera mounting port for mounting the infrared camera and two visible light camera mounting ports for mounting the visible light cameras, and the front surface of the lower layer portion is provided with a millimeter wave radar mounting port for mounting the millimeter wave radar.
3. The multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to claim 1, wherein the observation port is covered with a transparent partition.
4. The multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to claim 1, wherein the housing further includes an upper bottom cover and a lower rear cover, an upper opening is provided on a lower surface of the upper layer portion, a lower opening is provided on a rear surface of the lower layer portion, the upper bottom cover is openably and closably mounted on the upper layer portion to open or close the upper opening, and the lower rear cover is openably and closably mounted on the lower layer portion to open or close the lower opening.
5. The multi-source heterogeneous sensor data fusion and collaborative awareness handheld mobile platform according to claim 1, wherein the multi-source heterogeneous sensor data fusion and collaborative awareness handheld mobile platform has a height of 300 mm or less and a horizontal length and width of 250 mm or less.
6. The multi-source heterogeneous sensor data fusion and collaborative awareness handheld mobile platform of claim 1, wherein a weight of the multi-source heterogeneous sensor data fusion and collaborative awareness handheld mobile platform is less than or equal to 5 kilograms.
7. The multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to claim 1, wherein the housing is made of a resin material.
8. The multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to claim 1, wherein a wall thickness of the housing is 3-5 millimeters.
9. The multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to claim 1, wherein the housing is formed as one piece.
10. The multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform according to claim 1, wherein the housing is a 3D-printed part.
CN202210869647.2A 2022-07-22 2022-07-22 Multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform Active CN115290069B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210869647.2A CN115290069B (en) 2022-07-22 2022-07-22 Multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202210869647.2A CN115290069B (en) 2022-07-22 2022-07-22 Multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform

Publications (2)

Publication Number Publication Date
CN115290069A (en) 2022-11-04
CN115290069B CN115290069B (en) 2024-06-18

Family

ID=83823414

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210869647.2A Active CN115290069B (en) 2022-07-22 2022-07-22 Multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform

Country Status (1)

Country Link
CN (1) CN115290069B (en)

Patent Citations (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103703363A * 2011-04-15 2014-04-02 FARO Technologies, Inc. Six degree-of-freedom laser tracker that cooperates with a remote sensor
CN111527463A * 2018-01-22 2020-08-11 SZ DJI Technology Co., Ltd. Method and system for multi-target tracking
WO2019209887A1 * 2018-04-23 2019-10-31 The Regents Of The University Of Colorado, A Body Corporate Mobile and augmented reality based depth and thermal fusion scan
WO2020014181A1 * 2018-07-09 2020-01-16 Siemens Aktiengesellschaft Knowledge graph for real time industrial control system security event monitoring and management
CN110873879A * 2018-08-30 2020-03-10 Shenyang Aerospace University Device and method for deep feature fusion of multi-source heterogeneous sensors
CN109583383A * 2018-11-30 2019-04-05 Hunan Novasky Electronic Technology Co., Ltd. Unmanned aerial vehicle life detection method and system based on multi-source sensors
CN113632030A * 2018-12-27 2021-11-09 Magic Leap, Inc. System and method for virtual reality and augmented reality
CN113795773A * 2019-03-08 2021-12-14 OSRAM GmbH Component for a LIDAR sensor system, LIDAR sensor device, method for a LIDAR sensor system and method for a LIDAR sensor device
CN110900618A * 2019-10-22 2020-03-24 Shanghai Donggu Intelligent Technology Co., Ltd. Robot-based automatic inspection system
CN112687113A * 2020-12-31 2021-04-20 Beijing Nebula Link Technology Co., Ltd. Roadside information sensing device
CN113313154A * 2021-05-20 2021-08-27 Sichuan Tianao Aerospace Information Technology Co., Ltd. Integrated multi-sensor automatic driving intelligent sensing device
CN114397877A * 2021-06-25 2022-04-26 Nanjing Vocational Institute of Transport Technology Intelligent automobile automatic driving system
CN215678754U * 2021-09-07 2022-01-28 Shandong Jiahe Rail Vehicle Service Co., Ltd. Portable three-dimensional point cloud laser radar detection and early warning device
CN113884090A * 2021-09-28 2022-01-04 Institute of Advanced Technology, University of Science and Technology of China Intelligent platform vehicle environment sensing system and data fusion method thereof
CN114298142A * 2021-11-22 2022-04-08 Ligong Leike Zhitu (Tai'an) Automotive Technology Co., Ltd. Multi-source heterogeneous sensor information fusion method and device for camera and millimeter wave radar

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhiwei Zhong; Xianming Liu; Junjun Jiang; Debin Zhao; Zhiwen Chen; Xiangyang Ji: "High-Resolution Depth Maps Imaging via Attention-Based Hierarchical Multi-Modal Fusion", IEEE Transactions on Image Processing, vol. 31, 8 December 2021 (2021-12-08) *
Miao Meng; Wu Kejie: "Latest Developments in Airborne Detection Systems", International Aviation, no. 04, 8 April 2005 (2005-04-08) *

Also Published As

Publication number Publication date
CN115290069B (en) 2024-06-18

Similar Documents

Publication Publication Date Title
US10914569B2 (en) System and method of defining a path and scanning an environment
CN106774436B Vision-based control system and method for stable target tracking by a rotor unmanned aerial vehicle
CN109730590B (en) Cleaning robot and method for automatically returning and charging same
CN105492985B System and method for controlling a movable object in an environment
US11692811B2 (en) System and method of defining a path and scanning an environment
KR102608046B1 (en) Guidance robot for airport and method thereof
EP3637141A1 (en) A system and method of defining a path and scanning an environment
JP2022554248A (en) Structural scanning using unmanned air vehicles
CN107995962B (en) Obstacle avoidance method and device, movable object and computer readable storage medium
CN113140040A (en) Multi-sensor fusion coal mine underground space positioning and mapping method and device
CN108814452A Sweeping robot and obstacle detection method thereof
CN113566833A (en) Multi-sensor fusion vehicle positioning method and system
CN112828853A (en) Indoor autonomous mobile robot
CN111251271B SLAM robot using a rotating laser radar for indoor map construction and positioning
CN115435772A (en) Method and device for establishing local map, electronic equipment and readable storage medium
US20200056890A1 (en) Mapping and tracking methods and systems principally for use in connection with swimming pools and spas
CN208198848U Airborne aerial photography device and unmanned aerial vehicle containing the same
CN113081525A (en) Intelligent walking aid equipment and control method thereof
CN115290069A (en) Multi-source heterogeneous sensor data fusion and collaborative perception handheld mobile platform
CN113140039A (en) Multi-sensor fusion underground coal mine digital positioning and map construction system
KR101350930B1 Aerial photography system for capturing, processing and editing shot images
Jensen et al. Laser range imaging using mobile robots: From pose estimation to 3d-models
CN208188678U Unmanned aerial vehicle positioning device and unmanned aerial vehicle
CN116352722A (en) Multi-sensor fused mine inspection rescue robot and control method thereof
CN216265979U (en) Indoor autonomous mobile robot

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant