CN113177973A - Multi-data fusion processing system and method - Google Patents

Multi-data fusion processing system and method

Info

Publication number
CN113177973A
CN113177973A
Authority
CN
China
Prior art keywords
data
sensor
distance
format
thermal radiation
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110563807.6A
Other languages
Chinese (zh)
Inventor
赵雪香
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sichuan Qiliwei Innovation Technology Co ltd
Original Assignee
Sichuan Qiliwei Innovation Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sichuan Qiliwei Innovation Technology Co ltd filed Critical Sichuan Qiliwei Innovation Technology Co ltd
Priority to CN202110563807.6A
Publication of CN113177973A
Legal status: Pending

Classifications

    • G06T7/248: Analysis of motion using feature-based methods, e.g. the tracking of corners or segments, involving reference images or patches
    • G01S11/12: Systems for determining distance or velocity not using reflection or reradiation, using electromagnetic waves other than radio waves
    • G01S13/865: Combination of radar systems with lidar systems
    • G01S13/867: Combination of radar systems with cameras
    • G01S13/931: Radar or analogous systems specially adapted for anti-collision purposes of land vehicles
    • G01S17/931: Lidar systems specially adapted for anti-collision purposes of land vehicles
    • G06F16/29: Geographical information databases
    • G06T7/277: Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G06T7/74: Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G06T2207/30181: Earth observation (indexing scheme for image analysis)
    • G06T2207/30241: Trajectory (indexing scheme for image analysis)

Abstract

The invention discloses a multi-data fusion processing system and method. The system comprises a plurality of sensors that acquire environmental data, sensor processing modules that convert the corresponding sensor data into a common format, a central processing module that processes the common-format data to generate an electronic map and identification data, and a central fusion module that establishes the correspondence between the identification data and the electronic map data. Multi-data fusion processing is thereby realized, providing multi-faceted reference information for the operation of a robot.

Description

Multi-data fusion processing system and method
Technical Field
The invention relates to the field of sensor data processing, and in particular to a multi-data fusion processing system and method employing multiple types of sensors.
Background
As the uses of intelligent robots grow more diverse and refined, demand for them is also increasing. However, no matter how intelligent a robot is, its control ultimately rests on algorithms that operate on monitoring data.
Data monitoring is mainly achieved by various sensors, such as depth cameras, infrared sensors, ultrasonic sensors, and laser radars, which are widely used in advanced driver-assistance systems (ADAS) and automatic driving systems, robots, automated guided vehicles (AGVs), intelligent gardening devices, and other systems requiring environmental awareness.
However, different types of sensors produce data in different formats, which greatly hinders the comprehensive use of the various data and the improvement of precise robot control. In fields that require real-time judgment or involve complex environments, failing to exploit the complementary strengths of multiple data sources can lead to judgment errors or delayed decisions with disastrous results. In particular, monitoring map information of the surrounding environment together with moving objects requires different monitoring data from multiple sensors at the same time for the comprehensive analysis that determines the robot's operation.
Disclosure of Invention
To at least partially solve the above problems, the present invention provides a multi-data fusion processing system, comprising: a plurality of sensors for acquiring environmental data, the environmental data including distance data and moving object data; sensor processing modules for processing the corresponding environmental data into first-format data and transmitting it to a database, the first-format data including a sensor identifier; a central processing module for matching preset sensor identifiers against the first-format data, retrieving the distance data to form electronic map data, and retrieving the moving object data to form identification data, the identification data including a relative distance and thermal radiation parameters; and a central fusion module for generating a motion trajectory of the moving object from the relative distance, and determining the relative position of the motion trajectory within the electronic map data from the correspondence between the distance data and the relative distance.
Further, the sensors are a laser sensor, a radar sensor and a thermal radiation imager. The sensor processing modules acquire the distance data from the laser sensor, the resulting first-format data including the distance data and a laser sensor identifier; acquire the moving object data from the radar sensor, the resulting first-format data including the relative distance and a radar sensor identifier; and acquire the moving object data from the thermal radiation imager, the resulting first-format data including the thermal radiation parameters and a thermal radiation imager identifier. The laser sensor, the radar sensor and the thermal radiation imager cooperatively monitor the same area.
Further, the central processing module matches the preset laser sensor identifier against the first-format data and retrieves the distance data to generate electronic map data using a SLAM algorithm; matches the preset radar sensor identifier against the first-format data and retrieves the relative distance to generate a motion trajectory using a Kalman filtering algorithm; and matches the preset thermal radiation imager identifier against the first-format data and retrieves the thermal radiation parameters to generate an imaging graph. The identification data includes the motion trajectory and the imaging graph.
Further, the central fusion module compares a preset biological image library with the imaging graph to generate biological species information.
Further, when the sensor processing module detects a data anomaly while processing the environmental data, it adds early-warning information to the first-format data.
In another aspect, the present invention provides a multi-data fusion processing method, comprising: acquiring environmental data by sensor monitoring, the environmental data including distance data and moving object data; processing the corresponding environmental data into first-format data and transmitting it to a database, the first-format data including a sensor identifier; matching preset sensor identifiers against the first-format data, retrieving the distance data to form electronic map data, and retrieving the moving object data to form identification data, the identification data including a relative distance and thermal radiation parameters; generating a motion trajectory of the moving object from the relative distance; and determining the relative position of the motion trajectory within the electronic map data from the correspondence between the distance data and the relative distance.
Further, the distance data is acquired from a laser sensor, the resulting first-format data including the distance data and a laser sensor identifier; the moving object data is acquired from a radar sensor, the resulting first-format data including the relative distance and a radar sensor identifier; and the moving object data is acquired from a thermal radiation imager, the resulting first-format data including the thermal radiation parameters and a thermal radiation imager identifier. The laser sensor, the radar sensor and the thermal radiation imager cooperatively monitor the same area.
Further, the preset laser sensor identifier is matched against the first-format data and the distance data is retrieved to generate electronic map data using a SLAM algorithm; the preset radar sensor identifier is matched against the first-format data and the relative distance is retrieved to generate a motion trajectory using a Kalman filtering algorithm; and the preset thermal radiation imager identifier is matched against the first-format data and the thermal radiation parameters are retrieved to generate an imaging graph.
Further, biological species information is generated based on a comparison between a preset biological image library and the imaging graph.
Further, when a data anomaly is found while processing the environmental data, early-warning information is added to the first-format data.
In the multi-data fusion processing system and method provided by the invention, a dedicated processing module is provided for each type of sensor, so that the monitoring data are unified into the same data format and tagged with sensor identifiers, making the data easy to retrieve and laying the foundation for data fusion. The sensors transmit the first-format data to the same database; the central processing module retrieves the corresponding first-format data by sensor identifier to generate electronic map data and identification data; and the central fusion module establishes the correspondence between the identification data and the electronic map data. Fusion processing of multiple data is thereby realized, providing multi-faceted reference information for the operation of the robot.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application. In the drawings:
Fig. 1 is a schematic diagram of the configuration of the multi-data fusion processing system of the present invention.
Detailed Description
It should be noted that the embodiments and features of the embodiments in the present application may be combined with each other without conflict. The term "first" in the description and claims of this application and in the above-described drawings carries no special meaning and does not denote any particular order or sequence; "plurality" means two or more.
In order to make the technical solutions better understood by those skilled in the art, the technical solutions in the embodiments of the present application will be clearly and completely described below with reference to the drawings. It is apparent that the described embodiments are only some of the embodiments of the present application, not all of them. All other embodiments obtained by a person skilled in the art from the embodiments given herein without creative effort shall fall within the protection scope of the present application.
As shown in Fig. 1, the multi-data fusion processing system includes a plurality of sensors, a plurality of sensor processing modules, a central processing module, and a central fusion module.
The plurality of sensors acquire environmental data, the environmental data including distance data and moving object data. More than one sensor may monitor the distance data; such sensors can be installed at fixed positions in the surrounding environment or mounted on the robot. Similarly, more than one sensor may monitor the moving object data, installed at fixed locations in the surrounding environment or on the robot; and different types of sensors can be used to obtain monitoring data of different aspects so as to monitor the moving object more comprehensively.
Specifically, a sensor may be a radar, the corresponding data being the relative distance, relative movement speed, or radar cross section (RCS) data of the target captured by the radar; or a camera, the corresponding data being the image information captured by the camera. The data monitored by each sensor can be designated as distance data or moving object data according to actual needs.
The sensor processing modules process the corresponding environmental data into first-format data and transmit it to the database, the first-format data including a sensor identifier. Different types of sensors have different data formats, and even different models of the same sensor type do not output exactly the same format. In this scheme, a dedicated processing module is provided for each sensor, so that each sensor's data is processed in a targeted way and normalized into the same first format, and the first-format data of different sensors is tagged with the corresponding sensor identifier. For example, the sensor identifier of the radar sensor may be 001 and that of the camera 002; after data in the same format is transmitted to the database, the processing module can distinguish the data by sensor identifier and then invoke the corresponding algorithm to obtain the target data.
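The patent does not fix a concrete layout for the first-format data. As one illustration only, a unified record carrying a sensor identifier might be sketched in Python as follows; the field names, the RADAR_ID and CAMERA_ID constants, and the to_first_format helper are assumptions, with the identifier values 001 and 002 taken from the example above:

```python
import time
from dataclasses import dataclass
from typing import Optional

# Hypothetical sensor identifiers, following the example above.
RADAR_ID = "001"
CAMERA_ID = "002"

@dataclass
class FirstFormatRecord:
    """One record in the shared 'first format'.

    The patent does not specify the layout; these fields are assumed
    so that the record can carry everything the text mentions.
    """
    sensor_id: str              # e.g. "001" (radar), "002" (camera)
    timestamp: float            # acquisition time in seconds
    payload: dict               # sensor-specific values: distance, RCS, image, ...
    warning: Optional[str] = None  # early-warning info added on data anomaly

def to_first_format(sensor_id: str, raw: dict) -> FirstFormatRecord:
    """Wrap one raw sensor reading in the shared first format."""
    return FirstFormatRecord(sensor_id=sensor_id, timestamp=time.time(), payload=raw)
```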
In one embodiment, the sensor processing module detects data anomalies while processing the environmental data and adds early-warning information to the first-format data. When the central processing module processes the data, it recognizes the early-warning information and sends an alert to the system user to check the condition of the sensor.
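Continuing the sketch above, the anomaly check could be as simple as a range test; the patent does not state the anomaly criterion, so the valid range used here is an assumption:

```python
def check_and_flag(record: FirstFormatRecord) -> FirstFormatRecord:
    """Add early-warning information when a reading looks anomalous.

    A plain range check stands in for the unspecified anomaly test.
    """
    d = record.payload.get("distance_m")
    if d is not None and not 0.0 <= d <= 200.0:  # assumed valid range in metres
        record.warning = f"distance out of range: {d} m"
    return record
```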
The central processing module matches preset sensor identifiers against the first-format data, retrieves the distance data to form electronic map data, and retrieves the moving object data to form identification data, the identification data including a relative distance and thermal radiation parameters. Based on the preset sensor identifiers, the central processing module retrieves the first-format data containing distance data and uses it as input to an electronic map algorithm to form electronic map data, yielding information such as the distribution and shape of surrounding obstacles; it retrieves the first-format data containing moving object data and uses it as input to a motion trajectory algorithm, a thermal radiation algorithm, or the like to form identification data, yielding information such as the motion trajectory of the moving object and a radiation image of the object.
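As a minimal sketch of this dispatch step, the central processing module might split the stored records into per-algorithm queues keyed by sensor identifier. The LASER_ID and THERMAL_ID constants extend the assumed numbering of the earlier sketch; the SLAM, trajectory, and imaging algorithms would each consume one returned queue:

```python
# Hypothetical identifiers for the embodiment with three sensor types.
LASER_ID, RADAR_ID, THERMAL_ID = "003", "001", "004"

def route_by_sensor_id(records):
    """Split first-format records into per-algorithm input queues."""
    queues = {LASER_ID: [], RADAR_ID: [], THERMAL_ID: []}
    warned = []
    for rec in records:
        if rec.warning:                 # surface early-warning info to the user
            warned.append(rec)
        if rec.sensor_id in queues:
            queues[rec.sensor_id].append(rec.payload)
    # map inputs, trajectory inputs, thermal inputs, and flagged records
    return queues[LASER_ID], queues[RADAR_ID], queues[THERMAL_ID], warned
```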
The central fusion module generates the motion trajectory of the moving object from the relative distance, and determines the relative position of the motion trajectory within the electronic map data from the correspondence between the distance data and the relative distance. In other words, the central fusion module fuses the identification data with the electronic map data: it determines the initial position and motion trajectory of the moving object on the electronic map from the relative distance in the moving object data, and thereby predicts the movement trend of the moving object on the map. In one embodiment, the thermal radiation data of the moving object is monitored to judge whether the moving object is a living being, and the robot's further operation is determined in combination with the movement trend of the moving object.
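The correspondence between the relative distance and the map coordinates is not spelled out in the patent. One simple reading, assuming the sensor's pose on the electronic map is known (e.g. from the SLAM step) and that the radar also reports a bearing, is a polar-to-map transform:

```python
import math

def to_map_frame(sensor_x, sensor_y, sensor_yaw, rel_range, rel_bearing):
    """Place one relative range/bearing measurement into map coordinates."""
    theta = sensor_yaw + rel_bearing
    return (sensor_x + rel_range * math.cos(theta),
            sensor_y + rel_range * math.sin(theta))

def trajectory_in_map(sensor_x, sensor_y, sensor_yaw, measurements):
    """Chain successive (range, bearing) measurements into a map trajectory."""
    return [to_map_frame(sensor_x, sensor_y, sensor_yaw, r, b)
            for r, b in measurements]
```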
In one embodiment, the sensors are a laser sensor, a radar sensor and a thermal radiation imager. The sensor processing modules acquire the distance data from the laser sensor, the resulting first-format data including the distance data and a laser sensor identifier; acquire the moving object data from the radar sensor, the resulting first-format data including the relative distance and a radar sensor identifier; and acquire the moving object data from the thermal radiation imager, the resulting first-format data including the thermal radiation parameters and a thermal radiation imager identifier. The laser sensor, the radar sensor and the thermal radiation imager cooperatively monitor the same area. In this scheme, the distance data is point cloud data used to generate the electronic map data; the continuously measured relative distance is used to build the motion trajectory of the moving object and identify its movement trend, providing the basis for the later data fusion; and the thermal radiation parameters are used to generate an image of the moving object, at least a contour image, from which it can be identified whether the moving object is a living being, and even its biological species can be judged, for example by manually inspecting the thermal image of the moving object.
Specifically, the laser sensor determines the position of the target, i.e. the distance data, by analyzing the characteristics of the received target echo. The thermal radiation imager exploits the thermal effect of radiation: absorbing radiant energy raises its temperature and changes certain related physical parameters, and measuring the change in these parameters determines the absorbed thermal radiation, i.e. the thermal radiation parameters. The radar sensor acquires continuous relative distance data of the moving object from the ultrasonic waves the object reflects back. The laser sensor, the radar sensor and the thermal radiation imager cooperatively monitor the same area, so target environment data and identification data of a moving object in the same area over the same time period can be obtained simultaneously. In this way, the scheme can acquire the motion trajectory of a moving object in the monitored area and judge whether it is a living being; according to the actual monitoring data, the robot is then intelligently controlled to act further, for example to stop moving, detour, or stop operating, ensuring the safety of both living beings and the robot.
In one embodiment, the central processing module matches the preset laser sensor identifier against the first-format data and retrieves the distance data to generate electronic map data using a SLAM algorithm; matches the preset radar sensor identifier against the first-format data and retrieves the relative distance to generate a motion trajectory using a Kalman filtering algorithm; and matches the preset thermal radiation imager identifier against the first-format data and retrieves the thermal radiation parameters to generate an imaging graph. In this scheme, the points on the electronic map corresponding to the successive relative positions of the moving object are connected to form the motion trajectory.
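The patent names the Kalman filtering algorithm without giving its parameters. As a minimal sketch, a one-dimensional constant-velocity filter over successive relative-distance readings could look like this; the time step dt and the noise variances are illustrative assumptions:

```python
import numpy as np

def kalman_track(distances, dt=0.1, meas_var=0.25, accel_var=1.0):
    """Smooth successive relative-distance readings into a track.

    State is [distance, speed]; only the distance is observed.
    """
    F = np.array([[1.0, dt], [0.0, 1.0]])             # constant-velocity model
    H = np.array([[1.0, 0.0]])                        # observe distance only
    Q = accel_var * np.array([[dt**4 / 4, dt**3 / 2],
                              [dt**3 / 2, dt**2]])    # process noise
    R = np.array([[meas_var]])                        # measurement noise
    x = np.array([[distances[0]], [0.0]])             # initial state
    P = np.eye(2)                                     # initial covariance
    track = []
    for z in distances:
        x = F @ x                                     # predict
        P = F @ P @ F.T + Q
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)  # Kalman gain
        x = x + K @ (np.array([[z]]) - H @ x)         # update with measurement
        P = (np.eye(2) - K @ H) @ P
        track.append(float(x[0, 0]))
    return track
```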
In one embodiment, the central fusion module generates biological species information based on a comparison between a preset biological image library and the imaging graph. In this scheme, the imaging graph is generated from the thermal radiation parameters measured by the thermal radiation imager, or the imager's own imaging graph can be retrieved directly. The imaging graph, which is at least a contour graph of the moving object, is compared one by one with the biological image library preset in the database; when the similarity reaches a preset value, for example 80%, the moving object is judged to be that biological species and biological species information is generated. In this way, the robot can replace manual labor and intelligently judge the biological species; combined with the motion trajectory, this provides multi-dimensional, richer reference data for the robot's operation, so that its response can be controlled more reasonably and scientifically.
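The patent states only that the comparison succeeds when the similarity reaches a preset value such as 80%; the similarity metric itself is not specified, so the normalized cross-correlation used in the following sketch is an assumption:

```python
import numpy as np

def classify_species(thermal_img, library, threshold=0.8):
    """Compare a thermal contour image against a preset biological image library.

    library maps species name -> reference grayscale array of the same
    shape as thermal_img; returns the best-matching species whose
    similarity meets the preset value, else None.
    """
    def similarity(a, b):
        a = (a - a.mean()) / (a.std() + 1e-9)         # z-score both images
        b = (b - b.mean()) / (b.std() + 1e-9)
        return float((a * b).mean())                  # Pearson correlation
    best_name, best_score = None, -1.0
    for name, ref in library.items():
        score = similarity(thermal_img, ref)
        if score > best_score:
            best_name, best_score = name, score
    return best_name if best_score >= threshold else None
```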
Another aspect of the present invention provides a multi-data fusion processing method: acquiring environmental data by sensor monitoring, the environmental data including distance data and moving object data; processing the corresponding environmental data into first-format data and transmitting it to the database, the first-format data including a sensor identifier; matching preset sensor identifiers against the first-format data, retrieving the distance data to form electronic map data, and retrieving the moving object data to form identification data, the identification data including a relative distance and thermal radiation parameters; generating a motion trajectory of the moving object from the relative distance; and determining the relative position of the motion trajectory within the electronic map data from the correspondence between the distance data and the relative distance.
In one embodiment, the distance data is acquired from a laser sensor, the resulting first-format data including the distance data and a laser sensor identifier; the moving object data is acquired from a radar sensor, the resulting first-format data including the relative distance and a radar sensor identifier; and the moving object data is acquired from a thermal radiation imager, the resulting first-format data including the thermal radiation parameters and a thermal radiation imager identifier. The laser sensor, the radar sensor and the thermal radiation imager cooperatively monitor the same area.
In one embodiment, the preset laser sensor identifier is matched against the first-format data and the distance data is retrieved to generate electronic map data using a SLAM algorithm; the preset radar sensor identifier is matched against the first-format data and the relative distance is retrieved to generate a motion trajectory using a Kalman filtering algorithm; and the preset thermal radiation imager identifier is matched against the first-format data and the thermal radiation parameters are retrieved to generate an imaging graph.
In this specific embodiment, the environmental data is obtained by sensor monitoring. The laser sensor determines the position of the target, i.e. the distance data, by analyzing the characteristics of the received target echo. The thermal radiation imager exploits the thermal effect of radiation: absorbing radiant energy raises its temperature and changes certain related physical parameters, and measuring the change in these parameters determines the absorbed thermal radiation, i.e. the thermal radiation parameters. The radar sensor acquires continuous relative distance data of the moving object from the ultrasonic waves the object reflects back. The laser sensor, the radar sensor and the thermal radiation imager cooperatively monitor the same area, so target environment data and identification data of a moving object in the same area over the same time period can be obtained simultaneously.
Based on the preset sensor identifiers, the first-format data containing distance data is retrieved and used as input to an electronic map algorithm to form electronic map data, yielding information such as the distribution and shape of surrounding obstacles; the first-format data containing moving object data is retrieved and used as input to a motion trajectory algorithm, a radiation algorithm, or the like to form identification data, yielding information such as the motion trajectory of the moving object and its radiation characteristics.
The initial position and motion trajectory of the moving object on the electronic map data are determined from the relative distance in the moving object data, and the movement trend of the monitored object is thereby predicted. In one embodiment, the radiation data of the moving object is monitored to judge whether it is a living being, and the robot's further operation is determined in combination with the motion trajectory of the moving object.
In one embodiment, biological species information is generated based on a comparison between a preset biological image library and the imaging graph, so that moving objects in the area can be monitored and their biological species judged. Specifically, the imaging graph is generated from the thermal radiation parameters measured by the thermal radiation imager, or the imager's own imaging graph can be retrieved directly. The imaging graph, which is at least a contour graph of the moving object, is compared one by one with the biological image library preset in the database; when the similarity reaches a preset value, for example 80%, the moving object is judged to be that biological species and biological species information is generated. The method can thus replace manual labor, intelligently judging the biological species and, combined with the motion trajectory, providing multi-dimensional, richer reference data for the operation of the robot.
Based on the multi-dimensional monitoring data, the robot's response can be controlled more reasonably and scientifically, for example by controlling the robot to stop moving, detour, or stop working, ensuring the safety of both living beings and the robot.
In another embodiment, when a data anomaly is found while processing the environmental data, early-warning information is added to the first-format data. When the central processing module processes the data, it recognizes the early-warning information and sends an alert to the system user to check the condition of the sensor.
As will be appreciated by one skilled in the art, embodiments of the present application may be provided as a method, system, or computer program product. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product embodied on one or more computer-usable storage media (including, but not limited to, disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of methods according to embodiments of the application. It will be understood that each flow and/or block of the flow diagrams and/or block diagrams, and combinations of flows and/or blocks in the flow diagrams and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
The sensor processing modules, central processing module, and central fusion module of the present application may be implemented as computer program instructions, which may be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
In a typical configuration, a computing device includes one or more processors (CPUs), input/output interfaces, network interfaces, and memory.
The memory may include volatile memory in a computer-readable medium, such as random access memory (RAM), and/or non-volatile memory, such as read-only memory (ROM) or flash memory (flash RAM). The memory is an example of a computer-readable medium.
Computer-readable media, including both permanent and non-permanent, removable and non-removable media, may implement information storage by any method or technology. The information may be computer-readable instructions, data structures, modules of a program, or other data.
Examples of computer storage media include, but are not limited to, phase change memory (PRAM), static random access memory (SRAM), dynamic random access memory (DRAM), other types of random access memory (RAM), read-only memory (ROM), electrically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, compact disc read-only memory (CD-ROM), digital versatile discs (DVD) or other optical storage, magnetic cassettes, magnetic tape, magnetic disk storage or other magnetic storage devices, or any other non-transmission medium that can be used to store information accessible by a computing device. As defined herein, computer-readable media do not include transitory computer-readable media such as modulated data signals and carrier waves.
It is also noted that the terms "comprises," "comprising," or any other variation thereof are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
The above are merely examples of the present application and are not intended to limit the present application. Various modifications and changes may occur to those skilled in the art. Any modification, equivalent replacement, improvement, etc. made within the spirit and principle of the present application should be included in the scope of the claims of the present application.

Claims (10)

1. A multi-data fusion processing system, the system comprising:
a plurality of sensors for acquiring environmental data, the environmental data including distance data and moving object data;
sensor processing modules for processing the corresponding environmental data into first-format data and transmitting it to a database, wherein the first-format data comprises a sensor identifier;
a central processing module for matching preset sensor identifiers against the first-format data, retrieving the distance data to form electronic map data, and retrieving the moving object data to form identification data, wherein the identification data comprises a relative distance and thermal radiation parameters; and
a central fusion module for generating a motion trajectory of the moving object according to the relative distance, and determining the relative position of the motion trajectory in the electronic map data according to the correspondence between the distance data and the relative distance.
2. The multi-data fusion processing system of claim 1, wherein
the sensors are a laser sensor, a radar sensor and a thermal radiation imager;
the sensor processing modules are configured to acquire the distance data from the laser sensor, the resulting first-format data comprising the distance data and a laser sensor identifier; to acquire the moving object data from the radar sensor, the resulting first-format data comprising the relative distance and a radar sensor identifier; and to acquire the moving object data from the thermal radiation imager, the resulting first-format data comprising the thermal radiation parameters and a thermal radiation imager identifier; and
the laser sensor, the radar sensor and the thermal radiation imager cooperatively monitor the same area.
3. The multi-data fusion processing system of claim 2, wherein
the central processing module is configured to match the preset laser sensor identifier against the first-format data and retrieve the distance data to generate electronic map data using a SLAM algorithm;
to match the preset radar sensor identifier against the first-format data and retrieve the relative distance to generate a motion trajectory using a Kalman filtering algorithm;
and to match the preset thermal radiation imager identifier against the first-format data and retrieve the thermal radiation parameters to generate an imaging graph.
4. The multi-data fusion processing system of claim 3, wherein
the central fusion module is configured to compare a preset biological image library with the imaging graph to generate biological species information.
5. The multi-data fusion processing system of claim 1, wherein
the sensor processing module is configured to, upon finding a data anomaly while processing the environmental data, add early-warning information to the first-format data.
6. A multi-data fusion processing method, characterized by comprising: acquiring environmental data, wherein the environmental data comprises distance data and moving object data and is obtained by sensor monitoring;
processing the corresponding environmental data into first-format data and transmitting the first-format data to a database, wherein the first-format data comprises a sensor identifier;
matching preset sensor identifiers against the first-format data, retrieving the distance data to form electronic map data, and retrieving the moving object data to form identification data, wherein the identification data comprises a relative distance and thermal radiation parameters; and
generating a motion trajectory of the moving object according to the relative distance; and determining the relative position of the motion trajectory in the electronic map data according to the correspondence between the distance data and the relative distance.
7. The multi-data fusion processing method of claim 6, wherein the distance data is acquired from a laser sensor, the resulting first-format data comprising the distance data and a laser sensor identifier; the moving object data is acquired from a radar sensor, the resulting first-format data comprising the relative distance and a radar sensor identifier; the moving object data is acquired from a thermal radiation imager, the resulting first-format data comprising the thermal radiation parameters and a thermal radiation imager identifier; and the laser sensor, the radar sensor and the thermal radiation imager cooperatively monitor the same area.
8. The multi-data fusion processing method of claim 7, wherein
the preset laser sensor identifier is matched against the first-format data and the distance data is retrieved to generate electronic map data using a SLAM algorithm;
the preset radar sensor identifier is matched against the first-format data and the relative distance is retrieved to generate a motion trajectory using a Kalman filtering algorithm; and the preset thermal radiation imager identifier is matched against the first-format data and the thermal radiation parameters are retrieved to generate an imaging graph.
9. The multi-data fusion processing method of claim 8, wherein biological species information is generated based on a comparison between a preset biological image library and the imaging graph.
10. The multi-data fusion processing method of claim 6, wherein, when a data anomaly is found while processing the environmental data, early-warning information is added to the first-format data.
CN202110563807.6A 2021-05-24 2021-05-24 Multi-data fusion processing system and method Pending CN113177973A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110563807.6A CN113177973A (en) 2021-05-24 2021-05-24 Multi-data fusion processing system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110563807.6A CN113177973A (en) 2021-05-24 2021-05-24 Multi-data fusion processing system and method

Publications (1)

Publication Number Publication Date
CN113177973A 2021-07-27

Family

ID=76929718

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110563807.6A Pending CN113177973A (en) 2021-05-24 2021-05-24 Multi-data fusion processing system and method

Country Status (1)

Country Link
CN (1) CN113177973A (en)

Patent Citations (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU1800401A (en) * 1999-11-23 2001-06-04 Maxygen, Inc. Shuffling of agrobacterium and viral genes, plasmids and genomes for improved plant transformation
CN104049585A (en) * 2013-03-15 2014-09-17 费希尔-罗斯蒙特系统公司 Context sensitive mobile control in process plant
CN103207634A (en) * 2013-03-20 2013-07-17 北京工业大学 Data fusion system and method of differential GPS (Global Position System) and inertial navigation in intelligent vehicle
US20180181737A1 (en) * 2014-08-28 2018-06-28 Facetec, Inc. Facial Recognition Authentication System Including Path Parameters
CN107003785A (en) * 2014-12-09 2017-08-01 巴斯夫欧洲公司 Fluorescence detector
CN105515184A (en) * 2015-12-04 2016-04-20 国网河南省电力公司电力科学研究院 Wireless sensor network-based cooperative monitoring system of multi-sensor and multi-parameter distribution network
US20180233047A1 (en) * 2017-02-11 2018-08-16 Ben Mandeville-Clarke Systems and methods for detecting and avoiding an emergency vehicle in the proximity of a substantially autonomous vehicle
CN109059927A (en) * 2018-08-21 2018-12-21 南京邮电大学 The mobile robot slam of multisensor builds drawing method and system under complex environment
CN109188460A (en) * 2018-09-25 2019-01-11 北京华开领航科技有限责任公司 Unmanned foreign matter detection system and method
CN109916397A (en) * 2019-03-15 2019-06-21 斑马网络技术有限公司 For tracking method, apparatus, electronic equipment and the storage medium of inspection track
CN110361027A (en) * 2019-06-25 2019-10-22 马鞍山天邦开物智能商务管理有限公司 Robot path planning method based on single line laser radar Yu binocular camera data fusion
CN110553652A (en) * 2019-10-12 2019-12-10 上海高仙自动化科技发展有限公司 robot multi-sensor fusion positioning method and application thereof

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
张洁茹; 苏峰; 袁培江; 王田苗; 陶一宁; 丁东: "Dual-spectrum intelligent body temperature detection and health big data management system", Journal of Beijing University of Aeronautics and Astronautics, pages 1739-1746 *
王威; 陈巍; 陆琴心: "Research on hybrid navigation of substation inspection vehicles based on GPS + laser radar", Automation Application, pages 95-99 *


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination