CN117429410A - Automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion - Google Patents
- Publication number
- CN117429410A (application CN202311222394.0A)
- Authority
- CN
- China
- Prior art keywords
- real-time
- commercial vehicle
- warehousing
- map
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/60—Control of cameras or camera modules
- H04N23/698—Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R1/00—Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/20—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
- B60R1/22—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
- B60R1/23—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
- B60R1/27—Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W30/00—Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
- B60W30/06—Automatic manoeuvring for parking
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01C—MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
- G01C21/00—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
- G01C21/10—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
- G01C21/12—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
- G01C21/16—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
- G01C21/165—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
- G01C21/1652—Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/02—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
- G01S5/0257—Hybrid positioning
- G01S5/0258—Hybrid positioning by combining or switching between measurements derived from different systems
- G01S5/02585—Hybrid positioning by combining or switching between measurements derived from different systems at least one of the measurements being a non-radio measurement
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S5/00—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
- G01S5/02—Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
- G01S5/08—Position of single direction-finder fixed by determining direction of a plurality of spaced sources of known location
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F16/00—Information retrieval; Database structures therefor; File system structures therefor
- G06F16/20—Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
- G06F16/29—Geographical information databases
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06N—COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
- G06N3/00—Computing arrangements based on biological models
- G06N3/02—Neural networks
- G06N3/04—Architecture, e.g. interconnection topology
- G06N3/0464—Convolutional networks [CNN, ConvNet]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T3/00—Geometric image transformation in the plane of the image
- G06T3/40—Scaling the whole image or part thereof
- G06T3/4038—Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/16—Image acquisition using multiple overlapping images; Image stitching
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
- G06V10/44—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
- G06V10/443—Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
- G06V10/449—Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
- G06V10/451—Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
- G06V10/454—Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/70—Arrangements for image or video recognition or understanding using pattern recognition or machine learning
- G06V10/82—Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/52—Surveillance or monitoring of activities, e.g. for recognising suspicious objects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/90—Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N7/00—Television systems
- H04N7/18—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
- H04N7/181—Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W4/00—Services specially adapted for wireless communication networks; Facilities therefor
- H04W4/30—Services specially adapted for particular environments, situations or purposes
- H04W4/40—Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04W—WIRELESS COMMUNICATION NETWORKS
- H04W64/00—Locating users or terminals or network equipment for network management purposes, e.g. mobility management
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/30—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
- B60R2300/304—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60R—VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
- B60R2300/00—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
- B60R2300/80—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
- B60R2300/806—Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2200/00—Indexing scheme for image data processing or generation, in general
- G06T2200/32—Indexing scheme for image data processing or generation, in general involving image mosaicing
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02T—CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
- Y02T10/00—Road transport of goods or passengers
- Y02T10/10—Internal combustion engine [ICE] based vehicles
- Y02T10/40—Engine management systems
Abstract
The invention provides an automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion, which comprises the following steps: S1: acquiring a panoramic monitoring image of the commercial vehicle during reverse warehousing; S2: acquiring map basic data; S3: constructing a global warehousing map. In this method, four surround-view cameras are mounted on the front, rear, left and right of the commercial vehicle body; real-time monitoring images from the four directions are captured while the vehicle is being warehoused and, after inverse perspective transformation, are merged into a single image to obtain a real-time panoramic monitoring image. A 3D view is thereby obtained, the perception range is enlarged, and the distortion and seam-loss problems of conventional 360-degree surround-view stitching are avoided.
Description
Technical Field
The invention relates to the field of the Internet of Vehicles, and in particular to an automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion.
Background
At present, autonomous driving technology is concentrated mainly at the L2 and L3 levels. Existing domestic systems with L2 and L3 functions can for now only be applied on designated test road sections, and continued data accumulation and road verification are still required before vehicles reach fully autonomous driving (FSD). Although the functions of autonomous driving systems on the market have become gradually richer and more complete in recent years, they are applied mainly to passenger cars; for large commercial vehicles the technology is still immature;
When a large commercial vehicle reverses into a depot at the present stage, the following problems remain:
1. The common monitoring solution for commercial vehicles relies on 6-8 cameras arranged around the vehicle to monitor the warehousing environment in real time, displayed either as split views or as a 360-degree surround-view stitch to assist the driver. Split views are presented mainly in 2D, so the driver cannot perceive the vehicle's position in three-dimensional space. The 360-degree surround view produces a 3D picture through in-software stitching algorithms, but the displayed surroundings are distorted, parts of the environment are lost at the stitching seams, and the exact distance between the vehicle and surrounding objects cannot be measured;
2. Commercial vehicles are large, have long bodies, insufficient turning sight lines and large blind zones, so controlling direction and viewing angle while reversing into a space places high demands on the driver's experience;
disclosure of Invention
The invention aims to remedy the deficiencies of the prior art by providing an automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion.
In order to achieve the above purpose, the invention adopts the following technical scheme:
the automatic commercial vehicle warehousing system comprises a panoramic monitoring module, a positioning sensing module and a central control navigation module;
the panoramic monitoring module is used for acquiring panoramic monitoring images of the commercial vehicle when the commercial vehicle is backed up and put in storage;
the positioning sensing module comprises a UWB TOF positioning composition unit, an inertial navigation unit (IMU) and a wheel encoder, and is used for acquiring map basic data;
the central control navigation module comprises a central control host and a display screen, and is used for observing the states of the vehicle and the surrounding environment in real time, so that the monitoring of a driver on the commercial vehicle is realized;
the specific automatic warehouse-in method of the commercial vehicle comprises the following steps:
the method comprises the following substeps:
s1: acquiring a panoramic monitoring image of a commercial vehicle during reversing and warehousing;
the method comprises the following substeps:
s11: acquiring real-time monitoring images of four directions when a commercial vehicle enters a warehouse through a camera;
the method comprises the steps that four looking-around cameras are respectively assembled in the front, back, left and right directions of a commercial vehicle body, the looking-around cameras are provided with fisheye lenses, and the directions of the fisheye lenses are downward;
acquiring real-time monitoring images in four directions when a commercial vehicle is put in storage through four looking-around cameras;
s12: synthesizing a panoramic monitoring image;
the method comprises the following substeps:
s121: performing inverse transformation on the monitoring image;
the panoramic monitoring module respectively carries out inverse perspective transformation processing on the real-time monitoring images in four directions to obtain an inverse perspective transformation IPM image of the real-time monitoring images;
the method comprises the steps of carrying out inverse transformation processing on four real-time monitoring images, and recovering the monitoring images of the camera view angles into orthogonal view angle images conforming to actual scenes;
s122: synthesizing a real-time panoramic monitoring image;
integrating pixel points in the four inverse perspective transformation IPM images into one image to obtain a real-time panoramic monitoring image;
s2: acquiring map basic data;
the UWB TOF positioning component unit of the positioning sensing module comprises a UWB transmitting end and a UWB receiving end, wherein the UWB transmitting end is deployed at a parking space mark of a warehouse or in garage peripheral equipment, and the UWB receiving end is deployed on a commercial vehicle body;
the method comprises the following substeps:
s21: transmitting data;
when a commercial vehicle enters a warehouse, a UWB TOF sensor receiving end, an inertial navigation unit IMU and a wheel encoder collect sensor data in real time;
the sensor data collected by the UWB TOF sensor receiving end comprises the distance between the commercial vehicle and the UWB transmitting end, response time and the like when the commercial vehicle is in warehouse entry and driving;
the sensor data collected by the inertial navigation unit IMU comprises acceleration, vehicle attitude angle and the like of the commercial vehicle for warehouse entry running;
the sensor data collected by the wheel encoder comprises the speed, the rotating speed, the angle and the like of the commercial vehicle for warehouse entry running;
transmitting the real-time panoramic monitoring image and the real-time sensor data to a central control navigation module;
s22: synchronizing multi-sensor data;
The system-on-chip (SoC) of the central control navigation module synchronizes the received sensor data;
s23: preprocessing sensor data;
preprocessing real-time sensor data and a real-time panoramic monitoring image;
the preprocessing comprises invalid value removal, sequence inspection, point cloud shielding point and parallel point removal, coordinate system processing and the like;
s24: removing distortion;
performing de-distortion processing on the preprocessed sensor data and the real-time panoramic monitoring image to obtain map basic data;
s25: front end matching;
matching the map basic data with front-end data;
further, warehouse-in simulation of the commercial vehicle is performed in advance, real-time sensor data and real-time warehouse-in monitoring images during simulated warehouse-in are synchronously acquired, and the acquired real-time sensor data and real-time warehouse-in monitoring images are front-end data;
the real-time sensor data for simulating warehousing refer to all real-time UWB TOF sensor receiving end data, real-time inertial navigation unit IMU data and real-time wheel encoder data during simulating warehousing;
matching the map basic data with the front end data by taking the front end data as a reference, and if the map basic data at the corresponding position is inconsistent with the front end data, giving out a corresponding prompt and calibrating the map basic data in real time;
the speed, the gesture and the like of the vehicle warehouse entry are adjusted in real time by calibrating the map basic data;
s3: constructing a global warehouse-in map;
the method comprises the following substeps:
s31: filtering and calculating;
performing filtering calculation on sensor data in the map basic data;
s32: extracting features;
segmenting a real-time panoramic monitoring image in the map basic data by a U-Net deep learning method in a convolutional neural network method, and extracting features contained in the real-time panoramic monitoring image;
the features include lane lines, library bit lines, obstacles, walls, etc.;
s33: constructing a global warehouse-in map;
projecting the pixel points corresponding to the extracted features into a 3D space, and converting the feature pixel points projected into the 3D space into feature pixel points under a world coordinate system based on an odometer;
integrating characteristic pixel points in a world coordinate system contained in a certain range when a commercial vehicle is backed up and put in storage into one local map to obtain a series of local maps;
finally integrating a series of local maps into a global warehouse-in map;
the global warehousing map is imported into a central control navigation module, and the display screen of the central control navigation module observes the states of the vehicle and the surrounding environment in real time according to the global warehousing map, so that the automatic warehousing of the commercial vehicle is monitored by a driver;
compared with the prior art, the invention has the beneficial effects that:
according to the automatic warehousing method for the commercial vehicle based on visual SLAM and UWB perception fusion, four looking-around cameras are respectively assembled in the front direction, the rear direction, the left direction and the right direction of a commercial vehicle body, real-time monitoring images in the four directions are obtained when the commercial vehicle is warehoused, and the real-time monitoring images in the four directions are integrated into one image after being subjected to inverse transformation treatment, so that a real-time panoramic monitoring image is obtained; the 3D picture is obtained, the perception range is increased, and the problem that the surrounding environment picture of the vehicle is distorted and the splicing part is lost in 360-degree looking around splicing effect is solved;
according to the method, the global warehousing map is generated by SLAM (simultaneous localization and mapping) technology, the inertial measurement unit IMU and the wheel encoder, the vehicle is accurately positioned, the monitoring vision of the warehousing of the commercial vehicle is widened, the problem that the omnibearing environmental information cannot be obtained when a narrow parking lot is shielded and has a plurality of interference factors is solved, and the experience requirement of a driver of the commercial vehicle is reduced;
description of the embodiments
For a further understanding of the objects, construction, features, and functions of the invention, reference should be made to the following detailed description of the preferred embodiments.
The automatic commercial vehicle warehousing system comprises a panoramic monitoring module, a positioning sensing module and a central control navigation module;
the panoramic monitoring module is used for acquiring panoramic monitoring images of the commercial vehicle when the commercial vehicle is backed up and put in storage;
the positioning sensing module comprises a UWB TOF positioning composition unit, an inertial navigation unit (IMU) and a wheel encoder, and is used for acquiring map basic data;
the central control navigation module comprises a central control host and a display screen, and is used for observing the states of the vehicle and the surrounding environment in real time, so that the monitoring of a driver on the commercial vehicle is realized;
the specific automatic warehouse-in method of the commercial vehicle comprises the following steps:
the method comprises the following substeps:
s1: acquiring a panoramic monitoring image of a commercial vehicle during reversing and warehousing;
the method comprises the following substeps:
s11: acquiring real-time monitoring images of four directions when a commercial vehicle enters a warehouse through a camera;
the method comprises the steps that four looking-around cameras are respectively assembled in the front, back, left and right directions of a commercial vehicle body, the looking-around cameras are provided with fisheye lenses, and the directions of the fisheye lenses are downward;
acquiring real-time monitoring images in four directions when a commercial vehicle is put in storage through four looking-around cameras;
s12: synthesizing a panoramic monitoring image;
the method comprises the following substeps:
s121: performing inverse transformation on the monitoring image;
the panoramic monitoring module respectively carries out inverse perspective transformation processing on the real-time monitoring images in four directions to obtain an inverse perspective transformation IPM image of the real-time monitoring images;
the method comprises the steps of carrying out inverse transformation processing on four real-time monitoring images, and recovering the monitoring images of the camera view angles into orthogonal view angle images conforming to actual scenes;
s122: synthesizing a real-time panoramic monitoring image;
integrating pixel points in the four inverse perspective transformation IPM images into one image to obtain a real-time panoramic monitoring image;
the pixel points of the monitoring images in four directions are integrated into one image to obtain a real-time panoramic monitoring image in storage, so that the sensing range is increased, and the problem that the omnibearing environment information cannot be obtained when a narrow parking lot is shielded and has a plurality of interference factors is solved;
s2: acquiring map basic data;
the UWB TOF positioning component unit of the positioning sensing module comprises a UWB transmitting end and a UWB receiving end, wherein the UWB transmitting end is deployed at a parking space mark of a warehouse or in garage peripheral equipment, and the UWB receiving end is deployed on a commercial vehicle body;
the method comprises the following substeps:
s21: transmitting data;
when a commercial vehicle enters a warehouse, a UWB TOF sensor receiving end, an inertial navigation unit IMU and a wheel encoder collect sensor data in real time;
the sensor data collected by the UWB TOF sensor receiving end comprises the distance between the commercial vehicle and the UWB transmitting end, response time and the like when the commercial vehicle is in warehouse entry and driving;
the sensor data collected by the inertial navigation unit IMU comprises acceleration, vehicle attitude angle and the like of the commercial vehicle for warehouse entry running;
the sensor data collected by the wheel encoder comprises the speed, the rotating speed, the angle and the like of the commercial vehicle for warehouse entry running;
transmitting the real-time panoramic monitoring image and the real-time sensor data to a central control navigation module;
s22: synchronizing the data of multiple sensors;
the System On Chip (SOC) of the central control navigation module synchronously processes the received sensor data;
the system on chip (SOC) of the central control navigation module synchronizes the real-time sensor data so that all streams share the same time reference and data format, which facilitates the subsequent data fusion and processing;
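One common way to give heterogeneous streams "the same time reference" is to resample each stream onto a shared clock by linear interpolation; a minimal sketch, assuming each stream is a sorted list of (timestamp, value) pairs (the patent does not name a method):

```python
# Sketch: put a sensor stream on a common time base by linear
# interpolation. Stream format is an assumption: sorted (t, v) pairs.
from bisect import bisect_left

def sample_at(stream, t):
    """Linearly interpolate a sorted [(t, v), ...] stream at time t."""
    i = bisect_left(stream, (t,))
    if i == 0:
        return stream[0][1]          # clamp before first sample
    if i == len(stream):
        return stream[-1][1]         # clamp after last sample
    (t0, v0), (t1, v1) = stream[i - 1], stream[i]
    w = (t - t0) / (t1 - t0)
    return v0 + w * (v1 - v0)
```

Sampling every stream at the same tick sequence then yields records with one shared timestamp, ready for fusion.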
s23: preprocessing sensor data;
preprocessing real-time sensor data and a real-time panoramic monitoring image;
the preprocessing comprises invalid-value removal, sequence inspection, removal of occluded and parallel points from the point cloud, coordinate-system processing, and the like;
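Two of the listed steps, invalid-value removal and sequence inspection, can be sketched as follows; the valid-range bounds and the strictly-increasing timestamp rule are illustrative assumptions:

```python
# Sketch of two preprocessing steps: dropping invalid values
# (NaN / out-of-range) and checking the timestamp sequence.
import math

def drop_invalid(readings, lo, hi):
    """Keep (t, v) pairs whose value is finite and within [lo, hi]."""
    return [(t, v) for t, v in readings
            if math.isfinite(v) and lo <= v <= hi]

def is_monotonic(readings):
    """Sequence check: timestamps must be strictly increasing."""
    ts = [t for t, _ in readings]
    return all(a < b for a, b in zip(ts, ts[1:]))
```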
s24: removing distortion;
performing de-distortion processing on the preprocessed sensor data and the real-time panoramic monitoring image to obtain map basic data;
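For the image side of this de-distortion, a minimal sketch of removing radial lens distortion from one normalized image point; a one-coefficient polynomial model (x_d = x_u * (1 + k1*r^2)) is assumed here, whereas real fisheye lenses need more terms:

```python
# Sketch: invert simple radial distortion by fixed-point iteration.
# Model (an assumption): distorted = undistorted * (1 + k1 * r^2).

def undistort_point(xd, yd, k1, iters=20):
    """Recover the undistorted normalized point (xu, yu)."""
    xu, yu = xd, yd                      # initial guess: distorted point
    for _ in range(iters):
        r2 = xu * xu + yu * yu
        scale = 1.0 + k1 * r2
        xu, yu = xd / scale, yd / scale  # pull the point back inward
    return xu, yu
```

For small k1 the iteration is a contraction and converges in a handful of steps.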
s25: front end matching;
matching the map basic data with front-end data;
further, a warehousing simulation of the commercial vehicle is performed in advance; the real-time sensor data and real-time warehousing monitoring images during the simulated warehousing are synchronously acquired, and these acquired data constitute the front-end data;
the real-time sensor data for simulating warehousing refer to all real-time UWB TOF sensor receiving end data, real-time inertial navigation unit IMU data and real-time wheel encoder data during simulating warehousing;
the map basic data are matched against the front-end data, with the front-end data as the reference; if the map basic data at a position are inconsistent with the front-end data, a corresponding prompt is given and the map basic data are calibrated in real time;
calibrating the map basic data allows the speed, attitude and the like of the warehousing vehicle to be adjusted in real time;
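The matching-and-prompt logic above can be sketched as a simple element-wise comparison; the scalar representation, the tolerance, and the "snap to reference" calibration rule are illustrative assumptions:

```python
# Sketch: compare map base data against the pre-recorded front-end
# reference, flag positions that disagree beyond a tolerance, and
# calibrate flagged positions onto the reference. Tolerance is assumed.

def match_front_end(map_data, front_end, tol=0.05):
    """Return (prompt indices, calibrated map values)."""
    prompts = [i for i, (m, f) in enumerate(zip(map_data, front_end))
               if abs(m - f) > tol]
    flagged = set(prompts)
    calibrated = [f if i in flagged else m
                  for i, (m, f) in enumerate(zip(map_data, front_end))]
    return prompts, calibrated
```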
s3: constructing a global warehouse-in map;
the method comprises the following substeps:
s31: filtering and calculating;
performing filtering calculation on sensor data in the map basic data;
filtering the sensor data in the map basic data cleans the data and removes interference signals;
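The patent does not name a filter; one common choice for cleaning a noisy range series such as the UWB distances is a scalar Kalman filter, sketched below with illustrative noise parameters:

```python
# Sketch: scalar Kalman filter over a noisy measurement series.
# q = process noise, r = measurement noise (both assumed values).

def kalman_filter(measurements, q=1e-4, r=0.04, x0=None):
    x = measurements[0] if x0 is None else x0  # state estimate
    p = 1.0                                    # estimate variance
    out = []
    for z in measurements:
        p += q                      # predict: variance grows
        k = p / (p + r)             # Kalman gain
        x += k * (z - x)            # update toward measurement z
        p *= (1.0 - k)
        out.append(x)
    return out
```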
s32: extracting features;
segmenting the real-time panoramic monitoring image in the map basic data with the U-Net convolutional neural network, a deep-learning method, and extracting the features contained in the real-time panoramic monitoring image;
the features include lane lines, parking-space lines, obstacles, walls, and the like;
extracting these features enables parking-space detection, obstacle avoidance and drivable-space planning;
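Downstream of the (hypothetical) U-Net output, the per-class feature pixels can be collected from the predicted class map as follows; the class-id assignment is an assumption for illustration:

```python
# Sketch: gather feature pixel coordinates per class from a U-Net
# style class map. Class ids are assumed; 0 = background.
CLASSES = {1: "lane_line", 2: "parking_space_line", 3: "obstacle", 4: "wall"}

def extract_features(class_map):
    """class_map: 2-D grid of class ids; returns {class: [(r, c), ...]}."""
    feats = {name: [] for name in CLASSES.values()}
    for r, row in enumerate(class_map):
        for c, cid in enumerate(row):
            if cid in CLASSES:
                feats[CLASSES[cid]].append((r, c))
    return feats
```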
s33: constructing a global warehouse-in map;
projecting the pixel points corresponding to the extracted features into a 3D space, and converting the feature pixel points projected into the 3D space into feature pixel points under a world coordinate system based on an odometer;
integrating the feature pixel points in the world coordinate system contained within a certain range during reversing of the commercial vehicle into the warehouse into one local map, obtaining a series of local maps;
constructing a series of local maps for the warehousing of the commercial vehicle and performing loop-closure detection on them eliminates the drift error accumulated by the odometer over long operation;
finally integrating a series of local maps into a global warehouse-in map;
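For the projection in step S33, one feature pixel can be mapped into the world frame via a planar pose from the odometer; the metres-per-pixel scale, the vehicle-centre pixel, and the axis conventions below are all assumptions:

```python
# Sketch: map an IPM pixel to the world frame through the odometer
# pose (x, y, yaw). Scale, origin pixel and axes are assumptions.
import math

def pixel_to_world(px, py, pose, m_per_px=0.02, origin=(100, 100)):
    """pose = (wx, wy, yaw); origin = pixel under the vehicle centre."""
    # pixel -> vehicle ground frame (x forward, y left)
    vx = (origin[1] - py) * m_per_px
    vy = (origin[0] - px) * m_per_px
    # vehicle frame -> world frame (2-D rigid transform)
    wx, wy, yaw = pose
    c, s = math.cos(yaw), math.sin(yaw)
    return (wx + c * vx - s * vy, wy + s * vx + c * vy)
```

Accumulating the projected points over a window of poses yields one local map; concatenating the windows yields the global warehousing map.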
the global warehousing map is imported into the central control navigation module, whose display screen shows the states of the vehicle and the surrounding environment in real time according to the global warehousing map, so that the driver can monitor the automatic warehousing of the commercial vehicle;
The invention has been described with reference to the above embodiments; however, these embodiments are merely examples of practicing the invention. It should be noted that the disclosed embodiments do not limit the scope of the invention. On the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.
Claims (6)
1. A commercial vehicle automatic warehousing method based on visual SLAM and UWB perception fusion is characterized in that: the automatic commercial vehicle warehousing system comprises a panoramic monitoring module, a positioning sensing module and a central control navigation module;
the panoramic monitoring module is used for acquiring panoramic monitoring images of the commercial vehicle when the commercial vehicle is backed up and put in storage;
the positioning sensing module comprises a UWB TOF positioning unit, an inertial navigation unit (IMU) and a wheel encoder, and is used for acquiring map basic data;
the central control navigation module comprises a central control host and a display screen, and is used for observing the states of the vehicle and the surrounding environment in real time, so that the monitoring of a driver on the commercial vehicle is realized;
the specific automatic warehouse-in method of the commercial vehicle comprises the following steps:
s1: acquiring a panoramic monitoring image of a commercial vehicle during reversing and warehousing;
acquiring real-time monitoring images in four directions when a commercial vehicle is put in storage through a camera, and synthesizing the real-time monitoring images in the four directions into a real-time panoramic monitoring image;
s2: acquiring map basic data;
when the commercial vehicle is put in storage, the positioning sensing module collects sensor data in real time and sends the real-time panoramic monitoring image and the real-time sensor data to the central control navigation module;
the central control navigation module receives the real-time panoramic monitoring image and the real-time sensor data and synchronously processes the sensor data;
after the synchronization processing is completed, preprocessing and de-distorting the real-time sensor data and the real-time panoramic monitoring image to obtain map basic data;
s3: constructing a global warehouse-in map;
extracting features in the map basic data;
integrating the pixel points of the features extracted in a certain range during reversing and warehousing of the commercial vehicle into a local map to obtain a series of local maps;
finally integrating a series of local maps into a global warehouse-in map;
and the global warehousing map is imported into a central control navigation module, and the display screen of the central control navigation module observes the states of the vehicle and the surrounding environment in real time according to the global warehousing map, so that the automatic warehousing monitoring of the commercial vehicle by a driver is realized.
2. The automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion according to claim 1, wherein the method comprises the following steps:
step S1 comprises the following sub-steps:
s11: acquiring real-time monitoring images of four directions when a commercial vehicle enters a warehouse through a camera;
four surround-view cameras are respectively mounted in the front, rear, left and right directions of the commercial vehicle body; the cameras are fitted with fisheye lenses, with the lenses facing downward;
acquiring the real-time monitoring images in the four directions during warehousing of the commercial vehicle through the four surround-view cameras;
s12: and synthesizing the panoramic monitoring image.
3. The automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion as claimed in claim 2, wherein the method comprises the following steps:
step S12 comprises the following sub-steps:
s121: performing inverse transformation on the monitoring image;
the panoramic monitoring module respectively carries out inverse perspective transformation processing on the real-time monitoring images in four directions to obtain an inverse perspective transformation IPM image of the real-time monitoring images;
the four real-time monitoring images are inverse-transformed so as to restore the camera-perspective monitoring images to orthographic views consistent with the actual scene;
s122: synthesizing a real-time panoramic monitoring image;
and integrating the pixel points in the four inverse perspective transformation IPM images into one image to obtain a real-time panoramic monitoring image.
4. The automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion according to claim 1, wherein the method comprises the following steps:
further, in step S2, a commercial vehicle warehouse entry simulation is performed in advance, real-time sensor data and a real-time warehouse entry monitoring image during the simulation of warehouse entry are synchronously acquired, and the acquired real-time sensor data and real-time warehouse entry monitoring image are front-end data;
the real-time sensor data for simulating warehousing refer to all real-time UWB TOF sensor receiving end data, real-time inertial navigation unit IMU data and real-time wheel encoder data during simulating warehousing;
matching the map basic data with the front-end data, with the front-end data as the reference; if the map basic data at the corresponding position are inconsistent with the front-end data, a corresponding prompt is given and the map basic data are calibrated in real time; if they are consistent, no prompt is given.
5. The automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion according to claim 1, wherein the method comprises the following steps:
step S3 comprises the following sub-steps:
s31: filtering and calculating;
performing filtering calculation on sensor data in the map basic data;
s32: extracting features;
segmenting the real-time panoramic monitoring image in the map basic data with the U-Net convolutional neural network, a deep-learning method, and extracting the features contained in the real-time panoramic monitoring image;
s33: constructing a global warehouse-in map;
projecting the pixel points corresponding to the extracted features into a 3D space, and converting the feature pixel points projected into the 3D space into feature pixel points under a world coordinate system based on an odometer;
integrating the feature pixel points in the world coordinate system contained within a certain range during reversing of the commercial vehicle into the warehouse into one local semantic map, obtaining a series of local semantic maps;
finally integrating a series of local semantic maps into a global warehouse-in map;
and the global warehousing map is imported into a central control navigation module, and the display screen of the central control navigation module observes the states of the vehicle and the surrounding environment in real time according to the global warehousing map, so that the automatic warehousing monitoring of the commercial vehicle by a driver is realized.
6. The automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion according to claim 1, wherein the method comprises the following steps:
in step S3, the features include lane lines, parking-space lines, obstacles and walls.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202311222394.0A CN117429410A (en) | 2023-09-21 | 2023-09-21 | Automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion |
Publications (1)
Publication Number | Publication Date |
---|---|
CN117429410A true CN117429410A (en) | 2024-01-23 |
Family
ID=89548787
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||