CN117429410A - Automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion - Google Patents

Automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion

Info

Publication number
CN117429410A
CN117429410A (application number CN202311222394.0A)
Authority
CN
China
Prior art keywords: real-time, commercial vehicle, warehousing, map
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202311222394.0A
Other languages
Chinese (zh)
Inventor
李若凡
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yukuai Chuangling Intelligent Technology Nanjing Co ltd
Original Assignee
Yukuai Chuangling Intelligent Technology Nanjing Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Yukuai Chuangling Intelligent Technology Nanjing Co ltd filed Critical Yukuai Chuangling Intelligent Technology Nanjing Co ltd
Priority to CN202311222394.0A priority Critical patent/CN117429410A/en
Publication of CN117429410A publication Critical patent/CN117429410A/en
Pending legal-status Critical Current

Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60 Control of cameras or camera modules
    • H04N23/698 Control of cameras or camera modules for achieving an enlarged field of view, e.g. panoramic image capture
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R1/00 Optical viewing arrangements; Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/20 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles
    • B60R1/22 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle
    • B60R1/23 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view
    • B60R1/27 Real-time viewing arrangements for drivers or passengers using optical image capturing systems, e.g. cameras or video systems specially adapted for use in or on vehicles for viewing an area outside the vehicle, e.g. the exterior of the vehicle with a predetermined field of view providing all-round vision, e.g. using omnidirectional cameras
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/06 Automatic manoeuvring for parking
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01C MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00
    • G01C21/10 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration
    • G01C21/12 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning
    • G01C21/16 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation
    • G01C21/165 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments
    • G01C21/1652 Navigation; Navigational instruments not provided for in groups G01C1/00 - G01C19/00 by using measurements of speed or acceleration executed aboard the object being navigated; Dead reckoning by integrating acceleration or speed, i.e. inertial navigation combined with non-inertial navigation instruments with ranging devices, e.g. LIDAR or RADAR
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/0257 Hybrid positioning
    • G01S5/0258 Hybrid positioning by combining or switching between measurements derived from different systems
    • G01S5/02585 Hybrid positioning by combining or switching between measurements derived from different systems at least one of the measurements being a non-radio measurement
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S5/00 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations
    • G01S5/02 Position-fixing by co-ordinating two or more direction or position line determinations; Position-fixing by co-ordinating two or more distance determinations using radio waves
    • G01S5/08 Position of single direction-finder fixed by determining direction of a plurality of spaced sources of known location
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F16/00 Information retrieval; Database structures therefor; File system structures therefor
    • G06F16/20 Information retrieval; Database structures therefor; File system structures therefor of structured data, e.g. relational data
    • G06F16/29 Geographical information databases
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06N COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N3/00 Computing arrangements based on biological models
    • G06N3/02 Neural networks
    • G06N3/04 Architecture, e.g. interconnection topology
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T3/00 Geometric image transformation in the plane of the image
    • G06T3/40 Scaling the whole image or part thereof
    • G06T3/4038 Scaling the whole image or part thereof for image mosaicing, i.e. plane images composed of plane sub-images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/10 Image acquisition
    • G06V10/16 Image acquisition using multiple overlapping images; Image stitching
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G06V10/443 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components by matching or filtering
    • G06V10/449 Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters
    • G06V10/451 Biologically inspired filters, e.g. difference of Gaussians [DoG] or Gabor filters with interaction between the filter responses, e.g. cortical complex cells
    • G06V10/454 Integrating the filters into a hierarchical structure, e.g. convolutional neural networks [CNN]
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/82 Arrangements for image or video recognition or understanding using pattern recognition or machine learning using neural networks
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/52 Surveillance or monitoring of activities, e.g. for recognising suspicious objects
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/90 Arrangement of cameras or camera modules, e.g. multiple cameras in TV studios or sports stadiums
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00 Television systems
    • H04N7/18 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/181 Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a plurality of remote sources
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W4/00 Services specially adapted for wireless communication networks; Facilities therefor
    • H04W4/30 Services specially adapted for particular environments, situations or purposes
    • H04W4/40 Services specially adapted for particular environments, situations or purposes for vehicles, e.g. vehicle-to-pedestrians [V2P]
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04W WIRELESS COMMUNICATION NETWORKS
    • H04W64/00 Locating users or terminals or network equipment for network management purposes, e.g. mobility management
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/30 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing
    • B60R2300/304 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the type of image processing using merged images, e.g. merging camera image with stored images
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R2300/00 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle
    • B60R2300/80 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement
    • B60R2300/806 Details of viewing arrangements using cameras and displays, specially adapted for use in a vehicle characterised by the intended use of the viewing arrangement for aiding parking
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 Indexing scheme for image data processing or generation, in general
    • G06T2200/32 Indexing scheme for image data processing or generation, in general involving image mosaicing
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02T CLIMATE CHANGE MITIGATION TECHNOLOGIES RELATED TO TRANSPORTATION
    • Y02T10/00 Road transport of goods or passengers
    • Y02T10/10 Internal combustion engine [ICE] based vehicles
    • Y02T10/40 Engine management systems

Abstract

The invention provides an automatic commercial vehicle warehousing method based on the fusion of visual SLAM and UWB perception, comprising the following steps: S1: acquiring a panoramic monitoring image while the commercial vehicle reverses into storage; S2: acquiring map basic data; S3: constructing a global warehousing map. Four surround-view cameras are mounted on the front, rear, left and right of the commercial vehicle body; their real-time monitoring images are captured during warehousing and, after inverse perspective transformation, merged into a single real-time panoramic monitoring image. This yields a 3D view, enlarges the perception range, and overcomes the distortion and seam loss of conventional 360-degree surround-view stitching.

Description

Automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion
Technical Field
The invention relates to the field of Internet of vehicles, in particular to an automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion.
Background
At present, automatic driving technology is concentrated at the L2 and L3 levels; existing domestic systems with L2 and L3 functions can only be applied on designated test road sections, and further data accumulation and road validation are still required before vehicles reach full self-driving (FSD); although automatic driving functions on the market have become richer in recent years, they are mainly applied to passenger vehicles, and for large commercial vehicles the technology is still immature;
when a large commercial vehicle reverses into storage at a depot at the present stage, the following problems still exist:
1. the common monitoring scheme for commercial vehicles uses 6-8 cameras arranged around the vehicle to monitor the warehousing environment in real time, displayed either as split views or as a 360-degree surround-view stitch to assist the driver; the split views are 2D, so the driver cannot perceive the position of the vehicle in three-dimensional space; 360-degree surround-view stitching produces a 3D picture through internal software algorithms, but the displayed surroundings are distorted, parts of the environment are lost at the stitching seams, and the exact distance between the vehicle and surrounding objects cannot be measured accurately;
2. commercial vehicles are large, long-bodied, with limited turning visibility and large blind zones, so reversing into storage places high demands on the driver's experience with steering and viewpoint control;
Disclosure of the Invention
The invention aims to overcome the above shortcomings of the prior art by providing an automatic commercial vehicle warehousing method based on the fusion of visual SLAM and UWB perception.
In order to achieve the above purpose, the invention adopts the following technical scheme:
the automatic commercial vehicle warehousing system comprises a panoramic monitoring module, a positioning and sensing module, and a central control navigation module;
the panoramic monitoring module acquires panoramic monitoring images while the commercial vehicle reverses into storage;
the positioning and sensing module comprises a UWB TOF positioning unit, an inertial measurement unit (IMU) and a wheel encoder, and acquires the map basic data;
the central control navigation module comprises a central control host and a display screen, and presents the vehicle and its surroundings in real time so that the driver can monitor the commercial vehicle;
the automatic warehousing method of the commercial vehicle comprises the following steps:
s1: acquiring a panoramic monitoring image of a commercial vehicle during reversing and warehousing;
the method comprises the following substeps:
s11: acquiring real-time monitoring images of four directions when a commercial vehicle enters a warehouse through a camera;
four surround-view cameras are mounted on the front, rear, left and right of the commercial vehicle body; each camera is fitted with a fisheye lens oriented downward;
real-time monitoring images in the four directions are acquired by the four surround-view cameras while the commercial vehicle enters storage;
s12: synthesizing a panoramic monitoring image;
the method comprises the following substeps:
s121: performing inverse transformation on the monitoring image;
the panoramic monitoring module applies inverse perspective mapping (IPM) to each of the four real-time monitoring images, yielding four IPM images;
this inverse transformation restores the camera-perspective monitoring images to orthographic views that match the actual scene;
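The inverse perspective mapping step can be illustrated with a small sketch: a homography estimated from four image-to-ground point correspondences maps camera-perspective pixels to bird's-eye coordinates. The correspondences and function names below are illustrative assumptions, not values from the patent.

```python
import numpy as np

def solve_homography(src, dst):
    """Estimate the 3x3 homography H with dst ~ H @ src via the DLT method."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    H = Vt[-1].reshape(3, 3)     # null-space vector gives the homography
    return H / H[2, 2]

def warp_point(H, pt):
    """Map one pixel through the homography, including perspective division."""
    x, y, w = H @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# illustrative calibration: four image corners of a ground patch and the
# bird's-eye (orthographic) coordinates they should map to
img_pts = [(100, 400), (540, 400), (620, 480), (20, 480)]
ground_pts = [(0, 0), (400, 0), (400, 200), (0, 200)]
H = solve_homography(img_pts, ground_pts)
```

Warping every pixel of a camera frame with `H` produces the IPM image used in the next synthesis step.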
s122: synthesizing a real-time panoramic monitoring image;
the pixels of the four IPM images are merged into a single image, yielding the real-time panoramic monitoring image;
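Merging the four IPM views amounts to pasting each bird's-eye image into its region of a common canvas around the vehicle footprint; a minimal numpy sketch under an assumed layout (front band on top, rear band on the bottom, side strips in the middle band):

```python
import numpy as np

def stitch_panorama(front, rear, left, right):
    """Paste four bird's-eye views onto one canvas; the untouched centre
    of the middle band corresponds to the vehicle footprint."""
    h_front, width = front.shape
    h_side = left.shape[0]
    canvas = np.zeros((h_front + h_side + rear.shape[0], width),
                      dtype=front.dtype)
    canvas[:h_front, :] = front                                 # front view
    canvas[h_front + h_side:, :] = rear                         # rear view
    canvas[h_front:h_front + h_side, :left.shape[1]] = left     # left strip
    canvas[h_front:h_front + h_side, width - right.shape[1]:] = right  # right strip
    return canvas
```

A production stitcher would additionally blend the overlap regions between adjacent views; the hard paste here is a simplification.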
s2: acquiring map basic data;
the UWB TOF positioning unit of the positioning and sensing module comprises UWB transmitters and a UWB receiver; the transmitters are deployed at the parking-space markings of the warehouse or on equipment around the garage, and the receiver is mounted on the commercial vehicle body;
the method comprises the following substeps:
s21: transmitting data;
when the commercial vehicle enters storage, the UWB TOF receiver, the inertial measurement unit (IMU) and the wheel encoder collect sensor data in real time;
the sensor data collected by the UWB TOF receiver include the distances between the vehicle and the UWB transmitters and the corresponding response times during warehousing driving;
the sensor data collected by the IMU include the acceleration and attitude angles of the vehicle during warehousing driving;
the sensor data collected by the wheel encoder include the speed, wheel rotation rate and steering angle of the vehicle during warehousing driving;
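Given the distances from the UWB receiver to several transmitters of known position, the vehicle position can be recovered by linearized least squares (trilateration); a sketch with illustrative anchor coordinates:

```python
import numpy as np

def trilaterate(anchors, ranges):
    """2D least-squares position from distances to UWB anchors of known
    position; subtracting the first range equation from the others
    linearizes the system in the unknown position."""
    anchors = np.asarray(anchors, dtype=float)
    r = np.asarray(ranges, dtype=float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (r[0] ** 2 - r[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos
```

With three or more non-collinear anchors the system is well determined; extra anchors simply tighten the least-squares fit against ranging noise.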
transmitting the real-time panoramic monitoring image and the real-time sensor data to a central control navigation module;
s22: synchronizing the data of multiple sensors;
the system-on-chip (SoC) of the central control navigation module synchronizes the received sensor data;
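One common way to give multi-rate sensor streams the common time base this synchronization step requires is linear interpolation onto a reference timestamp grid; a sketch with illustrative sample rates:

```python
import numpy as np

def resample_to(t_ref, t_src, values_src):
    """Linearly interpolate a sensor stream onto a reference time base so
    that all streams share the same timestamp grid before fusion."""
    return np.interp(t_ref, t_src, values_src)

# illustrative streams: a 100 Hz IMU channel resampled onto 20 Hz camera frames
t_cam = np.arange(0.0, 1.0, 0.05)
t_imu = np.arange(0.0, 1.0, 0.01)
accel = np.sin(2.0 * np.pi * t_imu)
accel_on_cam = resample_to(t_cam, t_imu, accel)
```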
s23: preprocessing sensor data;
preprocessing real-time sensor data and a real-time panoramic monitoring image;
the preprocessing includes removing invalid values, checking sequence order, removing occluded and spurious points from the point cloud, and coordinate-system handling;
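A minimal sketch of the invalid-value removal and sequence check on one sensor channel (the range limits and function name are assumptions for illustration):

```python
import numpy as np

def preprocess_channel(timestamps, values, lo, hi):
    """Sequence check plus invalid-value removal for one sensor channel:
    reject out-of-order timestamps, drop NaN and out-of-range samples."""
    t = np.asarray(timestamps, dtype=float)
    v = np.asarray(values, dtype=float)
    if np.any(np.diff(t) < 0):
        raise ValueError("sensor sequence out of order")
    keep = np.isfinite(v) & (v >= lo) & (v <= hi)
    return t[keep], v[keep]
```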
s24: removing distortion;
performing de-distortion processing on the preprocessed sensor data and the real-time panoramic monitoring image to obtain map basic data;
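Distortion removal can be illustrated with a one-parameter radial model inverted by fixed-point iteration; real fisheye lenses need a full calibration, so the model and coefficient here are simplifying assumptions:

```python
import numpy as np

def distort(pt, k1):
    """One-parameter radial model: distorted = undistorted * (1 + k1 * r^2)."""
    r2 = pt[0] ** 2 + pt[1] ** 2
    return np.asarray(pt, dtype=float) * (1.0 + k1 * r2)

def undistort(pt_d, k1, iters=30):
    """Invert the radial model by fixed-point iteration:
    x_u <- x_d / (1 + k1 * r_u^2), starting from the distorted point."""
    pt_d = np.asarray(pt_d, dtype=float)
    pt_u = pt_d.copy()
    for _ in range(iters):
        r2 = pt_u[0] ** 2 + pt_u[1] ** 2
        pt_u = pt_d / (1.0 + k1 * r2)
    return pt_u
```

The iteration converges quickly for moderate distortion because the update is a contraction near the image centre.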
s25: front end matching;
the map basic data are matched against front-end reference data;
specifically, a warehousing run of the commercial vehicle is simulated in advance, and the real-time sensor data and warehousing monitoring images recorded synchronously during the simulated run constitute the front-end data;
the simulated real-time sensor data comprise all real-time UWB TOF receiver data, inertial measurement unit (IMU) data and wheel-encoder data recorded during the simulation;
taking the front-end data as the reference, the map basic data are matched against them; if the map basic data at a given position are inconsistent with the front-end data, a prompt is issued and the map basic data are calibrated in real time;
through this calibration, the warehousing speed, attitude and related quantities of the vehicle are adjusted in real time;
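The front-end matching step, comparing live map basic data against the recorded reference and flagging positions that need calibration, can be sketched as a simple tolerance check (the tolerance value is an assumption):

```python
def match_front_end(reference, live, tol=0.1):
    """Compare live map basic data against the recorded front-end reference;
    return the indices whose deviation exceeds `tol` (these would trigger a
    prompt and real-time calibration)."""
    return [i for i, (ref, cur) in enumerate(zip(reference, live))
            if abs(ref - cur) > tol]
```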
s3: constructing a global warehouse-in map;
the method comprises the following substeps:
s31: filtering and calculating;
performing filtering calculation on sensor data in the map basic data;
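The filtering calculation can be illustrated with a scalar Kalman filter that smooths one noisy sensor channel; the process- and measurement-noise values are illustrative:

```python
def kalman_filter_1d(measurements, q=1e-3, r=1e-1):
    """Scalar Kalman filter with a constant-state model: process noise q,
    measurement noise r; returns the filtered estimate after each sample."""
    x = measurements[0]   # initial state estimate
    p = 1.0               # initial estimate covariance
    estimates = []
    for z in measurements:
        p += q                    # predict: covariance grows by process noise
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update with the new measurement
        p *= 1.0 - k              # covariance shrinks after the update
        estimates.append(x)
    return estimates
```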
s32: extracting features;
the real-time panoramic monitoring image in the map basic data is segmented by a U-Net, a convolutional neural network method, and the features it contains are extracted;
the features include lane lines, parking-space lines, obstacles, walls, etc.;
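Once the U-Net has produced a segmentation mask, extracting a feature amounts to collecting the pixel coordinates of its class; the class ids below are illustrative, not from the patent:

```python
import numpy as np

# illustrative class ids for the segmentation mask (not from the patent)
LANE_LINE, SPACE_LINE, OBSTACLE, WALL = 1, 2, 3, 4

def extract_feature_pixels(mask, class_id):
    """Return the (row, col) coordinates of every pixel the segmentation
    network labelled with `class_id`."""
    return np.argwhere(mask == class_id)

# toy 4x4 mask with one lane-line pixel and one obstacle pixel
mask = np.zeros((4, 4), dtype=np.int32)
mask[1, 1] = LANE_LINE
mask[2, 3] = OBSTACLE
```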
s33: constructing a global warehouse-in map;
the pixels corresponding to the extracted features are projected into 3D space, and the projected feature points are converted into the world coordinate system using odometry;
the world-frame feature points observed within a given range while the commercial vehicle reverses into storage are merged into a local map, yielding a series of local maps;
finally, the series of local maps is merged into the global warehousing map;
the global warehousing map is imported into the central control navigation module, whose display screen shows the vehicle and its surroundings in real time according to the map, allowing the driver to monitor the automatic warehousing of the commercial vehicle;
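Merging the world-frame feature points of successive local maps into one global map can be sketched as accumulation into an occupancy grid (the resolution and grid size are assumptions):

```python
import numpy as np

def merge_local_maps(local_maps, resolution=0.1, size=100):
    """Accumulate world-frame feature points (x, y in metres) from a series
    of local maps into one global occupancy grid; each cell counts how many
    feature points fell into it."""
    grid = np.zeros((size, size), dtype=np.int32)
    for points in local_maps:
        for x, y in points:
            i, j = int(x / resolution), int(y / resolution)  # metres -> cell
            if 0 <= i < size and 0 <= j < size:              # clip to grid
                grid[i, j] += 1
    return grid
```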
compared with the prior art, the invention has the beneficial effects that:
according to the method, four surround-view cameras mounted on the front, rear, left and right of the commercial vehicle body capture real-time monitoring images during warehousing; after inverse perspective transformation, the four images are merged into a single real-time panoramic monitoring image; this yields a 3D view, enlarges the perception range, and overcomes the distortion and seam loss of conventional 360-degree surround-view stitching;
furthermore, by generating the global warehousing map with SLAM (simultaneous localization and mapping) together with the inertial measurement unit (IMU) and the wheel encoder, the vehicle is positioned accurately, the monitoring field of view during warehousing is widened, the inability to obtain all-round environmental information in narrow, occluded and cluttered parking lots is overcome, and the experience demanded of the commercial vehicle driver is reduced;
Description of the Embodiments
For a further understanding of the objects, construction, features, and functions of the invention, reference should be made to the following detailed description of the preferred embodiments.
the automatic commercial vehicle warehousing system comprises a panoramic monitoring module, a positioning and sensing module, and a central control navigation module;
the panoramic monitoring module acquires panoramic monitoring images while the commercial vehicle reverses into storage;
the positioning and sensing module comprises a UWB TOF positioning unit, an inertial measurement unit (IMU) and a wheel encoder, and acquires the map basic data;
the central control navigation module comprises a central control host and a display screen, and presents the vehicle and its surroundings in real time so that the driver can monitor the commercial vehicle;
the automatic warehousing method of the commercial vehicle comprises the following steps:
s1: acquiring a panoramic monitoring image of a commercial vehicle during reversing and warehousing;
the method comprises the following substeps:
s11: acquiring real-time monitoring images of four directions when a commercial vehicle enters a warehouse through a camera;
four surround-view cameras are mounted on the front, rear, left and right of the commercial vehicle body; each camera is fitted with a fisheye lens oriented downward;
real-time monitoring images in the four directions are acquired by the four surround-view cameras while the commercial vehicle enters storage;
s12: synthesizing a panoramic monitoring image;
the method comprises the following substeps:
s121: performing inverse transformation on the monitoring image;
the panoramic monitoring module applies inverse perspective mapping (IPM) to each of the four real-time monitoring images, yielding four IPM images;
this inverse transformation restores the camera-perspective monitoring images to orthographic views that match the actual scene;
s122: synthesizing a real-time panoramic monitoring image;
the pixels of the four IPM images are merged into a single image, yielding the real-time panoramic monitoring image;
merging the four directional monitoring views into one image enlarges the perception range and overcomes the inability to obtain all-round environmental information in narrow, occluded and cluttered parking lots;
s2: acquiring map basic data;
the UWB TOF positioning unit of the positioning and sensing module comprises UWB transmitters and a UWB receiver; the transmitters are deployed at the parking-space markings of the warehouse or on equipment around the garage, and the receiver is mounted on the commercial vehicle body;
the method comprises the following substeps:
s21: transmitting data;
when the commercial vehicle enters storage, the UWB TOF receiver, the inertial measurement unit (IMU) and the wheel encoder collect sensor data in real time;
the sensor data collected by the UWB TOF receiver include the distances between the vehicle and the UWB transmitters and the corresponding response times during warehousing driving;
the sensor data collected by the IMU include the acceleration and attitude angles of the vehicle during warehousing driving;
the sensor data collected by the wheel encoder include the speed, wheel rotation rate and steering angle of the vehicle during warehousing driving;
transmitting the real-time panoramic monitoring image and the real-time sensor data to a central control navigation module;
s22: synchronizing the data of multiple sensors;
the system-on-chip (SoC) of the central control navigation module synchronizes the received sensor data;
this synchronization gives all real-time sensor data a common time reference and data format, facilitating the subsequent data fusion and processing;
s23: preprocessing sensor data;
preprocessing real-time sensor data and a real-time panoramic monitoring image;
the preprocessing includes removing invalid values, checking sequence order, removing occluded and spurious points from the point cloud, and coordinate-system handling;
s24: removing distortion;
performing de-distortion processing on the preprocessed sensor data and the real-time panoramic monitoring image to obtain map basic data;
s25: front end matching;
matching the map basic data with front-end data;
further, a warehousing run of the commercial vehicle is simulated in advance, and the real-time sensor data and real-time warehousing monitoring images collected synchronously during the simulated run constitute the front-end data;
the real-time sensor data of the simulated warehousing refer to all of the real-time UWB TOF sensor receiving-end data, real-time inertial navigation unit IMU data and real-time wheel encoder data collected during the simulation;
the map basic data are matched against the front-end data, taking the front-end data as the reference; if the map basic data at a given position are inconsistent with the front-end data, a corresponding prompt is given and the map basic data are calibrated in real time;
calibrating the map basic data adjusts the speed, the attitude and the like of the warehousing vehicle in real time;
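the front-end matching and real-time calibration described above might look like the following sketch, assuming scalar map values and a hypothetical tolerance (neither is specified in the disclosure):

```python
# Illustrative front-end matching with real-time calibration; scalar map
# values and the tolerance are hypothetical.
def match_and_calibrate(map_data, front_end, tol=0.05):
    """Compare map basic data against reference front-end data."""
    prompts, calibrated = [], []
    for i, (m, ref) in enumerate(zip(map_data, front_end)):
        if abs(m - ref) > tol:
            prompts.append(f"position {i}: map={m} differs from reference={ref}")
            calibrated.append(ref)  # calibrate toward the reference value
        else:
            calibrated.append(m)
    return calibrated, prompts

calibrated, prompts = match_and_calibrate([1.0, 2.2], [1.0, 2.0])
```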
s3: constructing a global warehouse-in map;
the method comprises the following substeps:
s31: filtering and calculating;
performing filtering calculation on sensor data in the map basic data;
the filtering calculation cleans the sensor data in the map basic data and eliminates interference signals;
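as one possible filtering calculation (the disclosure does not name a specific filter), a first-order low-pass filter suppresses interference spikes while tracking the underlying signal; the sample stream and smoothing factor are made up:

```python
# Illustrative first-order low-pass filter; the sample stream and the
# smoothing factor alpha are made up.
def low_pass(samples, alpha=0.3):
    """Exponentially smooth a sample stream to suppress interference."""
    out, y = [], None
    for x in samples:
        y = x if y is None else alpha * x + (1 - alpha) * y
        out.append(y)
    return out

smoothed = low_pass([1.0, 1.0, 10.0, 1.0, 1.0])  # the spike is damped to 3.7
```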
s32: extracting features;
segmenting the real-time panoramic monitoring image in the map basic data with the U-Net deep-learning method, a convolutional neural network approach, and extracting the features contained in the real-time panoramic monitoring image;
the features include lane lines, parking-slot lines, obstacles, walls and the like;
extracting these features enables parking-space detection, obstacle avoidance and driving-space planning;
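downstream of the U-Net segmentation, per-class feature masks can be extracted from the network's per-pixel scores; the class list below is assumed from the feature types named above, and random scores stand in for real network output:

```python
# Illustrative extraction of per-class masks from segmentation logits;
# the class list is assumed from the feature types named in the text.
import numpy as np

CLASSES = ["background", "lane_line", "slot_line", "obstacle", "wall"]

def extract_features(logits):
    """logits: (C, H, W) per-pixel class scores -> one boolean mask per class."""
    labels = logits.argmax(axis=0)  # (H, W) map of winning class ids
    return {name: labels == i for i, name in enumerate(CLASSES)}

rng = np.random.default_rng(0)
masks = extract_features(rng.normal(size=(5, 4, 4)))
drivable = ~(masks["obstacle"] | masks["wall"])  # input to space planning
```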
s33: constructing a global warehouse-in map;
projecting the pixel points corresponding to the extracted features into 3D space, and converting the feature pixel points projected into 3D space into feature pixel points in the world coordinate system based on odometry;
the feature pixel points in the world coordinate system that fall within a certain range while the commercial vehicle reverses into the warehouse are integrated into one local map, yielding a series of local maps;
by constructing a series of local maps of the warehousing run and detecting loop closures against them, the drift error accumulated by the odometry over long operation can be eliminated;
finally, the series of local maps is integrated into a global warehousing map;
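the conversion of feature points into the world coordinate system via the odometry pose, followed by merging of local maps, can be sketched in 2D (an SE(2) rigid transform); the poses and points below are illustrative, not from the disclosure:

```python
# Illustrative SE(2) transform of ground-plane feature points into the
# world frame, then merging two local maps; poses and points are made up.
import numpy as np

def to_world(points_local, x, y, yaw):
    """Rotate by yaw and translate by (x, y): vehicle frame -> world frame."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s], [s, c]])
    return points_local @ R.T + np.array([x, y])

local_a = np.array([[1.0, 0.0], [2.0, 0.0]])      # local map at pose A
local_b = np.array([[1.0, 0.0]])                  # local map at pose B
global_map = np.vstack([
    to_world(local_a, 0.0, 0.0, 0.0),
    to_world(local_b, 2.0, 0.0, np.pi / 2),       # B: rotated 90 deg, shifted
])
```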
the global warehousing map is imported into the central control navigation module, and the display screen of the central control navigation module presents the states of the vehicle and the surrounding environment in real time according to the global warehousing map, enabling the driver to monitor the automatic warehousing of the commercial vehicle;
The invention has been described with respect to the above embodiments; however, the above embodiments are merely examples of practicing the invention. It should be noted that the disclosed embodiments do not limit the scope of the invention. On the contrary, the intention is to cover all modifications, equivalents, and alternatives falling within the spirit and scope of the invention.

Claims (6)

1. A commercial vehicle automatic warehousing method based on visual SLAM and UWB perception fusion is characterized in that: the automatic commercial vehicle warehousing system comprises a panoramic monitoring module, a positioning sensing module and a central control navigation module;
the panoramic monitoring module is used for acquiring panoramic monitoring images of the commercial vehicle when the commercial vehicle is backed up and put in storage;
the positioning sensing module comprises a UWB TOF positioning unit, an inertial navigation unit (IMU) and a wheel encoder, and is used for acquiring map basic data;
the central control navigation module comprises a central control host and a display screen, and is used for presenting the states of the vehicle and the surrounding environment in real time, enabling the driver to monitor the commercial vehicle;
the specific automatic warehouse-in method of the commercial vehicle comprises the following steps:
s1: acquiring a panoramic monitoring image of a commercial vehicle during reversing and warehousing;
acquiring real-time monitoring images in four directions when a commercial vehicle is put in storage through a camera, and synthesizing the real-time monitoring images in the four directions into a real-time panoramic monitoring image;
s2: acquiring map basic data;
when the commercial vehicle is put in storage, the positioning sensing module collects sensor data in real time and sends the real-time panoramic monitoring image and the real-time sensor data to the central control navigation module;
the central control navigation module receives the real-time panoramic monitoring image and the real-time sensor data and synchronously processes the sensor data;
after the synchronization processing is completed, preprocessing and de-distorting the real-time sensor data and the real-time panoramic monitoring image to obtain map basic data;
s3: constructing a global warehouse-in map;
extracting features in the map basic data;
integrating the pixel points of the features extracted in a certain range during reversing and warehousing of the commercial vehicle into a local map to obtain a series of local maps;
finally integrating a series of local maps into a global warehouse-in map;
and the global warehousing map is imported into the central control navigation module, and the display screen of the central control navigation module presents the states of the vehicle and the surrounding environment in real time according to the global warehousing map, enabling the driver to monitor the automatic warehousing of the commercial vehicle.
2. The automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion according to claim 1, wherein the method comprises the following steps:
step S1 comprises the following sub-steps:
s11: acquiring real-time monitoring images of four directions when a commercial vehicle enters a warehouse through a camera;
four surround-view cameras are respectively mounted on the front, rear, left and right of the commercial vehicle body; each surround-view camera is fitted with a fisheye lens oriented downward;
acquiring real-time monitoring images in the four directions during warehousing of the commercial vehicle through the four surround-view cameras;
s12: and synthesizing the panoramic monitoring image.
3. The automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion as claimed in claim 2, wherein the method comprises the following steps:
step S12 comprises the following sub-steps:
s121: performing inverse transformation on the monitoring image;
the panoramic monitoring module respectively carries out inverse perspective transformation processing on the real-time monitoring images in four directions to obtain an inverse perspective transformation IPM image of the real-time monitoring images;
the inverse transformation processing of the four real-time monitoring images restores the camera-view monitoring images to orthographic-view images that match the actual scene;
s122: synthesizing a real-time panoramic monitoring image;
and integrating the pixel points in the four inverse perspective transformation IPM images into one image to obtain a real-time panoramic monitoring image.
4. The automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion according to claim 1, wherein the method comprises the following steps:
further, in step S2, a commercial vehicle warehouse entry simulation is performed in advance, real-time sensor data and a real-time warehouse entry monitoring image during the simulation of warehouse entry are synchronously acquired, and the acquired real-time sensor data and real-time warehouse entry monitoring image are front-end data;
the real-time sensor data for simulating warehousing refer to all real-time UWB TOF sensor receiving end data, real-time inertial navigation unit IMU data and real-time wheel encoder data during simulating warehousing;
matching the map basic data with the front-end data by taking the front-end data as a reference; if the map basic data at the corresponding position are consistent with the front-end data, no prompt is given; if the map basic data at the corresponding position are inconsistent with the front-end data, a corresponding prompt is given and the map basic data are calibrated in real time.
5. The automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion according to claim 1, wherein the method comprises the following steps:
step S3 comprises the following sub-steps:
s31: filtering and calculating;
performing filtering calculation on sensor data in the map basic data;
s32: extracting features;
segmenting a real-time panoramic monitoring image in the map basic data by a U-Net deep learning method in a convolutional neural network method, and extracting features contained in the real-time panoramic monitoring image;
s33: constructing a global warehouse-in map;
projecting the pixel points corresponding to the extracted features into a 3D space, and converting the feature pixel points projected into the 3D space into feature pixel points under a world coordinate system based on an odometer;
integrating the feature pixel points in the world coordinate system that fall within a certain range while the commercial vehicle reverses into the warehouse into one local semantic map, obtaining a series of local semantic maps;
finally integrating a series of local semantic maps into a global warehouse-in map;
and the global warehousing map is imported into the central control navigation module, and the display screen of the central control navigation module presents the states of the vehicle and the surrounding environment in real time according to the global warehousing map, enabling the driver to monitor the automatic warehousing of the commercial vehicle.
6. The automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion according to claim 1, wherein the method comprises the following steps:
in step S3, the features include lane lines, parking-slot lines, obstacles and walls.
CN202311222394.0A 2023-09-21 2023-09-21 Automatic commercial vehicle warehousing method based on visual SLAM and UWB perception fusion Pending CN117429410A (en)


Publications (1)

Publication Number Publication Date
CN117429410A true CN117429410A (en) 2024-01-23

Family

ID=89548787



Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination