CN109947108B - Method for predicting road condition in front of mobile robot - Google Patents

Method for predicting road condition in front of mobile robot

Info

Publication number
CN109947108B
Authority
CN
China
Prior art keywords
mobile robot
stereo camera
binocular stereo
calculating
binary image
Prior art date
Legal status
Active
Application number
CN201910255755.9A
Other languages
Chinese (zh)
Other versions
CN109947108A (en)
Inventor
刘瑜 (Liu Yu)
Current Assignee
Shenzhen Yuhetian Smart City Operation Group Co ltd
Original Assignee
Shenzhen Qifeng Intelligent Robot Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Qifeng Intelligent Robot Technology Co ltd
Priority to CN201910255755.9A
Publication of CN109947108A
Application granted
Publication of CN109947108B
Legal status: Active
Anticipated expiration

Abstract

The method for predicting the road condition in front of a mobile robot uses a binocular stereo camera mounted at the front of the mobile robot, with its optical axis parallel to the chassis of the mobile robot, and a processor connected to the binocular stereo camera. The processor is configured with the prediction method, which comprises the following steps: obtain the depth information z = f1(x, y); calculate the projection angle θ = arctan(y/f) and the perpendicular distance l = z·tanθ from the ground point to the optical axis; calculate the unevenness f2(x, y) = h − l and binarize it into f3(x, y); apply erosion and dilation operations to the binary map f3(x, y) to optimize it, and remove the stray data in f2(x, y) using the Hadamard (element-wise) product of the two matrices; finally, along the direction of travel, calculate the area s of the uneven road surface and the average unevenness. Based on binocular stereo vision, the method provides the mobile robot with information about the road ahead and ensures safe operation.

Description

Method for predicting road condition in front of mobile robot
Technical Field
The invention relates to a method for predicting a road condition in front of a mobile robot, belonging to the field of image processing of machine vision.
Background
Most outdoor and indoor mobile robots are driven by three or four wheels. Exceptions such as the bipedal and quadrupedal walking robots of Boston Dynamics can cope with uneven ground under laboratory conditions or in limited real-world environments, but most mobile robots work on the premise of a flat road surface, and accidents can occur when the ground is uneven. A household robot vacuum cleaner, for example, places an infrared proximity sensor ahead of its driving wheels and must stop to avoid falling when the sensor can no longer detect the ground. However, this approach is of limited general applicability.
Machine vision, and the binocular stereo vision built on it, offer a wide detection range and rich information and can be used to detect the road surface condition ahead.
Disclosure of Invention
To address these problems, the invention provides a method for predicting the road condition in front of a mobile robot, which helps the robot detect how level the ground ahead is.
The technical solution adopted by the invention to solve this problem is as follows:
The method for predicting the road condition in front of the mobile robot uses a binocular stereo camera mounted at the front of the mobile robot, with focal length f, baseline width b and mounting height h, whose optical axis is parallel to the chassis of the mobile robot, and a processor connected to the binocular stereo camera. The processor is configured with the method for predicting the road condition ahead, which comprises the following steps:
(1) The processor acquires the image pair fL and fR from the binocular stereo camera and forms the depth information z = f1(x, y) = f·b/d, where d is the disparity at position (x, y) computed from the image pair fL and fR, x and y are image-plane coordinates, and z is the corresponding depth;
(2) For an imaging point (x, y) with depth z = f1(x, y), the projection angle is θ = arctan(y/f); the perpendicular distance from the ground point corresponding to the imaging point to the optical axis of the binocular stereo camera is l = z·tanθ = z·y/f;
(3) Calculate the unevenness f2(x, y) = h − l; if |h − l| > T, set f3(x, y) = 1, otherwise set f3(x, y) = 0, where the threshold T is the maximum unevenness that the mobile robot can cross;
(4) Apply erosion followed by dilation (an opening) to the binary map f3(x, y) to remove stray points, then apply dilation followed by erosion (a closing) to fill small holes, finally obtaining an optimized binary map f3(x, y); compute the Hadamard product of the binary map f3(x, y) and f2(x, y) to remove the stray data in f2(x, y), i.e. f2(x, y) = f3(x, y) * f2(x, y);
(5) The mobile robot scans the road surface condition on f3(x, y) along its direction of travel; the area of the uneven road surface is s = Σ f3(x, y)·z·spix/f and the average unevenness is Σ f2(x, y) / Σ f3(x, y), where the range of x and y is determined by the robot's direction of motion and spix is the sensor pixel size.
The invention has the following beneficial effects: 1. the levelness of the ground ahead can be predicted in advance, providing environmental information for the motion and navigation control of the mobile robot; 2. the computation is fast and ultimately produces a detailed description of the ground unevenness.
Drawings
FIG. 1 is a schematic exterior view of a mobile robot;
FIG. 2 is a schematic diagram of the unevenness calculation;
FIG. 3 is a schematic illustration of the unevenness map.
Detailed Description
The invention is further described below with reference to the accompanying drawings:
Referring to FIGS. 1 to 3, the method for predicting the road condition ahead of the mobile robot uses a binocular stereo camera mounted at the front of the mobile robot, with focal length f, baseline width b and mounting height h, whose optical axis is parallel to the chassis of the mobile robot. The binocular stereo camera is part of the robot's basic configuration: it can output ordinary monocular images and can also provide depth information for obstacle-avoidance navigation.
The system further comprises a processor connected to the binocular stereo camera; the processor is configured with the method for predicting the road condition ahead, which comprises the following steps:
(1) The processor acquires the image pair fL and fR from the binocular stereo camera and forms the depth information z = f1(x, y) = f·b/d, where d is the disparity at position (x, y) computed from the image pair fL and fR, x and y are image-plane coordinates, and z is the corresponding depth;
The processor calculates the depth information according to the disparity principle: using the parameters of the binocular stereo camera, z = f1(x, y) = f·b/d is obtained, where d is the disparity at position (x, y) computed from the image pair fL and fR.
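By way of illustration, the following is a minimal Python sketch of this step. The patent does not specify how the disparity d is computed; the OpenCV SGBM matcher and its settings used here are assumptions, as are the parameter names.

```python
import cv2
import numpy as np

def depth_from_stereo(img_left, img_right, f_px, baseline_m):
    """Step (1): depth map z = f*b/d from a rectified grayscale image pair."""
    matcher = cv2.StereoSGBM_create(minDisparity=0,
                                    numDisparities=128,   # must be a multiple of 16
                                    blockSize=7)
    # StereoSGBM returns a fixed-point disparity map scaled by 16
    d = matcher.compute(img_left, img_right).astype(np.float32) / 16.0
    d[d <= 0] = np.nan                 # mark invalid / unmatched pixels
    return f_px * baseline_m / d       # z = f*b/d, per pixel (x, y)
```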
(2) For an imaging point (x, y) with depth z = f1(x, y), the projection angle is θ = arctan(y/f); the perpendicular distance from the ground point corresponding to the imaging point to the optical axis of the binocular stereo camera is l = z·tanθ = z·y/f;
As shown in FIG. 2, the perpendicular distance l can be calculated from the tangent relation of the right triangle.
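Since tan(arctan(y/f)) = y/f, the distance can be computed directly as l = z·y/f, as in the sketch below. Measuring the image coordinate y in pixels relative to the principal point row cy is an assumption about the camera model.

```python
import numpy as np

def distance_to_axis(z, f_px, cy):
    """Step (2): l(x, y) = z * y / f for every pixel of the depth map z."""
    rows = np.arange(z.shape[0], dtype=np.float32)
    y = (rows - cy)[:, None]       # vertical coordinate relative to the optical axis
    return z * y / f_px            # broadcasts over the image columns
```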
(3) Calculate the unevenness f2(x, y) = h − l; if |h − l| > T, set f3(x, y) = 1, otherwise set f3(x, y) = 0, where the threshold T is the maximum unevenness that the mobile robot can cross;
The unevenness f2(x, y) is calculated from the depth map f1(x, y); the threshold T filters out the influence of computation noise, and binarization yields f3(x, y).
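A minimal sketch of this step, assuming the camera height h and the crossable-unevenness threshold T are supplied by the caller:

```python
import numpy as np

def unevenness_maps(l, h, T):
    """Step (3): signed unevenness f2 = h - l and its binarization f3."""
    f2 = h - l                               # > 0 for a bump, < 0 for a pit
    f3 = (np.abs(f2) > T).astype(np.uint8)   # 1 where |h - l| > T, else 0
    return f2, f3
```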
(4) Apply erosion followed by dilation (an opening) to the binary map f3(x, y) to remove stray points, then apply dilation followed by erosion (a closing) to fill small holes, finally obtaining an optimized binary map f3(x, y); compute the Hadamard product of the binary map f3(x, y) and f2(x, y) to remove the stray data in f2(x, y), i.e. f2(x, y) = f3(x, y) * f2(x, y);
The binary map f3(x, y) is optimized by removing stray points and filling missed-detection points, and f2(x, y) is then cleaned using the Hadamard matrix product.
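A sketch of this step using OpenCV morphology: an opening (erosion then dilation) removes isolated stray points, a closing (dilation then erosion) fills small holes, and the Hadamard (element-wise) product masks f2 with the cleaned f3. The 3×3 structuring element is an assumption; the patent does not specify a kernel size.

```python
import cv2
import numpy as np

def clean_maps(f2, f3):
    """Step (4): morphological opening and closing of f3, then mask f2 with it."""
    kernel = np.ones((3, 3), np.uint8)
    f3 = cv2.morphologyEx(f3, cv2.MORPH_OPEN, kernel)    # erode then dilate
    f3 = cv2.morphologyEx(f3, cv2.MORPH_CLOSE, kernel)   # dilate then erode
    f2 = f3 * f2                                         # Hadamard product, f3 in {0, 1}
    return f2, f3
```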
(5) The mobile robot scans the road surface condition on f3(x, y) along its direction of travel; the area of the uneven road surface is s = Σ f3(x, y)·z·spix/f and the average unevenness is Σ f2(x, y) / Σ f3(x, y), where the range of x and y is determined by the robot's direction of motion and spix is the sensor pixel size.
Finally, from the optimized unevenness map f3(x, y), the area and depth of each protrusion or depression can be calculated, where z·spix/f is the imaging footprint corresponding to a single sensor pixel.
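A sketch of this final step, following the patent's formulas s = Σ f3·z·spix/f and Σ f2 / Σ f3. Restricting the sums to the robot's travel corridor is expressed here as an optional boolean mask, which is an assumption about how "the range of x, y" would be supplied.

```python
import numpy as np

def road_statistics(f2, f3, z, f_px, s_pix, roi=None):
    """Step (5): uneven area s and average unevenness over the scanned region."""
    if roi is None:
        roi = np.ones_like(f3, dtype=bool)     # default: evaluate the whole image
    f2, f3, z = f2[roi], f3[roi], z[roi]
    area = np.nansum(f3 * z * s_pix / f_px)    # per-pixel footprint as in the patent
    n = f3.sum()
    avg = np.nansum(f2) / n if n else 0.0      # signed mean unevenness over marked pixels
    return area, avg
```

Chained in order (stereo depth → distance to axis → unevenness maps → morphological cleanup → road statistics), these sketches form one possible end-to-end pipeline under the stated assumptions.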

Claims (1)

1. A method for predicting the road condition in front of a mobile robot, using a binocular stereo camera mounted at the front of the mobile robot, with focal length f, baseline width b and mounting height h, characterized in that: the optical axis of the binocular stereo camera is parallel to the chassis of the mobile robot; the system further comprises a processor connected to the binocular stereo camera, and the processor is configured with the method for predicting the road condition ahead, which comprises the following steps:
(1) The processor acquires the image pair fL and fR from the binocular stereo camera and forms the depth information z = f1(x, y) = f·b/d, where d is the disparity at position (x, y) computed from the image pair fL and fR, x and y are image-plane coordinates, and z is the corresponding depth;
(2) For an imaging point (x, y) with depth z = f1(x, y), the projection angle is θ = arctan(y/f); the perpendicular distance from the ground point corresponding to the imaging point to the optical axis of the binocular stereo camera is l = z·tanθ = z·y/f;
(3) Calculate the unevenness f2(x, y) = h − l; if |h − l| > T, set f3(x, y) = 1, otherwise set f3(x, y) = 0, where the threshold T is the maximum unevenness that the mobile robot can cross;
(4) Apply erosion followed by dilation (an opening) to the binary map f3(x, y) to remove stray points, then apply dilation followed by erosion (a closing) to fill small holes, finally obtaining an optimized binary map f3(x, y); compute the Hadamard product of the binary map f3(x, y) and f2(x, y) to remove the stray data in f2(x, y), i.e. f2(x, y) = f3(x, y) * f2(x, y);
(5) The mobile robot scans the road surface condition on f3(x, y) along its direction of travel; the area of the uneven road surface is s = Σ f3(x, y)·z·spix/f and the average unevenness is Σ f2(x, y) / Σ f3(x, y), where the range of x and y is determined by the robot's direction of motion and spix is the sensor pixel size.
CN201910255755.9A 2019-04-01 2019-04-01 Method for predicting road condition in front of mobile robot Active CN109947108B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910255755.9A CN109947108B (en) 2019-04-01 2019-04-01 Method for predicting road condition in front of mobile robot

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910255755.9A CN109947108B (en) 2019-04-01 2019-04-01 Method for predicting road condition in front of mobile robot

Publications (2)

Publication Number Publication Date
CN109947108A CN109947108A (en) 2019-06-28
CN109947108B true CN109947108B (en) 2021-11-26

Family

ID=67012456

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910255755.9A Active CN109947108B (en) 2019-04-01 2019-04-01 Method for predicting road condition in front of mobile robot

Country Status (1)

Country Link
CN (1) CN109947108B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112880599B (en) * 2021-01-26 2022-05-20 武汉市市政建设集团有限公司 Roadbed flatness detection system based on four-foot robot and working method

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103955920A (en) * 2014-04-14 2014-07-30 桂林电子科技大学 Binocular vision obstacle detection method based on three-dimensional point cloud segmentation
CN106978774A (en) * 2017-03-22 2017-07-25 中公高科养护科技股份有限公司 A kind of road surface pit automatic testing method

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3842747A1 (en) * 2015-02-10 2021-06-30 Mobileye Vision Technologies Ltd. Sparse map for autonomous vehicle navigation

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103955920A (en) * 2014-04-14 2014-07-30 桂林电子科技大学 Binocular vision obstacle detection method based on three-dimensional point cloud segmentation
CN106978774A (en) * 2017-03-22 2017-07-25 中公高科养护科技股份有限公司 A kind of road surface pit automatic testing method

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
Agreement of driving simulator and on-road driving performance in patients with binocular visual field loss; J. Ungewiss et al.; Graefes Archive for Clinical & Experimental Ophthalmology; 2018-12-31; full text *
Erosion and dilation algorithms for image analysis systems (图象分析系统的腐蚀与膨胀算法); Xu Longlü; 《哈尔滨科学技术大学科报》; 1987-12-31; full text *
Three-dimensional terrain reconstruction based on binocular stereo vision (基于双目立体视觉的三维地形重构); Xing Huaixue et al.; Geography and Geo-Information Science (地理与地理信息科学); 2007-03-30 (No. 02); full text *

Also Published As

Publication number Publication date
CN109947108A (en) 2019-06-28

Similar Documents

Publication Publication Date Title
US8736820B2 (en) Apparatus and method for distinguishing ground and obstacles for autonomous mobile vehicle
AU2015234395B2 (en) Real-time range map generation
US6906620B2 (en) Obstacle detection device and method therefor
JP4676373B2 (en) Peripheral recognition device, peripheral recognition method, and program
CN101067557A (en) Environment sensing one-eye visual navigating method adapted to self-aid moving vehicle
CN108481327B (en) Positioning device, positioning method and robot for enhancing vision
JP5982298B2 (en) Obstacle detection device and obstacle detection method
CN106569225B (en) Unmanned vehicle real-time obstacle avoidance method based on ranging sensor
KR102056147B1 (en) Registration method of distance data and 3D scan data for autonomous vehicle and method thereof
JP6464410B2 (en) Obstacle determination device and obstacle determination method
WO2019031137A1 (en) Roadside object detection device, roadside object detection method, and roadside object detection system
JPWO2017169365A1 (en) Road surface displacement detection device and suspension control method
JP3961584B2 (en) Lane marking detector
CN109919139B (en) Road surface condition rapid detection method based on binocular stereo vision
JP6895371B2 (en) Information processing device and suspension control method
JP5957359B2 (en) Stereo image processing apparatus and stereo image processing method
CN109947108B (en) Method for predicting road condition in front of mobile robot
JP6204782B2 (en) Off-road dump truck
WO2017188158A1 (en) Device for detecting road surface state
CN109883393B (en) Method for predicting front gradient of mobile robot based on binocular stereo vision
CN109903325B (en) Ground accurate description method based on stereoscopic vision depth information
WO2022059289A1 (en) Vehicle orientation estimation system and vehicle orientation estimation method
JP4106163B2 (en) Obstacle detection apparatus and method
KR20160063039A (en) Method of Road Recognition using 3D Data
CN110097592B (en) Semantic description method of ground information

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20211110

Address after: 518000 304a, floor 3, Qinghai building, No. 7043, Beihuan Avenue, Kangxin community, Lianhua street, Futian District, Shenzhen, Guangdong Province

Applicant after: Shenzhen Qifeng Intelligent Robot Technology Co.,Ltd.

Address before: 310013 No. 256, floor 6, building 2, Huahong building, 248 Tianmushan Road, Xihu District, Hangzhou, Zhejiang

Applicant before: HANGZHOU JINGYI INTELLIGENT SCIENCE & TECHNOLOGY Co.,Ltd.

GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20220704

Address after: 518000 a-1801, a1802, a1803, Haisong building, Tairan 9th Road, chegongmiao, Futian District, Shenzhen City, Guangdong Province (office only)

Patentee after: Shenzhen yuhetian smart city operation group Co.,Ltd.

Address before: 304a, 3rd floor, Qinghai building, 7043 Beihuan Avenue, Kangxin community, Lianhua street, Futian District, Shenzhen, Guangdong 518000

Patentee before: Shenzhen Qifeng Intelligent Robot Technology Co.,Ltd.

CP02 Change in the address of a patent holder

Address after: C315, Building C, Huafeng International Robot Industrial Park, Hangcheng Avenue, Nanchang Community, Xixiang Street, Baoan District, Shenzhen City, Guangdong Province, 518100

Patentee after: Shenzhen yuhetian smart city operation group Co.,Ltd.

Address before: 518000 a-1801, a1802, a1803, Haisong building, Tairan 9th Road, chegongmiao, Futian District, Shenzhen City, Guangdong Province (office only)

Patentee before: Shenzhen yuhetian smart city operation group Co.,Ltd.