CN108804824B - Terrain recognition method - Google Patents

Terrain recognition method

Info

Publication number
CN108804824B
CN108804824B (application CN201810599159.8A)
Authority
CN
China
Prior art keywords
sample
terrain
sample set
samples
phi
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810599159.8A
Other languages
Chinese (zh)
Other versions
CN108804824A (en)
Inventor
王翠凤
梅明亮
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Guangdong Yingjing Innovation Technology Co.,Ltd.
Original Assignee
Guangdong Yingke Robot Industry Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Guangdong Yingke Robot Industry Co Ltd filed Critical Guangdong Yingke Robot Industry Co Ltd
Priority to CN201810599159.8A priority Critical patent/CN108804824B/en
Publication of CN108804824A publication Critical patent/CN108804824A/en
Application granted granted Critical
Publication of CN108804824B publication Critical patent/CN108804824B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 30/00: Computer-aided design [CAD]
    • G06F 30/20: Design optimisation, verification or simulation
    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01C: MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C 7/00: Tracing profiles
    • G01C 7/02: Tracing profiles of land surfaces

Abstract

The invention discloses a terrain recognition method comprising an offline training part and an online classification part. The advantages of the technical scheme are: 1) terrain classification is based on analysis of joint pressure data, so it is not easily affected by external environmental interference and has strong environmental adaptability; 2) feature extraction is performed in the time domain, giving high computational efficiency; 3) the offline-collected samples are condensed into a representative sample set, reducing the computation time of the subsequent classification algorithm; 4) by analyzing historical terrain prediction data, misclassified samples can be identified and the classifier corrected online to improve its performance.

Description

Terrain recognition method
Technical Field
The invention relates to the technical field of robots, in particular to a terrain identification method.
Background
Wheeled robots are often hindered by obstacles and therefore tend to detour to reach a destination. Legged robots can move over rough and highly unstructured terrain and can traverse complex terrain such as trenches, gaps, steps or sand. However, the walking stability of a legged robot is a critical issue. To achieve highly stable walking in both static and dynamic states, terrain parameters such as roughness, friction coefficient and small geometric hazards must be taken into account. With real-time terrain recognition, a legged robot can optimize its gait pattern and leg posture control algorithm and thereby improve its stability.
In the invention, a terrain recognition method is designed for a legged robot, comprising an offline training part and an online classification part. The advantages of the technical scheme are: 1) terrain classification is based on analysis of joint pressure data, so it is not easily affected by external environmental interference and has strong environmental adaptability; 2) feature extraction is performed in the time domain, giving high computational efficiency; 3) the offline-collected samples are condensed into a representative sample set, reducing the computation time of the subsequent classification algorithm; 4) by analyzing historical terrain prediction data, misclassified samples can be identified and the classifier corrected online to improve its performance.
Disclosure of Invention
In order to solve the above problems, the invention provides a terrain recognition method comprising the following steps:
an off-line training part:
First step: mount 4 pressure sensors on the joint of one leg of the legged robot; a controller makes the robot walk on the terrains to be recognized while the pressure sensor signals are collected. With a sensor sampling frequency of N Hz, a time series r_i of each sensor's readings is obtained; the readings of the 4 pressure sensors are summed at each sampling point to obtain a combined time series r;
Second step: divide the time series r acquired in the first step into segments of duration T to obtain a data frame set f = {a_1, a_2, a_3, …, a_μ} containing μ data frames; each element of f is a data frame containing l = N·T data points, a_t = (a_t^1, a_t^2, …, a_t^l), where t = 1, 2, …, μ and a_t^i denotes the i-th data point of a_t, i = 1, 2, …, l;
Third step: perform feature extraction and normalization on the data frame set f obtained in the second step to obtain a sample set Σ containing μ samples; each sample S_t ∈ Σ, t = 1, 2, …, μ, is described by 6 features, so that S_t = (S_t^1, S_t^2, …, S_t^6) is a vector in a 6-dimensional sample space, where S_t^p, p = 1, 2, …, 6, is the p-th feature of S_t (the six time-domain feature formulas are given in the original publication only as images and are not reproduced here);
Fourth step: label the sample set Σ obtained in the third step to obtain a sample set Ω = {(S_1, Y_1), (S_2, Y_2), …, (S_μ, Y_μ)}, where Y_t ∈ C, t = 1, 2, …, μ, denotes the label of sample S_t, i.e. the true terrain; μ is the number of samples in Ω, and the terrain set C = {c_1, c_2, …, c_m}, where m is the number of terrains;
the fifth step, calculate the representative sample set
Extracting a representative sample set phi from the sample set omega obtained in the fourth step as follows:
5.1 initializing a representative sample set phi, and enabling phi to be an empty set; generating a sample set replica
Figure GDA00021876875600000215
5.2 from
Figure GDA00021876875600000210
Take out one sample E and remove it from
Figure GDA00021876875600000211
Deleting the sample;
5.3 generating a new representative sample subset R to be added to Φ, where R ═ E;
5.4 if
Figure GDA00021876875600000212
If the current is an empty set, jumping to the step 5.6; otherwise, from
Figure GDA00021876875600000213
Take out one sample E and remove it from
Figure GDA00021876875600000214
Deleting the sample; then calculate R+Satisfies rho (R)+,E)=min{ρ(Rj,E),RjE Φ, where j 1,2, …, m, m represents the number of terrains, R+Is rho (R)iE) R corresponding to the minimumiAnd ρ represents the sample of E and RiOf the sample center, RjHas a sample center of RjThe mean value of the samples corresponding to all the samples;
5.5 if the tag of E with R+If the labels are different, jumping to the step 5.3; otherwise, adding E to R+Then jump to step 5.4;
5.6 stopping the algorithm to obtain a representative sample set phi;
and an online classification part:
Sixth step: collect the k-th data frame a_k, where k denotes the online time point, k = 1, 2, 3, …;
Seventh step: extract features from the data frame obtained in the sixth step and normalize them to obtain a sample S_k;
Eighth step: predict the terrain for the sample S_k obtained in the seventh step with a K-nearest-neighbor model: first, using the Euclidean distance, find the K samples in the sample set Ω obtained in the fourth step that are nearest to S_k, forming the set N(S_k); then find the most frequent terrain in N(S_k), which is the k-th predicted terrain x_k; this yields the predicted terrain sequence X_k = {x_1, x_2, …, x_k};
Ninth step: classifier result correction
The sequence X_k = {x_1, x_2, …, x_k} obtained in the eighth step is corrected as follows:
x̂_k = argmax_{c_j ∈ C} Σ_{i = k-τ+1}^{k} II(x_i = c_j),
i.e. x̂_k is the most frequent prediction within the most recent window, where c_j ∈ C, II is an indicator function with II = 1 when x_i = c_j and II = 0 otherwise, and τ > 0 is the window length, a positive integer; a corrected terrain sequence X̂_k = {x̂_1, x̂_2, …, x̂_k} is obtained;
Tenth step: classifier modification
The corrected sequence X̂_k obtained in the ninth step is analyzed: if the corrected terrain x̂_k differs from the predicted terrain x_k, the sample E = (S_k, x̂_k) is used to correct the representative sample set Φ as follows:
compute R+ satisfying ρ(R+, E) = min{ρ(R_j, E), R_j ∈ Φ}; if ρ(R+, E) > σ or the label of E differs from the label of R+, generate a new representative sample subset R = {E} and add it to Φ; otherwise, add E to R+.
Compared with the prior art, the advantages of the invention are: 1) terrain classification is based on analysis of joint pressure data, so it is not easily affected by external environmental interference and has strong environmental adaptability; 2) feature extraction is performed in the time domain, giving high computational efficiency; 3) the offline-collected samples are condensed into a representative sample set, reducing the computation time of the subsequent classification algorithm; 4) by analyzing historical terrain prediction data, misclassified samples can be identified and the classifier corrected online to improve its performance.
Drawings
FIG. 1 is a schematic diagram of the placement of a pressure sensor according to the present invention
FIG. 2 is a schematic diagram of the distribution of pressure sensors in the present invention
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention will be described in detail with reference to the accompanying drawings and specific embodiments.
The invention is divided into an off-line training part and an on-line classification part, and the specific implementation steps are as follows:
an off-line training part:
First step: mount 4 pressure sensors on the joint of one leg of the legged robot, the specific mounting arrangement being shown in Figs. 1 and 2; control the robot to walk on the terrains to be recognized and collect the pressure sensor signals. With a sensor sampling frequency of N Hz, a time series r_i of each sensor's readings is obtained; the readings of the 4 pressure sensors are summed at each sampling point to obtain a combined time series r;
Second step: divide the time series r acquired in the first step into segments of duration T to obtain a data frame set f = {a_1, a_2, a_3, …, a_μ} containing μ data frames; each element of f is a data frame containing l = N·T data points, a_t = (a_t^1, a_t^2, …, a_t^l), where t = 1, 2, …, μ and a_t^i denotes the i-th data point of a_t, i = 1, 2, …, l;
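For illustration of the first and second steps, the following Python sketch sums the four joint-pressure channels and slices the combined series r into frames of duration T at sampling rate N Hz; the function and variable names and the (num_samples, 4) signal layout are assumptions for the example, not part of the patent.

    import numpy as np

    def build_frames(sensor_readings: np.ndarray, sample_rate_hz: float, frame_seconds: float) -> np.ndarray:
        # Sum the 4 pressure channels and split the combined series r into data frames.
        # sensor_readings: array of shape (num_samples, 4), one column per pressure sensor.
        # Returns an array of shape (mu, l), where l = N*T data points per frame.
        r = sensor_readings.sum(axis=1)            # combined time series r
        frame_len = int(round(sample_rate_hz * frame_seconds))   # l = N*T points per frame
        mu = len(r) // frame_len                   # complete frames; any trailing remainder is dropped
        return r[:mu * frame_len].reshape(mu, frame_len)

    # Example matching the validation below: 10 Hz sampling, 2 s frames -> 20 points per frame.
    # frames = build_frames(readings, sample_rate_hz=10, frame_seconds=2)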
Third step: perform feature extraction and normalization on the data frame set f obtained in the second step to obtain a sample set Σ containing μ samples; each sample S_t ∈ Σ, t = 1, 2, …, μ, is described by 6 features, so that S_t = (S_t^1, S_t^2, …, S_t^6) is a vector in a 6-dimensional sample space, where S_t^p, p = 1, 2, …, 6, is the p-th feature of S_t (the six time-domain feature formulas are given in the original publication only as images and are not reproduced here);
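The six feature formulas are available in this text only as images, so the sketch below substitutes six common time-domain statistics purely as placeholders (mean, standard deviation, maximum, minimum, RMS, and mean absolute first difference), followed by min-max normalization over the sample set; the patent's actual features may differ.

    import numpy as np

    def extract_features(frame: np.ndarray) -> np.ndarray:
        # Map one data frame a_t to a 6-dimensional feature vector S_t.
        # The six statistics are stand-ins; the patent's own formulas are not reproduced here.
        return np.array([
            frame.mean(),                     # placeholder feature 1
            frame.std(),                      # placeholder feature 2
            frame.max(),                      # placeholder feature 3
            frame.min(),                      # placeholder feature 4
            np.sqrt(np.mean(frame ** 2)),     # placeholder feature 5 (RMS)
            np.mean(np.abs(np.diff(frame))),  # placeholder feature 6
        ])

    def build_sample_set(frames: np.ndarray) -> np.ndarray:
        # Extract features for every frame and min-max normalize each feature to [0, 1].
        sigma = np.vstack([extract_features(f) for f in frames])
        lo, hi = sigma.min(axis=0), sigma.max(axis=0)
        return (sigma - lo) / np.where(hi - lo == 0, 1.0, hi - lo)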
Fourth step: label the sample set Σ obtained in the third step to obtain a sample set Ω = {(S_1, Y_1), (S_2, Y_2), …, (S_μ, Y_μ)}, where Y_t ∈ C, t = 1, 2, …, μ, denotes the label of sample S_t, i.e. the true terrain; μ is the number of samples in Ω, and the terrain set C = {c_1, c_2, …, c_m}, where m is the number of terrains;
Fifth step: compute the representative sample set
A representative sample set Φ is extracted from the sample set Ω obtained in the fourth step as follows:
5.1 Initialize the representative sample set Φ as an empty set; generate a working copy Ω′ of the sample set Ω;
5.2 Take one sample E from Ω′ and delete it from Ω′;
5.3 Generate a new representative sample subset R = {E} and add it to Φ;
5.4 If Ω′ is an empty set, jump to step 5.6; otherwise, take one sample E from Ω′ and delete it from Ω′; then compute R+ satisfying ρ(R+, E) = min{ρ(R_j, E), R_j ∈ Φ}, where j = 1, 2, …, m and m is the number of terrains; that is, R+ is the R_j for which ρ(R_j, E) is minimal, ρ(R_j, E) is the distance between E and the sample center of R_j, and the sample center of R_j is the mean of all samples in R_j;
5.5 If the label of E differs from the label of R+, jump to step 5.3; otherwise, add E to R+ and jump to step 5.4;
5.6 Stop the algorithm; Φ is the representative sample set;
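A minimal sketch of the fifth step follows, assuming that the distance ρ is the Euclidean distance from a sample to a subset's sample center (the text only says "distance to the sample center"); each representative subset is kept as a dict holding its members, common label, and running center.

    import numpy as np

    def condense(samples: np.ndarray, labels: list) -> list:
        # Steps 5.1-5.6: build the representative sample set Phi from the labeled set Omega.
        phi = []                            # 5.1: Phi starts empty
        omega = list(zip(samples, labels))  # 5.1: working copy of Omega
        E, y = omega.pop(0)                 # 5.2: take one sample E
        phi.append({"members": [E], "label": y, "center": np.asarray(E, dtype=float)})  # 5.3
        while omega:                        # 5.4: repeat until the copy is empty
            E, y = omega.pop(0)
            # R+ is the subset in Phi whose sample center is closest to E.
            r_plus = min(phi, key=lambda R: np.linalg.norm(R["center"] - E))
            if r_plus["label"] != y:        # 5.5: label mismatch -> open a new subset (back to 5.3)
                phi.append({"members": [E], "label": y, "center": np.asarray(E, dtype=float)})
            else:                           # 5.5: same label -> absorb E into R+ and update its center
                r_plus["members"].append(E)
                r_plus["center"] = np.mean(r_plus["members"], axis=0)
        return phi                          # 5.6: Phi is the representative sample set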
and an online classification part:
Sixth step: collect the k-th data frame a_k, where k denotes the online time point, k = 1, 2, 3, …;
Seventh step: extract features from the data frame obtained in the sixth step and normalize them to obtain a sample S_k;
Eighth step: predict the terrain for the sample S_k obtained in the seventh step with a K-nearest-neighbor model: first, using the Euclidean distance, find the K samples in the sample set Ω obtained in the fourth step that are nearest to S_k, forming the set N(S_k); then find the most frequent terrain in N(S_k), which is the k-th predicted terrain x_k; this yields the predicted terrain sequence X_k = {x_1, x_2, …, x_k};
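A sketch of the eighth step: K-nearest-neighbor prediction with the Euclidean distance and a majority vote over the K nearest labeled samples; the value of K and all names are illustrative.

    import numpy as np
    from collections import Counter

    def knn_predict(s_k: np.ndarray, train_samples: np.ndarray, train_labels: list, K: int = 5):
        # Predict the terrain x_k for the online sample S_k from the labeled sample set.
        dists = np.linalg.norm(train_samples - s_k, axis=1)   # Euclidean distances to S_k
        nearest = np.argsort(dists)[:K]                       # indices of N(S_k)
        # The most frequent terrain among the K nearest samples is the k-th prediction.
        return Counter(train_labels[i] for i in nearest).most_common(1)[0][0]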
Ninth step: classifier result correction
The sequence X_k = {x_1, x_2, …, x_k} obtained in the eighth step is corrected as follows:
x̂_k = argmax_{c_j ∈ C} Σ_{i = k-τ+1}^{k} II(x_i = c_j),
i.e. x̂_k is the most frequent prediction within the most recent window, where c_j ∈ C, II is an indicator function with II = 1 when x_i = c_j and II = 0 otherwise, and τ > 0 is the window length, a positive integer; a corrected terrain sequence X̂_k = {x̂_1, x̂_2, …, x̂_k} is obtained;
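Read as a sliding-window majority vote, the ninth-step correction can be sketched as below; the exact formula appears only as an image in the source text, so the window form is an interpretation based on the indicator function II and the window length τ.

    from collections import Counter

    def correct_prediction(history: list, tau: int):
        # Corrected terrain: the most frequent label among the last tau raw predictions.
        window = history[-tau:]  # last tau predictions (fewer while k < tau)
        return Counter(window).most_common(1)[0][0]

    # corrected = [correct_prediction(raw[:k + 1], tau=5) for k in range(len(raw))]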
Tenth step: classifier modification
The corrected sequence X̂_k obtained in the ninth step is analyzed: if the corrected terrain x̂_k differs from the predicted terrain x_k, the sample E = (S_k, x̂_k) is used to correct the representative sample set Φ as follows:
compute R+ satisfying ρ(R+, E) = min{ρ(R_j, E), R_j ∈ Φ}; if ρ(R+, E) > σ or the label of E differs from the label of R+, generate a new representative sample subset R = {E} and add it to Φ; otherwise, add E to R+.
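A sketch of the tenth step under the same assumptions as the condensation sketch above; the trigger (the corrected terrain differing from the raw prediction) and the use of (S_k, corrected label) as the correcting sample are inferred from the abstract, and sigma is the distance threshold named in the text.

    import numpy as np

    def update_phi(phi: list, s_k: np.ndarray, x_k, x_k_hat, sigma: float) -> None:
        # When the corrected terrain disagrees with the raw prediction, fold the
        # sample (S_k, x_k_hat) back into the representative sample set Phi.
        if x_k_hat == x_k:
            return  # no disagreement, nothing to correct (inferred trigger)
        r_plus = min(phi, key=lambda R: np.linalg.norm(R["center"] - s_k))
        too_far = np.linalg.norm(r_plus["center"] - s_k) > sigma
        if too_far or r_plus["label"] != x_k_hat:
            phi.append({"members": [s_k], "label": x_k_hat, "center": np.asarray(s_k, dtype=float)})
        else:
            r_plus["members"].append(s_k)
            r_plus["center"] = np.mean(r_plus["members"], axis=0)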
To validate the invention, a legged robot was run over each of 6 common terrains and the pressure sensor signals at the joints were collected, with about 10 minutes of data recorded for each terrain. At a sampling rate of 10 Hz, the sequence length per terrain is 6000 sample points. The data were segmented into 2-second frames, giving a total of 1800 samples, each sample being a data frame containing 20 pressure data points. The data were processed with MATLAB on a desktop computer, a classifier model was obtained, and it was cross-validated with test sets. The error rate before correction was 28.9% and the error rate after correction was 18.3%, showing that the invention is effective.
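The before/after comparison reported above can be checked with an error-rate computation of this kind, reusing the sketch functions defined earlier (illustrative only; the train/test split, the six terrains, and the window length are not specified beyond what the text states).

    def error_rate(predicted: list, truth: list) -> float:
        # Fraction of frames whose predicted terrain differs from the true terrain.
        wrong = sum(p != t for p, t in zip(predicted, truth))
        return wrong / len(truth)

    # raw = [knn_predict(s, train_X, train_y) for s in test_X]
    # corrected = [correct_prediction(raw[:k + 1], tau=5) for k in range(len(raw))]
    # print(error_rate(raw, test_y), error_rate(corrected, test_y))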

Claims (1)

1. A terrain recognition method is characterized by comprising the following steps:
an off-line training part:
firstly, mounting 4 pressure sensors on the joint of one leg of the legged robot, controlling the robot to walk on the terrains to be recognized, and collecting the pressure sensor signals, wherein the sampling frequency of the sensors is N Hz; obtaining a time series r_i of each sensor's readings and summing the readings of the 4 pressure sensors at each sampling point to obtain a combined time series r;
secondly, dividing the time series r acquired in the first step into segments of duration T to obtain a data frame set f = {a_1, a_2, a_3, …, a_μ} containing μ data frames, each element of f being a data frame containing l = N·T data points, a_t = (a_t^1, a_t^2, …, a_t^l), where t = 1, 2, …, μ and a_t^i denotes the i-th data point of a_t, i = 1, 2, …, l;
thirdly, performing feature extraction and normalization on the data frame set f obtained in the second step to obtain a sample set Σ containing μ samples, each sample S_t ∈ Σ, t = 1, 2, …, μ, being described by 6 features, so that S_t = (S_t^1, S_t^2, …, S_t^6) is a vector in a 6-dimensional sample space, where S_t^p, p = 1, 2, …, 6, is the p-th feature of S_t, the six time-domain feature formulas being given in the original publication only as images;
fourthly, labeling the sample set Σ obtained in the third step to obtain a sample set Ω = {(S_1, Y_1), (S_2, Y_2), …, (S_μ, Y_μ)}, where Y_t ∈ C, t = 1, 2, …, μ, denotes the label of sample S_t, i.e. the true terrain, μ is the number of samples in Ω, and the terrain set C = {c_1, c_2, …, c_m}, where m is the number of terrains;
fifthly, computing the representative sample set: extracting a representative sample set Φ from the sample set Ω obtained in the fourth step as follows:
5.1 initializing the representative sample set Φ as an empty set, and generating a working copy Ω′ of the sample set Ω;
5.2 taking one sample E from Ω′ and deleting it from Ω′;
5.3 generating a new representative sample subset R = {E} and adding it to Φ;
5.4 if Ω′ is an empty set, jumping to step 5.6; otherwise, taking one sample E from Ω′ and deleting it from Ω′; then computing R+ satisfying ρ(R+, E) = min{ρ(R_j, E), R_j ∈ Φ}, where j = 1, 2, …, m and m represents the number of terrains, i.e. R+ is the R_j for which ρ(R_j, E) is minimal, ρ(R_j, E) being the distance between E and the sample center of R_j, and the sample center of R_j being the mean of all samples in R_j;
5.5 if the label of E differs from the label of R+, jumping to step 5.3; otherwise, adding E to R+ and jumping to step 5.4;
5.6 stopping the algorithm to obtain the representative sample set Φ;
and an online classification part:
sixthly, collecting the k-th data frame a_k, where k denotes the online time point, k = 1, 2, 3, …;
seventhly, extracting features from the data frame obtained in the sixth step and normalizing them to obtain a sample S_k;
eighthly, predicting the terrain for the sample S_k obtained in the seventh step with a K-nearest-neighbor model: first, using the Euclidean distance, finding the K samples in the sample set Ω obtained in the fourth step that are nearest to S_k to form the set N(S_k); then finding the most frequent terrain in N(S_k), which is the k-th predicted terrain x_k, thereby obtaining the predicted terrain sequence X_k = {x_1, x_2, …, x_k};
ninthly, classifier result correction: correcting the sequence X_k = {x_1, x_2, …, x_k} obtained in the eighth step as follows:
x̂_k = argmax_{c_j ∈ C} Σ_{i = k-τ+1}^{k} II(x_i = c_j),
where c_j ∈ C, II is an indicator function with II = 1 when x_i = c_j and II = 0 otherwise, and τ > 0 is the window length, a positive integer; a corrected terrain sequence X̂_k = {x̂_1, x̂_2, …, x̂_k} is thereby obtained;
tenthly, classifier modification: analyzing the corrected sequence X̂_k obtained in the ninth step; if the corrected terrain x̂_k differs from the predicted terrain x_k, using the sample E = (S_k, x̂_k) to correct the representative sample set Φ as follows: computing R+ satisfying ρ(R+, E) = min{ρ(R_j, E), R_j ∈ Φ}; if ρ(R+, E) > σ or the label of E differs from the label of R+, generating a new representative sample subset R = {E} and adding it to Φ; otherwise, adding E to R+.
CN201810599159.8A 2018-06-12 2018-06-12 Terrain recognition method Active CN108804824B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810599159.8A CN108804824B (en) 2018-06-12 2018-06-12 Terrain recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810599159.8A CN108804824B (en) 2018-06-12 2018-06-12 Terrain recognition method

Publications (2)

Publication Number Publication Date
CN108804824A CN108804824A (en) 2018-11-13
CN108804824B (en) 2020-04-24

Family

ID=64085150

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810599159.8A Active CN108804824B (en) 2018-06-12 2018-06-12 Terrain recognition method

Country Status (1)

Country Link
CN (1) CN108804824B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114474053B (en) * 2021-12-30 2023-01-17 暨南大学 Robot terrain recognition and speed control method and system

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7370969B2 (en) * 2004-03-31 2008-05-13 Nidek Co., Ltd. Corneal topography analysis system
CN101701818B (en) * 2009-11-05 2011-03-30 上海交通大学 Method for detecting long-distance barrier
CN101813475B (en) * 2010-04-24 2011-06-01 上海交通大学 Method for adaptively detecting remote obstacle
JP2016197093A (en) * 2015-01-04 2016-11-24 高橋 正人 Direction information acquisition device, direction information acquisition program and direction information acquisition method
WO2018089543A1 (en) * 2016-11-08 2018-05-17 Massachusetts Institute Of Technology Kinetic sensing, signal generation, feature extraction, and pattern recognition for control of autonomous wearable leg devices

Also Published As

Publication number Publication date
CN108804824A (en) 2018-11-13

Similar Documents

Publication Publication Date Title
CN112629863B (en) Bearing fault diagnosis method for dynamic joint distribution alignment network under variable working conditions
CN107145645B (en) Method for predicting residual life of non-stationary degradation process with uncertain impact
Santos et al. An evaluation of 2D SLAM techniques available in robot operating system
KR20160031246A (en) Method and apparatus for gait task recognition
CN112488073A (en) Target detection method, system, device and storage medium
CN109977895B (en) Wild animal video target detection method based on multi-feature map fusion
WO2006099597A3 (en) Pose estimation based on critical point analysis
CN110689535B (en) Workpiece identification method and device, electronic equipment and storage medium
CN109829136B (en) Method and system for predicting residual life of degradation equipment with random jump
CN108284444B (en) Multi-mode human body action prediction method based on Tc-ProMps algorithm under man-machine cooperation
CN108804824B (en) Terrain recognition method
CN110189362B (en) Efficient target tracking method based on multi-branch self-coding countermeasure network
CN107657627B (en) Space-time context target tracking method based on human brain memory mechanism
Verma et al. Neural speed–torque estimator for induction motors in the presence of measurement noise
WO2022004773A1 (en) Model generation device, regression device, model generation method, and model generation program
US20140324743A1 (en) Autoregressive model for time-series data
WO2011085819A1 (en) A machine-learning system and a method for determining different operating points in such a system
Sójka et al. Learning an Efficient Terrain Representation for Haptic Localization of a Legged Robot
CN110728327B (en) Interpretable direct-push learning method and system
JP4102318B2 (en) Tool motion recognition device and tool motion recognition method
Ulapane et al. System Identification of Static Nonlinear Elements: A Unified Approach of Active Learning, Over-fit Avoidance, and Model Structure Determination
Asano et al. Risk evaluation of ground surface using multichannel foot sensors for biped robots
Concon et al. Deep Learning for Terrain Surface Classification: Vibration-based Approach.
KR100921495B1 (en) Method for updating gmm using particle filtering
Chen et al. Learning to identify footholds from geometric characteristics for a six-legged robot over rugged terrain

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
TA01 Transfer of patent application right

Effective date of registration: 20200330

Address after: 516400 Yingke robot industrial park, ecological technology city, Haifeng County, Shanwei City, Guangdong Province

Applicant after: Guangdong Yingke robot industry Co., Ltd

Address before: 230601 Anhui Hefei economic and Technological Development Zone, Shizhu road 339, Venus commercial city two phase 2005

Applicant before: ANHUI WEIAOMAN ROBOT Co.,Ltd.

GR01 Patent grant
TR01 Transfer of patent right

Effective date of registration: 20210305

Address after: 516400 Yingjing Zhihui Park, phase IV, ecological science and Technology City, Chengdong Town, Haifeng County, Shanwei City, Guangdong Province

Patentee after: Guangdong Yingjing Innovation Technology Co.,Ltd.

Address before: 516400 Yingke robot industrial park, Haifeng County, Shanwei City, Guangdong Province

Patentee before: Guangdong Yingke robot industry Co.,Ltd.