CN111203873A - Mobile robot control method and mobile robot - Google Patents

Mobile robot control method and mobile robot

Info

Publication number
CN111203873A
Authority
CN
China
Prior art keywords
mobile robot
sensing data
infrared
ultrasonic
sensors
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN201911357849.3A
Other languages
Chinese (zh)
Other versions
CN111203873B (en)
Inventor
胡泽田
谭世恒
张强
何东岭
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shenzhen Shenlan Vision Technology Co Ltd
Original Assignee
Shenzhen Shenlan Vision Technology Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shenzhen Shenlan Vision Technology Co Ltd filed Critical Shenzhen Shenlan Vision Technology Co Ltd
Priority to CN201911357849.3A
Publication of CN111203873A
Application granted
Publication of CN111203873B
Current legal status: Active
Anticipated expiration

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1656 Programme controls characterised by programming, planning systems for manipulators
    • B25J9/1664 Programme controls characterised by programming, planning systems for manipulators characterised by motion, path, trajectory planning
    • B25J9/1666 Avoiding collision or forbidden zones
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J13/00 Controls for manipulators
    • B25J13/08 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices
    • B25J13/088 Controls for manipulators by means of sensing devices, e.g. viewing or touching devices with position, velocity or acceleration sensors
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1674 Programme controls characterised by safety, monitoring, diagnostic
    • B25J9/1676 Avoiding collision or forbidden zones

Abstract

The application provides a mobile robot control method, which includes the following steps: acquiring first infrared sensing data in a preset time period before the current moment through N infrared sensors on the mobile robot, where the N infrared sensors correspond to M directions of the mobile robot; acquiring first ultrasonic sensing data in the preset time period through L ultrasonic sensors on the mobile robot, where the L ultrasonic sensors correspond to K directions among the M directions; preprocessing the first infrared sensing data and the first ultrasonic sensing data respectively to obtain M obstacle distances, where the M obstacle distances correspond to the M directions of the mobile robot; determining the angular velocity and the linear velocity of the mobile robot according to the M obstacle distances; and controlling the mobile robot to move according to the angular velocity and the linear velocity.

Description

Mobile robot control method and mobile robot
Technical Field
The present application relates to the field of mobile robot technology, and in particular, to a mobile robot control method, a mobile robot, and a computer-readable storage medium.
Background
With the development of technology, various mobile robots, such as floor-sweeping robots, have been adopted, greatly improving the efficiency of production and daily life and saving labor. However, while moving, a mobile robot often needs to detect various obstacles in the environment; for example, a sweeping robot needs to detect walls, furniture, and the like, and move according to the detected obstacles. Because obstacles vary in material, color, shape, and so on, the mobile robot's detection accuracy is often low and its detection effect poor, and the robot frequently collides with obstacles while moving.
Disclosure of Invention
The embodiments of the present application provide a mobile robot control method, a mobile robot, and a computer-readable storage medium, which can improve the accuracy with which a mobile robot detects obstacles.
In a first aspect, an embodiment of the present application provides a mobile robot control method, including:
acquiring first infrared sensing data in a preset time period before the current time by N infrared sensors on the mobile robot, wherein the N infrared sensors correspond to M directions of the mobile robot, N and M are integers greater than 0, and N is not less than M;
acquiring first ultrasonic sensing data in the preset time period through L ultrasonic sensors on the mobile robot, wherein the L ultrasonic sensors correspond to K directions in the M directions, and K and L are integers greater than 0;
preprocessing the first infrared sensing data and the first ultrasonic sensing data respectively to obtain M obstacle distances, wherein the M obstacle distances correspond to M directions of the mobile robot;
determining the angular velocity and the linear velocity of the mobile robot according to the M obstacle distances;
and controlling the mobile robot to move according to the angular velocity and the linear velocity.
In a second aspect, an embodiment of the present application provides a mobile robot, including N infrared sensors, L ultrasonic sensors, and a control circuit, where the control circuit is connected to the N infrared sensors and the L ultrasonic sensors respectively;
the control circuit comprises a memory, a processor and a computer program stored in the memory and executable on the processor, the processor implementing the mobile robot control method according to the first aspect when executing the computer program.
In a third aspect, the present application provides a computer-readable storage medium, which stores a computer program, and when the computer program is executed by a processor, the computer program implements the mobile robot control method according to the first aspect.
In a fourth aspect, the present application provides a computer program product, when running on a mobile robot, for causing the mobile robot to execute the mobile robot control method described in the first aspect.
Compared with the prior art, the embodiments of the present application have the following advantages: the first infrared sensing data in the preset time period before the current moment are acquired through the N infrared sensors on the mobile robot, and the first ultrasonic sensing data in the same period are acquired through the L ultrasonic sensors on the mobile robot, so that the infrared sensors and the ultrasonic sensors can be combined to detect obstacles in the environment; the first infrared sensing data and the first ultrasonic sensing data are then preprocessed respectively to obtain the M obstacle distances. This reduces detection errors caused by the material, color, shape, and so on of various obstacles and improves the accuracy of obstacle detection.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present application, the drawings needed in the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below show only some embodiments of the present application; those skilled in the art can obtain other drawings from them without creative effort.
Fig. 1 is a schematic flowchart of a mobile robot control method according to an embodiment of the present application;
fig. 2 is a schematic flowchart of step S104 according to an embodiment of the present application;
fig. 3 is a schematic flowchart of step S103 according to an embodiment of the present application;
fig. 4 is a schematic structural diagram of a mobile robot according to an embodiment of the present application;
fig. 5 is a schematic structural diagram of a mobile robot according to an embodiment of the present application;
fig. 6 is a block diagram of a mobile robot according to an embodiment of the present application.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It should also be understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in the specification of this application and the appended claims, the term "if" may be interpreted contextually as "when" or "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [described condition or event] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [the described condition or event]" or "in response to detecting [the described condition or event]".
Reference throughout this specification to "one embodiment" or "some embodiments," or the like, means that a particular feature, structure, or characteristic described in connection with the embodiment is included in one or more embodiments of the present application. Thus, appearances of the phrases "in one embodiment," "in some embodiments," "in other embodiments," or the like, in various places throughout this specification are not necessarily all referring to the same embodiment, but rather "one or more but not all embodiments" unless specifically stated otherwise. The terms "comprising," "including," "having," and variations thereof mean "including, but not limited to," unless expressly specified otherwise.
The mobile robot control method provided in the embodiments of the present application may be applied to various types of mobile robots, such as sweeping robots; the specific type of the mobile robot is not limited in any way in the embodiments of the present application.
Specifically, fig. 1 shows a flowchart of a first mobile robot control method provided in an embodiment of the present application, and the mobile robot control method is applied to a mobile robot.
In the embodiment of the present application, the mobile robot may have at least N infrared sensors and at least L ultrasonic sensors, and in some cases, the mobile robot may further include a control circuit, and the control circuit may be connected to each of the infrared sensors and the ultrasonic sensors, respectively, to perform information transmission. In the embodiment of the present application, specific hardware, software, firmware, and functions of the mobile robot are not limited herein.
In the prior art, infrared sensors are often used to detect obstacles. However, the material (e.g., glass), color (e.g., black or transparent), or shape (e.g., a mesh structure) of an obstacle, as well as environmental factors (e.g., intense sunlight), may reduce the reflectivity of the infrared signal or even cause the signal to be lost entirely, so that obstacle detection fails. Compared with an infrared sensor, an ultrasonic sensor does not fail to detect an obstacle because a transparent or black object reflects the signal poorly. However, an ultrasonic sensor is susceptible to environmental noise: a motor may emit high-frequency noise while rotating, wheels may generate high-frequency noise through friction on hard ground, the robot itself may shake, and, when multiple robots are present, sound waves emitted by the ultrasonic sensors of other robots may cause the sensor to receive erroneous ultrasonic signals, resulting in misjudgment.
On this basis, the embodiments of the present application provide a method that effectively combines infrared sensors and ultrasonic sensors to detect obstacles, so as to improve the accuracy with which the mobile robot detects obstacles.
As shown in fig. 1, the mobile robot control method includes:
step S101, acquiring first infrared sensing data in a preset time period before the current time through N infrared sensors on the mobile robot, wherein the N infrared sensors correspond to M directions of the mobile robot, N and M are integers greater than 0, and N is not less than M.
In the embodiment of the present application, any one of the orientations may correspond to an angle or an angle range. In the M directions, the number of the infrared sensors corresponding to any one direction may be determined according to an actual application scenario, and the infrared sensors corresponding to the same direction may be arranged in various ways. For example, in the M orientations, the infrared sensors corresponding to any one of the orientations may be distributed in at least two height positions, and may be distributed equidistantly in an angle region corresponding to the orientation.
Step S102, obtaining first ultrasonic sensing data in the preset time period through L ultrasonic sensors on the mobile robot, wherein the L ultrasonic sensors correspond to K directions in the M directions, and K and L are integers larger than 0.
In this embodiment, the K orientations corresponding to the L ultrasonic sensors are some or all of the M orientations corresponding to the infrared sensors. In addition, the number and specific arrangement mode of the ultrasonic sensors corresponding to any one of the K directions can also be determined according to an actual scene.
Step S103, preprocessing the first infrared sensing data and the first ultrasonic sensing data respectively to obtain M obstacle distances, wherein the M obstacle distances correspond to M directions of the mobile robot.
In this embodiment of the application, the preprocessing may include one or more processing manners. For example, the first infrared sensing data and the first ultrasonic sensing data may be filtered in a preset filtering manner; the filtering manner may be, for example, median filtering, mean filtering, or averaging after rejecting abnormal data. The preprocessing may further include dividing the first infrared sensing data and the first ultrasonic sensing data according to the M directions to obtain the part of the first infrared sensing data and/or the part of the first ultrasonic sensing data corresponding to each direction, and then performing data screening and/or data fusion on that part to obtain the M obstacle distances, where the M obstacle distances may correspond one-to-one to the M directions.
Step S104, determining the angular velocity and the linear velocity of the mobile robot according to the M obstacle distances.
In the embodiment of the present application, there may be a plurality of methods for determining the angular velocity and the linear velocity of the mobile robot. For example, the angular velocity and the linear velocity of the mobile robot may be determined by a Dynamic Window Approach (DWA), a fuzzy control algorithm, or the like.
In the prior art, the mobile robot may be a wheeled robot moved by means of differential driving. When the moving speed of each wheel is determined directly by a method such as a fuzzy control algorithm, much time is needed to test whether the algorithm is correct and reasonable in actual control, because the wheel speeds are coupled to one another. In some embodiments of the present application, when the angular velocity and the linear velocity of the mobile robot are determined from the M obstacle distances by a fuzzy control algorithm or the like, the angular velocity and the linear velocity are independent of each other, so formulating fuzzy rules with the angular velocity and the linear velocity as the fuzzy output quantities is simpler and more intuitive; a great deal of rule-testing time can be saved and a good result obtained quickly.
Step S105, controlling the mobile robot to move according to the angular velocity and the linear velocity.
In the embodiment of the present application, the mobile robot may move in various manners; for example, the mobile robot may be a wheeled robot, a tracked robot, a legged robot (one-legged, two-legged, or multi-legged), and the like. For different mobile robot configurations, the movement control of the mobile robot can be determined from the angular velocity and the linear velocity.
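Purely as an illustration of how steps S101 to S105 fit together, the following is a minimal sketch of a sense-decide-act loop. The robot object and all of its methods (is_running, read_infrared, read_ultrasonic, preprocess, fuzzy_controller, drive) are hypothetical placeholders, not an interface defined by this application:

```python
import time

def control_loop(robot, period_s=0.1):
    # Repeatedly sense, preprocess, decide, and actuate.
    while robot.is_running():
        # S101/S102: sensing data from the preset time period before now
        ir_data = robot.read_infrared(window_s=period_s)    # N infrared sensors
        us_data = robot.read_ultrasonic(window_s=period_s)  # L ultrasonic sensors

        # S103: filter, group by orientation, and screen to get M distances
        obstacle_distances = robot.preprocess(ir_data, us_data)

        # S104: map the M obstacle distances to angular and linear velocities
        v, omega = robot.fuzzy_controller(obstacle_distances)

        # S105: control the movement with the resulting velocities
        robot.drive(v, omega)
        time.sleep(period_s)
```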
In some embodiments, the controlling the mobile robot to move according to the angular velocity and the linear velocity may include:
determining the left wheel speed and the right wheel speed of the mobile robot according to the angular speed and the linear speed;
and controlling the mobile robot to move according to the left wheel speed and the right wheel speed.
Specifically, the left wheel speed of the mobile robot may be:
vL = v - (ω·d)/2
and the right wheel speed of the mobile robot may be:
vR = v + (ω·d)/2
wherein vL is the left wheel speed of the mobile robot, vR is the right wheel speed of the mobile robot, v is the linear velocity of the mobile robot, ω is the angular velocity of the mobile robot, and d is the distance between the center point of the left wheel and the center point of the right wheel.
At this time, the left wheel speed vL and the right wheel speed vR may be used to control the movement of the mobile robot, which may be a two-wheel differential-drive mobile robot.
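A minimal sketch of this conversion, assuming the common convention that a positive angular velocity denotes a counterclockwise (leftward) turn, so the right wheel runs faster:

```python
def wheel_speeds(v: float, omega: float, d: float) -> tuple[float, float]:
    """Convert linear velocity v (m/s) and angular velocity omega (rad/s)
    into (left, right) wheel speeds for a two-wheel differential drive;
    d is the distance between the wheel center points (m)."""
    v_left = v - omega * d / 2.0
    v_right = v + omega * d / 2.0
    return v_left, v_right

# Example: v = 0.3 m/s and omega = 0.5 rad/s with d = 0.25 m
# gives v_left = 0.2375 m/s and v_right = 0.3625 m/s.
```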
As shown in fig. 2, in some embodiments, the step S104 specifically includes:
step S201, obtaining membership degrees corresponding to all fuzzy rules respectively according to the M obstacle distances, a first membership function related to the obstacle distances and a preset fuzzy rule table, wherein the fuzzy rule table comprises all fuzzy rules;
step S202, acquiring first values of the linear velocity relative to each fuzzy rule respectively;
step S203, acquiring second values of the angular velocity relative to each fuzzy rule respectively;
and S204, determining the angular velocity and the linear velocity of the mobile robot according to the membership degree, the first values and the second values respectively corresponding to the fuzzy rules.
In the embodiment of the present application, the fuzzy rule table may merge the corresponding exact values into fuzzy subsets according to specific membership functions, and the subsets may be described by linguistic variables.
The fuzzy rule table may be pre-established. The fuzzy rule table may include fuzzy rules, where each fuzzy rule may be expressed in if-then form; that is, any fuzzy rule may have the structure:
If A is true, AND B is true, AND C is true, then D.
With A, B, and C connected by AND as the condition, the result D is obtained when all of the conditions A, B, and C are satisfied.
Illustratively, the fuzzy language of each obstacle distance is described as {far (F), medium (M), near (N)}. Generally, the range of obstacle-distance values covered by this fuzzy language description may be set to the measurement range of the relevant sensor of the mobile robot, for example, in the embodiment of the present application, the measurement range of the infrared sensors and/or the ultrasonic sensors. On this basis, the fuzzy rule table may be established.
According to the fuzzy rule table and the determined M obstacle distances, the membership degree corresponding to each fuzzy rule can then be obtained.
In some embodiments, the step S201 specifically includes:
for each fuzzy rule, respectively determining a first sub-membership degree of each obstacle distance in the M obstacle distances relative to the fuzzy rule;
and taking the minimum value in each first sub-membership degree as the membership degree corresponding to the fuzzy rule.
In the embodiment of the present application, the first value of the linear velocity with respect to each fuzzy rule can be determined according to the value range of the linear velocity, the fuzzy language corresponding to the linear velocity, and the like. Similarly, the second value of the angular velocity with respect to each fuzzy rule may be determined according to the value range of the angular velocity, the fuzzy language corresponding to the angular velocity, and the like. Illustratively, the universe of discourse of the linear velocity v may be [-vmax, vmax]. For example, the discourse domain of v can be divided into a number of parts matching the number of fuzzy language components of v, and the value corresponding to each fuzzy language component of v can be determined from the maximum value, the minimum value, or the average value of the corresponding part, and the like.
It should be noted that there may be other setting manners for the setting manner of the linear velocity with respect to the first value of each fuzzy rule, and the determination manner of the second value may be similar to or different from the determination manner of the first value. The examples herein are intended to be illustrative only and not limiting.
At this time, each input variable (i.e., each of the M obstacle distances) can be fuzzified, and a fuzzy output can be obtained through the corresponding fuzzy rules, namely the membership degree corresponding to each fuzzy rule together with each first value and each second value.
After the fuzzy output is obtained, defuzzification may be performed. At this time, the angular velocity and the linear velocity of the mobile robot may further be determined according to the membership degree corresponding to each fuzzy rule, each of the first values, and each of the second values.
The defuzzification can be performed according to one or more of a maximum membership function method, a weighted average method, a gravity center method and the like, so as to obtain the angular velocity and the linear velocity of the mobile robot.
In some embodiments, the determining the angular velocity and the linear velocity of the mobile robot according to the membership degree, each of the first values, and each of the second values respectively corresponding to each of the fuzzy rules includes:
determining the linear velocity of the mobile robot according to a first formula, wherein the first formula is as follows:
v = Σk(qk·vk) / Σk(qk)
wherein v is the linear velocity of the mobile robot, and vk is the first value of the linear velocity relative to the k-th fuzzy rule;
determining the angular velocity of the mobile robot according to a second formula, the second formula being:
ω = Σk(qk·ωk) / Σk(qk)
wherein ω is the angular velocity of the mobile robot, and ωk is the second value of the angular velocity relative to the k-th fuzzy rule;
in the first formula and the second formula, qk is the membership degree corresponding to the k-th fuzzy rule, H is the number of the fuzzy rules, and the sums are taken over k = 1, 2, …, H.
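A minimal sketch of this weighted-average defuzzification, implementing the first and second formulas directly; the guard for the case where no rule fires (all qk equal to 0) is an added assumption:

```python
def defuzzify(q, v_firsts, w_seconds):
    """Center-of-gravity defuzzification over H fuzzy rules.
    q:         membership degree q_k of each rule
    v_firsts:  first value v_k of the linear velocity for each rule
    w_seconds: second value omega_k of the angular velocity for each rule
    Returns the crisp (linear velocity, angular velocity)."""
    total = sum(q)
    if total == 0.0:
        return 0.0, 0.0  # no rule fired; stopping is a conservative default
    v = sum(qk * vk for qk, vk in zip(q, v_firsts)) / total
    omega = sum(qk * wk for qk, wk in zip(q, w_seconds)) / total
    return v, omega
```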
Through the embodiments of the present application, after the fuzzy output is obtained through the fuzzy rules, defuzzification yields the exact output quantities, that is, the exact linear velocity and angular velocity, for subsequently controlling the mobile robot. When the angular velocity and the linear velocity of the mobile robot are determined by a fuzzy control algorithm or the like, they are used directly as the fuzzy outputs instead of the individual wheel speeds, which makes the fuzzy rules simpler and more intuitive to formulate, saves a great deal of rule-testing time, and allows effective fuzzy rules to be produced quickly.
A specific way of determining the angular and linear velocities of the mobile robot is described below as a specific example.
Suppose the obstacle distances dL, dF, and dR corresponding to the left, front, and right directions of the mobile robot are obtained, and the discourse domain of each obstacle distance dL, dF, dR is [0, T]. The fuzzy language description of each obstacle distance is {far (F), medium (M), near (N)}. Generally, the value of T may be set to the maximum effective measurement distance of the sensor, or to another value within the maximum effective measurement range.
At this time, the fuzzy rule table may be established in advance as follows:
Table 1: Fuzzy rule table. The 27 entries map each combination of the fuzzy values {N, M, F} of dL, dF, and dR to a fuzzy value of the linear velocity and a fuzzy value of the angular velocity; the entries themselves are given as images in the original publication.
In the fuzzy rule table, 27 fuzzy rules may be included.
Further, the first membership function of the obstacle distance d may be established in advance as three membership subfunctions μN(d), μM(d), and μF(d), wherein d may be any of the obstacle distances dL, dF, dR of the left, front, and right of the mobile robot; μN(d) is the membership subfunction when the fuzzy language of d is near, μM(d) is the membership subfunction when the fuzzy language of d is medium, and μF(d) is the membership subfunction when the fuzzy language of d is far; the breakpoints satisfy T1 < T2 < T3 < T; and μN(d) + μM(d) + μF(d) = 1 for any d.
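The concrete subfunctions appear only as formula images in the original publication, so the piecewise-linear (trapezoidal) shapes in the following sketch are an assumed reconstruction that merely satisfies the stated constraints (T1 < T2 < T3 < T and the three memberships summing to 1):

```python
def memberships(d, t1, t2, t3, t):
    """Return (mu_N, mu_M, mu_F) for an obstacle distance d.
    Piecewise-linear shapes are assumed; only the breakpoint ordering
    t1 < t2 < t3 < t and the sum-to-1 property come from the text."""
    if d <= t1:                       # clearly near
        return 1.0, 0.0, 0.0
    if d <= t2:                       # transition near -> medium
        mu_m = (d - t1) / (t2 - t1)
        return 1.0 - mu_m, mu_m, 0.0
    if d <= t3:                       # clearly medium
        return 0.0, 1.0, 0.0
    mu_f = min(1.0, (d - t3) / (t - t3))  # transition medium -> far
    return 0.0, 1.0 - mu_f, mu_f
```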
At this time, after the M obstacle distances are determined, for any fuzzy rule in the fuzzy rule table, a first sub-membership degree of each obstacle distance in the M obstacle distances with respect to the fuzzy rule may be calculated, so as to obtain a membership degree corresponding to the fuzzy rule according to each first sub-membership degree.
In some application scenarios, the minimum value of each first sub-membership degree may be used as the membership degree corresponding to the fuzzy rule.
Specifically, the following may be mentioned:
qk = min{p1k, p2k, p3k}
wherein qk is the membership degree of the k-th fuzzy rule, p1k is the first sub-membership degree of the obstacle distance dL with respect to the k-th fuzzy rule, p2k is the first sub-membership degree of the obstacle distance dF with respect to the k-th fuzzy rule, p3k is the first sub-membership degree of the obstacle distance dR with respect to the k-th fuzzy rule, and k = 1, 2, 3, …, 27.
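A minimal sketch of this min-based rule evaluation; the 27 antecedents are enumerated in a fixed (N, M, F) order, which is only illustrative, since the pairing of antecedents with output values is given by Table 1:

```python
from itertools import product

LABELS = ("N", "M", "F")  # near, medium, far

def rule_memberships(mu_left, mu_front, mu_right):
    """Compute q_k = min(p1k, p2k, p3k) for each of the 27 rules.
    Each argument is a dict mapping 'N'/'M'/'F' to the membership
    degree of that orientation's obstacle distance."""
    return [
        min(mu_left[a], mu_front[b], mu_right[c])
        for a, b, c in product(LABELS, repeat=3)  # 3^3 = 27 antecedents
    ]
```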
After obtaining the membership degrees corresponding to the fuzzy rules, the angular velocity and the linear velocity of the mobile robot may be determined according to the membership degrees, the first values and the second values corresponding to the fuzzy rules.
Specifically, for example, the linear velocity of the mobile robot determined by the center-of-gravity method is:
v = Σk(qk·vk) / Σk(qk)
wherein qk is the membership degree of the k-th fuzzy rule, vk is the first value of the linear velocity with respect to the k-th fuzzy rule, and the sums are taken over k = 1, 2, …, 27.
The angular velocity of the mobile robot is:
ω = Σk(qk·ωk) / Σk(qk)
wherein ωk is the second value of the angular velocity with respect to the k-th fuzzy rule.
It should be noted that the above example is only an exemplary illustration of how to determine the angular velocity and linear velocity of the mobile robot, and is not meant to be limiting.
As shown in fig. 3, in some embodiments, the step S103 specifically includes:
step S301, respectively performing first filtering processing on part of first infrared sensing data corresponding to each infrared sensor to obtain second infrared sensing data corresponding to each infrared sensor;
step S302, respectively carrying out second filtering processing on part of the first ultrasonic sensing data corresponding to each ultrasonic sensor to obtain second ultrasonic sensing data corresponding to each ultrasonic sensor;
step S303, dividing each second infrared sensing data into M groups of third infrared sensing data, wherein the M groups of third infrared sensing data correspond to the M directions one by one;
step S304, dividing each second ultrasonic sensing data into K groups of third ultrasonic sensing data, wherein the K groups of third ultrasonic sensing data correspond to the K directions one by one;
step S305, obtaining M groups of orientation sensing data corresponding one-to-one to the M directions according to the M groups of third infrared sensing data and the K groups of third ultrasonic sensing data;
step S306, screening each of the M groups of orientation sensing data according to a preset data screening mode;
and step S307, obtaining the M obstacle distances according to each screened group of orientation sensing data.
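For illustration, a minimal sketch of this pipeline under simplifying assumptions: each sensor's samples are reduced to a single filtered value, every orientation has at least one sensor, and min is used as the fusion step. filter_single and screen_group are hypothetical stand-ins for the first/second filtering and the preset screening mode detailed below:

```python
from statistics import median

def preprocess(ir_samples, us_samples, ir_orient, us_orient, m,
               filter_single=median, screen_group=list):
    """ir_samples / us_samples: one list of raw samples per sensor.
    ir_orient / us_orient: orientation index (0..m-1) of each sensor.
    filter_single: per-sensor filter (S301/S302), e.g. the 3-delta
    routine sketched below; screen_group: per-group screening (S306),
    e.g. the IQR routine sketched below.
    Returns the M obstacle distances."""
    # S301/S302: filter each sensor's samples down to one value
    ir_f = [filter_single(s) for s in ir_samples]
    us_f = [filter_single(s) for s in us_samples]

    # S303-S305: group the filtered values by orientation
    groups = [[] for _ in range(m)]
    for value, o in zip(ir_f, ir_orient):
        groups[o].append(value)
    for value, o in zip(us_f, us_orient):
        groups[o].append(value)

    # S306/S307: screen each group, then fuse (min) into one distance
    return [min(screen_group(g)) for g in groups]
```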
In the embodiment of the application, the first filtering processing may be performed on part of the first infrared sensing data corresponding to each infrared sensor, so that filtering is performed based on data acquired by a single infrared sensor, and abnormal data in multiple measurements of the single infrared sensor is deleted. Similarly, outlier data from multiple measurements of a single ultrasonic sensor may be deleted. The specific manner of the first filtering process may be determined according to actual requirements, and may be one or more of median filtering, mean filtering, eliminating abnormal data, and then averaging, for example. The specific manner of the second filtering process may be determined according to actual requirements, for example, the second filtering process may be one or more of median filtering, mean filtering, eliminating abnormal data, and then averaging.
Illustratively, the first filtering process may be performed on a part of the first infrared sensing data corresponding to any one of the infrared sensors in a manner of eliminating abnormal data and then averaging.
Specifically, since a single infrared sensor measures the same obstacle distance multiple times within the preset time period, the corresponding infrared sensing data can be considered to approximately follow a normal distribution. Therefore, when averaging after rejecting abnormal data, the abnormal data may optionally be screened by the 3δ principle, where an abnormal value satisfies:
|Outlier1 - x̄| > 3δ
wherein Outlier1 is a first abnormal value in the first infrared sensing data corresponding to the infrared sensor, x̄ is the average value of the first infrared sensing data corresponding to the infrared sensor, and δ is the standard deviation of the first infrared sensing data corresponding to the infrared sensor.
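A minimal sketch of this first filtering pass for a single sensor, with the added assumption that the filtered reading is the mean of the surviving samples:

```python
import statistics

def filter_3sigma(samples):
    """Drop samples farther than 3 standard deviations from the mean,
    then average the rest to obtain one filtered distance reading."""
    if len(samples) < 2:
        return samples[0] if samples else float("nan")
    mean = statistics.fmean(samples)
    delta = statistics.pstdev(samples)
    kept = [x for x in samples if abs(x - mean) <= 3.0 * delta] or samples
    return statistics.fmean(kept)
```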
The second filtering process and the first filtering process may be performed in the same manner or in different manners.
In the embodiment of the application, after the second infrared sensing data and the second ultrasonic sensing data are obtained, they can be divided by direction to obtain M groups of orientation sensing data corresponding one-to-one to the M directions, and each of the M groups of orientation sensing data can then be screened according to a preset data screening mode for further outlier detection and rejection, improving the accuracy of the data.
There may be multiple specific screening methods; for example, each of the M groups of orientation sensing data may be screened by the interquartile range (IQR).
Specifically, for a group of orientation sensing data, all values in the group may be arranged from small to large and divided into four equal parts by count; the values at the positions of the three dividing points are the quartiles. The first quartile (Q1), also called the lower quartile, equals the value at the 25% position after all values in the group are arranged from small to large. The second quartile (Q2) equals the value at the 50% position. The third quartile (Q3), also called the upper quartile, equals the value at the 75% position.
Determine
IQR = Q3 - Q1
Then a second abnormal value in the group of orientation sensing data satisfies:
Outlier2 < Q1 - k·IQR or Outlier2 > Q3 + k·IQR
wherein Outlier2 is a second abnormal value in the group of orientation sensing data, and k is a preset weight; optionally, k may be set to 1.5.
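A minimal sketch of this IQR screening for one group of orientation sensing data; Python's statistics.quantiles (default "exclusive" method) is used for the quartiles, since the text does not fix a quartile convention:

```python
import statistics

def screen_iqr(values, k=1.5):
    """Keep only values inside [Q1 - k*IQR, Q3 + k*IQR]."""
    if len(values) < 4:
        return list(values)  # too few points for a meaningful IQR
    q1, _q2, q3 = statistics.quantiles(values, n=4)
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [x for x in values if lo <= x <= hi]
```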
After each screened group of orientation sensing data is obtained, the M obstacle distances may be determined. For example, any one of the M obstacle distances may be the minimum value in the corresponding group of orientation sensing data; the obstacle distance may also be obtained by fusing the corresponding orientation sensing data, for example by linear weighting.
As a specific example, consider the obstacle distances dL, dF, and dR for the left, front, and right orientations of the mobile robot, where the left side and the front of the mobile robot have infrared sensors and the right side has both infrared sensors and ultrasonic sensors. Then:
dL = min{dL1, dL2, dL3, …, dLK}
dF = min{dF1, dF2, dF3, …, dFN}
dR = min{dR1, dR2, dR3, …, dRM, dU1, dU2, dU3, …, dUP}
wherein {dL1, dL2, dL3, …, dLK} is the orientation sensing data of the left side of the mobile robot, {dF1, dF2, dF3, …, dFN} is the orientation sensing data of the front of the mobile robot, and {dR1, dR2, dR3, …, dRM, dU1, dU2, dU3, …, dUP} is the orientation sensing data of the right side of the mobile robot.
Alternatively, each of dL, dF, and dR may be obtained as a linear weighted combination of the corresponding orientation sensing data, with the weights summing to 1.
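Minimal sketches of the two fusion options just described; the uniform weights in the second are an assumption, as the text only requires a linear weighting with weights that sum to 1:

```python
def fuse_min(group):
    """Obstacle distance as the minimum of one screened group of
    orientation sensing data (the conservative choice above)."""
    return min(group)

def fuse_weighted(group, weights=None):
    """Alternative linear-weighted fusion; defaults to uniform weights."""
    if weights is None:
        weights = [1.0 / len(group)] * len(group)
    return sum(w * d for w, d in zip(weights, group))
```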
in the embodiment of the application, filtering is performed on the sensing data of each infrared sensor and each ultrasonic sensor, and abnormal data can be deleted by taking the multiple detection results of the infrared sensors as the reference; on the basis, the sensing data of the infrared sensor and the ultrasonic sensor are further integrated, the abnormal data are screened and deleted for the second time, the detection conditions of various types of sensors can be integrated, the detection error caused by the characteristics of a single sensor can be avoided, and at the moment, compared with a single sensor detection mode, in the embodiment of the application, each group of group position sensing data in the group M of group position sensing data can be screened respectively through a preset data screening mode, the error data obtained by the detection of a specific sensor can be effectively identified, so that the advantages of each sensor are fully utilized, and the detection precision of the mobile robot on the obstacle is improved.
In some embodiments, the N infrared sensors are respectively distributed at least two height positions on the mobile robot, and the height position of the ultrasonic sensor is located between the highest height position and the lowest height position corresponding to the N infrared sensors.
In the embodiment of the application, the infrared sensors can be arranged at least two height positions on the mobile robot so that obstacle information at different heights can be detected. Optionally, by processing the infrared sensing data from the at least two height positions, the three-dimensional position information of the corresponding obstacle can be acquired more conveniently. Placing the ultrasonic sensor between the highest and lowest height positions corresponding to the N infrared sensors allows the mobile robot to better correlate the sensing information of the same orientation and makes the sensing information richer in levels, so that obstacle information in the height direction is better acquired. This avoids the situation in the prior art in which sensors mounted on the same horizontal line prevent the mobile robot from recognizing obstacle information at different heights, causing collisions, and it improves the accuracy and precision of subsequent data processing and analysis.
In some embodiments, the infrared sensors at each elevation position are respectively equidistantly distributed within the angular region corresponding to each of the M positions, and the ultrasonic sensors are respectively equidistantly distributed within the angular region corresponding to each of the K positions.
In the embodiment of the application, the infrared sensors and the ultrasonic sensors are uniformly and reasonably distributed correspondingly in the corresponding directions, so that the detection ranges of the infrared sensors and the ultrasonic sensors are more uniform and reasonable, and the coverage is wider.
Illustratively, fig. 4 shows a schematic structural diagram of the mobile robot.
Fig. 4 may be a schematic structural diagram of the mobile robot in a top view.
The mobile robot may correspond to 3 orientations, and the angle areas corresponding to the 3 orientations may be the left area, the right area, and the front area in the figure. A plurality of infrared sensors are uniformly distributed in each of the left, front, and right areas; in addition, a plurality of ultrasonic sensors are provided in the right area.
Fig. 5 is a schematic structural diagram of the mobile robot in a front view state.
At this time, the infrared sensors in the respective areas are distributed at two height positions, and in the right area, at an intermediate position between the highest height position and the lowest height position corresponding to the infrared sensors, an ultrasonic sensor is further provided.
In some embodiments, the ultrasonic sensors may be disposed in only one of the M orientations. In some cases, the function of the mobile robot is concentrated on one side; for example, the brush head of a sweeping robot may use a single-side brush to reduce problems such as tangled wires and scattered sweepings while remaining power-saving and efficient. In that case, the ultrasonic sensors can be disposed only in the orientation corresponding to the single-side brush and omitted from the other orientations, thereby reducing hardware cost.
In the embodiment of the application, the first infrared sensing data in the preset time period before the current moment are acquired through the N infrared sensors on the mobile robot, and the first ultrasonic sensing data in the same period are acquired through the L ultrasonic sensors on the mobile robot, so that the infrared sensors and the ultrasonic sensors can be combined to detect obstacles in the environment; the first infrared sensing data and the first ultrasonic sensing data are preprocessed respectively to obtain the M obstacle distances, which reduces detection errors caused by the material, color, shape, and so on of various obstacles and improves the accuracy of obstacle detection.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present application.
As shown in fig. 6, a block diagram of a mobile robot provided in an embodiment of the present application is given, corresponding to the mobile robot control method of the above embodiments; for convenience of description, only the parts related to the embodiments of the present application are shown.
Referring to fig. 6, the mobile robot includes N infrared sensors 61 (not shown in detail), L ultrasonic sensors 62 (not shown in detail), and a control circuit 63, wherein the control circuit 63 is connected to the N infrared sensors 61 and the L ultrasonic sensors 62, respectively;
the control circuit 63 may include a memory 631, a processor 630, and a computer program 632 stored in the memory and executable on the processor, wherein the processor may implement the following steps when executing the computer program:
acquiring first infrared sensing data in a preset time period before the current time by N infrared sensors on the mobile robot, wherein the N infrared sensors correspond to M directions of the mobile robot, N and M are integers greater than 0, and N is not less than M;
acquiring first ultrasonic sensing data in the preset time period through L ultrasonic sensors on the mobile robot, wherein the L ultrasonic sensors correspond to K directions in the M directions, and K and L are integers greater than 0;
preprocessing the first infrared sensing data and the first ultrasonic sensing data respectively to obtain M obstacle distances, wherein the M obstacle distances correspond to M directions of the mobile robot;
determining the angular velocity and the linear velocity of the mobile robot according to the M obstacle distances;
and controlling the mobile robot to move according to the angular velocity and the linear velocity.
Optionally, when the processor executes the computer program, the determining the angular velocity and the linear velocity of the mobile robot according to the distances between the M obstacles includes:
obtaining membership degrees corresponding to all fuzzy rules respectively according to the M obstacle distances, a first membership function related to the obstacle distances and a preset fuzzy rule table, wherein the fuzzy rule table comprises all fuzzy rules;
acquiring first values of the linear velocity relative to each fuzzy rule respectively;
acquiring second values of the angular velocity relative to each fuzzy rule respectively;
and determining the angular velocity and the linear velocity of the mobile robot according to the membership degree, the first values and the second values corresponding to each fuzzy rule.
Optionally, when the processor executes the computer program, the determining the angular velocity and the linear velocity of the mobile robot according to the membership degree, each of the first values, and each of the second values respectively corresponding to each of the fuzzy rules includes:
determining the linear velocity of the mobile robot according to a first formula, wherein the first formula is as follows:
Figure BDA0002336419640000191
wherein v is a linear velocity of the mobile robot, vkA first value of the linear velocity relative to a k-th fuzzy rule is obtained;
determining the angular velocity of the mobile robot according to a second formula, the second formula being:
Figure BDA0002336419640000192
wherein ω is an angular velocity of the mobile robot, ωkA second value of the angular velocity relative to the kth fuzzy rule;
in the first formula and the second formula, q iskAnd H is the membership degree corresponding to the kth fuzzy rule, and H is the number of the fuzzy rules.
Optionally, when the processor executes the computer program, the controlling the mobile robot to move according to the angular velocity and the linear velocity includes:
obtaining the right wheel speed and the left wheel speed of the mobile robot according to the angular speed and the linear speed;
and controlling the mobile robot to move according to the right wheel speed and the left wheel speed.
Optionally, when the processor executes the computer program, the preprocessing the first infrared sensing data and the first ultrasonic sensing data respectively to obtain M obstacle distances includes:
respectively carrying out first filtering processing on part of first infrared sensing data corresponding to each infrared sensor to obtain second infrared sensing data corresponding to each infrared sensor;
respectively carrying out second filtering processing on part of the first ultrasonic sensing data corresponding to each ultrasonic sensor to obtain second ultrasonic sensing data corresponding to each ultrasonic sensor;
dividing each second infrared sensing data into M groups of third infrared sensing data, wherein the M groups of third infrared sensing data correspond to the M directions one by one;
dividing each second ultrasonic sensing data into K groups of third ultrasonic sensing data, wherein the K groups of third ultrasonic sensing data correspond to the K directions one by one;
obtaining M groups of orientation sensing data corresponding one-to-one to the M directions according to the M groups of third infrared sensing data and the K groups of third ultrasonic sensing data;
screening each of the M groups of orientation sensing data according to a preset data screening mode;
and obtaining the M obstacle distances according to each screened group of orientation sensing data.
Optionally, the N infrared sensors are respectively distributed at least two height positions on the mobile robot, and the height position of the ultrasonic sensor is located between the highest height position and the lowest height position corresponding to the N infrared sensors.
Optionally, the infrared sensors at each height position are respectively distributed at equal distances in the angle area corresponding to each of the M positions, and the ultrasonic sensors are respectively distributed at equal distances in the angle area corresponding to each of the K positions.
In the embodiment of the application, the first infrared sensing data in the preset time period before the current moment are acquired through the N infrared sensors on the mobile robot, and the first ultrasonic sensing data in the same period are acquired through the L ultrasonic sensors on the mobile robot, so that the infrared sensors and the ultrasonic sensors can be combined to detect obstacles in the environment; the first infrared sensing data and the first ultrasonic sensing data are preprocessed respectively to obtain the M obstacle distances, which reduces detection errors caused by the material, color, shape, and so on of various obstacles and improves the accuracy of obstacle detection.
The Processor 630 may be a Central Processing Unit (CPU), and the Processor 630 may also be other general-purpose processors, Digital Signal Processors (DSPs), Application Specific Integrated Circuits (ASICs), Field Programmable Gate Arrays (FPGAs) or other Programmable logic devices, discrete Gate or transistor logic devices, discrete hardware components, and the like. The general purpose processor may be a microprocessor, any conventional processor, etc.
The memory 631 may be an internal storage unit of the control circuit 63 in some embodiments, such as a hard disk or a memory of the control circuit 63. The memory 631 may be an external storage device of the control circuit 63 in other embodiments, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the control circuit 63. Further, the memory 631 may include both an internal storage unit of the control circuit 63 and an external storage device. The memory 631 is used for storing an operating system, an application program, a BootLoader (BootLoader), data, other programs, and the like, such as program codes of the computer programs. The above memory 631 may also be used to temporarily store data that has been output or is to be output.
It should be noted that, because the above-mentioned information interaction, execution process, and other contents are based on the same concept as the embodiment of the method of the present application, specific functions and technical effects thereof may be referred to specifically in the section of the embodiment of the method, and are not described herein again.
Those skilled in the art will appreciate that fig. 6 is merely an example of the mobile robot 6 and does not constitute a limitation of the mobile robot 6; it may include more or fewer components than those shown, combine some of the components, or use different components, such as input and output devices, network access devices, etc.
The embodiment of the present application further provides a computer-readable storage medium, where a computer program is stored, and when the computer program is executed by a processor, the computer program can implement the steps in the above-mentioned mobile robot control method embodiments.
The embodiment of the present application further provides a computer program product, which, when run on a mobile robot, enables the mobile robot to implement the steps in the above mobile robot control method embodiments.
The integrated unit, if implemented in the form of a software functional unit and sold or used as an independent product, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the processes in the methods of the above embodiments can be implemented by a computer program; the computer program can be stored in a computer-readable storage medium and, when executed by a processor, implements the steps of the above method embodiments. The computer program includes computer program code, which may be in source code form, object code form, an executable file, or some intermediate form. The computer-readable medium may include at least: any entity or device capable of carrying the computer program code to the mobile robot/terminal device, a recording medium, a computer memory, a read-only memory (ROM), a random access memory (RAM), an electrical carrier signal, a telecommunication signal, and a software distribution medium, such as a USB flash drive, a removable hard disk, a magnetic disk, or an optical disk. In certain jurisdictions, in accordance with legislation and patent practice, computer-readable media may not be electrical carrier signals or telecommunication signals.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present application.
In the embodiments provided in the present application, it should be understood that the disclosed apparatus/network device and method may be implemented in other ways. For example, the above-described apparatus/network device embodiments are merely illustrative, and for example, the division of the above modules or units is only one logical function division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
The above embodiments are only used for illustrating the technical solutions of the present application, and not for limiting the same; although the present application has been described in detail with reference to the foregoing embodiments, it should be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present application and are intended to be included within the scope of the present application.

Claims (10)

1. A mobile robot control method, comprising:
acquiring first infrared sensing data in a preset time period before the current time by N infrared sensors on the mobile robot, wherein the N infrared sensors correspond to M directions of the mobile robot, N and M are integers greater than 0, and N is not less than M;
acquiring first ultrasonic sensing data in the preset time period through L ultrasonic sensors on the mobile robot, wherein the L ultrasonic sensors correspond to K directions in the M directions, and K and L are integers greater than 0;
preprocessing the first infrared sensing data and the first ultrasonic sensing data respectively to obtain M obstacle distances, wherein the M obstacle distances correspond to M directions of the mobile robot;
determining the angular velocity and the linear velocity of the mobile robot according to the M obstacle distances;
and controlling the mobile robot to move according to the angular velocity and the linear velocity.
2. The mobile robot control method according to claim 1, wherein the determining the angular velocity and the linear velocity of the mobile robot based on the M obstacle distances comprises:
obtaining membership degrees corresponding to all fuzzy rules respectively according to the M obstacle distances, a first membership function related to the obstacle distances and a preset fuzzy rule table, wherein the fuzzy rule table comprises all fuzzy rules;
acquiring first values of the linear velocity relative to each fuzzy rule respectively;
acquiring second values of the angular velocity relative to each fuzzy rule respectively;
and determining the angular velocity and the linear velocity of the mobile robot according to the membership degree, the first values and the second values corresponding to each fuzzy rule.
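One common way to realize the per-rule membership degrees of claim 2 is a fuzzy AND (here, a minimum) over the per-direction memberships named by each rule in the rule table. The sketch below assumes that combination and a simple two-label ("near"/"far") membership function; neither choice is fixed by the claim.

    def mu(label, d):
        # Illustrative two-label membership for an obstacle distance d in metres;
        # the actual first membership function of claim 2 is not published.
        if label == "near":
            return max(0.0, min(1.0, 1.0 - d))
        return max(0.0, min(1.0, d))

    def rule_memberships(distances, rules):
        # rules: one label per direction for every rule in the fuzzy rule table,
        # e.g. [("near", "far", "far"), ("far", "near", "far")]; the firing
        # strength q_k uses min() as an assumed fuzzy AND.
        return [min(mu(label, d) for label, d in zip(rule, distances))
                for rule in rules]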
3. The mobile robot control method according to claim 2, wherein the determining the angular velocity and the linear velocity of the mobile robot according to the membership degrees, the first values, and the second values respectively corresponding to the fuzzy rules comprises:
determining the linear velocity of the mobile robot according to a first formula, wherein the first formula is:

v = \frac{\sum_{k=1}^{H} q_k v_k}{\sum_{k=1}^{H} q_k}

wherein v is the linear velocity of the mobile robot, and v_k is the first value of the linear velocity relative to the k-th fuzzy rule;

determining the angular velocity of the mobile robot according to a second formula, wherein the second formula is:

\omega = \frac{\sum_{k=1}^{H} q_k \omega_k}{\sum_{k=1}^{H} q_k}

wherein \omega is the angular velocity of the mobile robot, and \omega_k is the second value of the angular velocity relative to the k-th fuzzy rule;

in the first formula and the second formula, q_k is the membership degree corresponding to the k-th fuzzy rule, and H is the number of fuzzy rules.
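The two formulas are weighted averages over the fired rules, which the following sketch implements directly; the zero-total fallback is an assumption of this sketch for the degenerate case the claim does not address.

    def defuzzify(memberships, v_values, w_values):
        # Weighted average over fired rules: memberships are the q_k, and
        # v_values/w_values are the per-rule v_k and omega_k of claims 2 and 3.
        total = sum(memberships)
        if total == 0.0:
            return 0.0, 0.0  # assumed fallback when no rule fires (unspecified)
        v = sum(q * vk for q, vk in zip(memberships, v_values)) / total
        w = sum(q * wk for q, wk in zip(memberships, w_values)) / total
        return v, w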
4. The mobile robot control method according to claim 1, wherein the controlling the mobile robot to move in accordance with the angular velocity and the linear velocity includes:
obtaining the right wheel speed and the left wheel speed of the mobile robot according to the angular speed and the linear speed;
and controlling the mobile robot to move according to the right wheel speed and the left wheel speed.
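Claim 4 does not fix a kinematic model. For a differential-drive robot, one common conversion from the pair (v, ω) to wheel speeds is sketched below, with the wheel separation as an assumed parameter.

    def wheel_speeds(v, w, wheel_base=0.35):
        # Differential-drive conversion; wheel_base (track width, in metres) is
        # an assumed parameter, and the kinematic model itself is not recited.
        v_right = v + w * wheel_base / 2.0
        v_left = v - w * wheel_base / 2.0
        return v_right, v_left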
5. The mobile robot control method according to claim 1, wherein the preprocessing the first infrared sensing data and the first ultrasonic sensing data, respectively, to obtain M obstacle distances comprises:
performing first filtering processing on the portion of the first infrared sensing data corresponding to each infrared sensor, respectively, to obtain second infrared sensing data corresponding to each infrared sensor;
performing second filtering processing on the portion of the first ultrasonic sensing data corresponding to each ultrasonic sensor, respectively, to obtain second ultrasonic sensing data corresponding to each ultrasonic sensor;
dividing the second infrared sensing data into M groups of third infrared sensing data, wherein the M groups of third infrared sensing data are in one-to-one correspondence with the M directions;
dividing the second ultrasonic sensing data into K groups of third ultrasonic sensing data, wherein the K groups of third ultrasonic sensing data are in one-to-one correspondence with the K directions;
obtaining M groups of direction sensing data in one-to-one correspondence with the M directions according to the M groups of third infrared sensing data and the K groups of third ultrasonic sensing data;
screening each of the M groups of direction sensing data according to a preset data screening mode;
and obtaining the M obstacle distances according to each group of the screened direction sensing data.
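Claim 5 leaves the two filter types and the "preset data screening mode" open. The sketch below therefore makes illustrative choices only: a median filter for both filtering steps and the per-direction minimum (nearest obstacle, the conservative choice) as the screening rule.

    from statistics import median

    def preprocess_to_distances(ir_by_sensor, us_by_sensor, ir_dir, us_dir, m):
        # ir_by_sensor / us_by_sensor: {sensor_id: [raw readings in the window]}
        # ir_dir / us_dir: {sensor_id: direction index in 0..m-1}
        groups = [[] for _ in range(m)]
        for sid, readings in ir_by_sensor.items():   # first filtering + grouping
            groups[ir_dir[sid]].append(median(readings))
        for sid, readings in us_by_sensor.items():   # second filtering + grouping
            groups[us_dir[sid]].append(median(readings))
        # screening: keep the nearest reading per direction
        return [min(g) if g else float("inf") for g in groups]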
6. The mobile robot control method according to any one of claims 1 to 5, wherein the N infrared sensors are distributed at at least two height positions on the mobile robot, and the height positions of the L ultrasonic sensors lie between the highest height position and the lowest height position corresponding to the N infrared sensors.
7. The mobile robot control method according to claim 6, wherein, at each height position, the infrared sensors are distributed equidistantly within the angle area corresponding to each of the M directions, and the ultrasonic sensors are distributed equidistantly within the angle area corresponding to each of the K directions.
8. A mobile robot, comprising N infrared sensors, L ultrasonic sensors, and a control circuit, wherein the control circuit is connected to the N infrared sensors and the L ultrasonic sensors respectively;
the control circuit includes a memory, a processor, and a computer program stored in the memory and executable on the processor, the processor implementing the mobile robot control method according to any one of claims 1 to 5 when executing the computer program.
9. The mobile robot according to claim 8, wherein the N infrared sensors are distributed at at least two height positions on the mobile robot, and the height position of each ultrasonic sensor lies between the highest height position and the lowest height position corresponding to the infrared sensors.
10. A computer-readable storage medium storing a computer program, wherein the computer program, when executed by a processor, implements the mobile robot control method according to any one of claims 1 to 5.
CN201911357849.3A 2019-12-25 2019-12-25 Mobile robot control method and mobile robot Active CN111203873B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911357849.3A CN111203873B (en) 2019-12-25 2019-12-25 Mobile robot control method and mobile robot

Publications (2)

Publication Number Publication Date
CN111203873A true CN111203873A (en) 2020-05-29
CN111203873B CN111203873B (en) 2022-07-29

Family

ID=70783307

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911357849.3A Active CN111203873B (en) 2019-12-25 2019-12-25 Mobile robot control method and mobile robot

Country Status (1)

Country Link
CN (1) CN111203873B (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1861330A (en) * 2005-05-11 2006-11-15 Lg电子株式会社 Mobile robot having obstacle avoidance function and method therefor
CN101354587A (en) * 2008-09-04 2009-01-28 湖南大学 Mobile robot multi-behavior syncretizing automatic navigation method under unknown environment

Non-Patent Citations (5)

* Cited by examiner, † Cited by third party
Title
LI Jingye, GAO Xueshan et al.: "Autonomous Movement and Obstacle Avoidance of Inspection Robots in Human-Robot Coexistence Environments", Ordnance Industry Automation *
HE Chao, LIU Huaping et al.: "Target Tracking and Obstacle Avoidance of a Mobile Robot Using Kinect", CAAI Transactions on Intelligent Systems *
ZHAO Haiwen: "Research on Multi-Sensor-Based Behavior Control of Mobile Robots", China Doctoral Dissertations Full-text Database, Information Science and Technology *
ZHAO Xiangmin: "Development of an Omnidirectional Mobile Robot", China Masters' Theses Full-text Database, Information Science and Technology *
GUO Na, LI Caihong et al.: "Mobile Robot Path Planning Combining Prediction and Fuzzy Control", Computer Engineering and Applications *

Also Published As

Publication number Publication date
CN111203873B (en) 2022-07-29

Similar Documents

Publication Publication Date Title
JP6571274B2 (en) System and method for laser depth map sampling
CN109059902B (en) Relative pose determination method, device, equipment and medium
CN106767852B (en) A kind of method, apparatus and equipment generating detection target information
CN111273268B (en) Automatic driving obstacle type identification method and device and electronic equipment
KR20200128145A (en) Methods and devices, vehicles, and electronic devices for traffic light detection and intelligent driving
CN110442120B (en) Method for controlling robot to move in different scenes, robot and terminal equipment
CN111324945B (en) Sensor scheme determining method, device, equipment and storage medium
KR102316960B1 (en) Method and apparatus for realtime object detection in unmanned aerial vehicle image
Naujoks et al. An orientation corrected bounding box fit based on the convex hull under real time constraints
CN113256716B (en) Control method of robot and robot
CN111024082B (en) Method and device for planning local path of robot and robot
CN113137968B (en) Repositioning method and repositioning device based on multi-sensor fusion and electronic equipment
CN112581613A (en) Grid map generation method and system, electronic device and storage medium
CN114966651A (en) Drivable region detection method, computer device, storage medium, and vehicle
CN111203873B (en) Mobile robot control method and mobile robot
Huang et al. B-splines for purely vision-based localization and mapping on non-holonomic ground vehicles
Moeys et al. Pred18: Dataset and further experiments with davis event camera in predator-prey robot chasing
Calcroft et al. LiDAR-based obstacle detection and avoidance for autonomous vehicles using raspberry Pi 3B
Mueller et al. Continuous stereo self-calibration on planar roads
US20220221585A1 (en) Systems and methods for monitoring lidar sensor health
CN111753768A (en) Method, apparatus, electronic device and storage medium for representing shape of obstacle
Kogler et al. Embedded stereo vision system for intelligent autonomous vehicles
MOLDER et al. Navigation algorithms with LIDAR for mobile robots
Gireesha et al. Lane Change Assistance Using LiDAR for Autonomous Vehicles
CN115371719B (en) Parameter calibration method and device for detection equipment, storage medium and electronic device

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant