CN109739223B - Robot obstacle avoidance control method and device, terminal device and storage medium - Google Patents

Robot obstacle avoidance control method and device, terminal device and storage medium

Info

Publication number
CN109739223B
CN109739223B (application CN201811540871.7A)
Authority
CN
China
Prior art keywords
obstacle
robot
data
obstacle avoidance
infrared sensor
Prior art date
Legal status
Active
Application number
CN201811540871.7A
Other languages
Chinese (zh)
Other versions
CN109739223A (en
Inventor
程俊
高向阳
郭海光
Current Assignee
Shenzhen Institute of Advanced Technology of CAS
Original Assignee
Shenzhen Institute of Advanced Technology of CAS
Priority date
Filing date
Publication date
Application filed by Shenzhen Institute of Advanced Technology of CAS filed Critical Shenzhen Institute of Advanced Technology of CAS
Priority to CN201811540871.7A priority Critical patent/CN109739223B/en
Publication of CN109739223A publication Critical patent/CN109739223A/en
Priority to PCT/CN2019/124353 priority patent/WO2020125500A1/en
Application granted granted Critical
Publication of CN109739223B publication Critical patent/CN109739223B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D 1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D 1/02 Control of position or course in two dimensions

Abstract

The invention is applicable to the technical field of robot control and provides a robot obstacle avoidance control method, device, terminal device and computer-readable storage medium. The method includes: obtaining obstacle detection data, the obstacle detection data including first infrared sensor data, second infrared sensor data, camera data, current sensor data and ultrasonic sensor data; obtaining a robot obstacle avoidance decision according to the obstacle detection data; and controlling the robot to avoid the obstacle according to the robot obstacle avoidance decision. By this method, high accuracy of the robot obstacle avoidance control function can be achieved at low production cost.

Description

Robot obstacle avoidance control method and device, terminal device and storage medium
Technical Field
The invention belongs to the technical field of robot control, and particularly relates to a robot obstacle avoidance control method, a robot obstacle avoidance control device, terminal equipment and a computer readable storage medium.
Background
With the continuous development of science and technology, robots offer increasingly rich functionality. Existing robots can provide voice interaction, visual detection, obstacle avoidance control and other functions. The obstacle avoidance control function of an existing robot can be implemented with a vibration sensor or an ultrasonic sensor, which keeps production cost low but gives low accuracy; it can also be implemented with vision technology, which gives higher accuracy but at a higher production cost. These methods cannot offer low production cost and high accuracy at the same time, which limits the range of applications of such robots.
Disclosure of Invention
In view of this, embodiments of the present invention provide a robot obstacle avoidance control method, apparatus, terminal device, and computer readable storage medium, so as to solve the problem that in the prior art, high accuracy of a robot obstacle avoidance control function cannot be achieved on the basis of low production cost.
A first aspect of an embodiment of the present invention provides a robot obstacle avoidance control method, where the robot at least includes a first infrared sensor, a second infrared sensor, a camera, a current sensor and an ultrasonic sensor, the first infrared sensor is installed at the front left of the robot, and the second infrared sensor is installed at the front right of the robot; the method includes:
obtaining obstacle detection data, the obstacle detection data comprising: the data of the first infrared sensor, the data of the second infrared sensor, the data of the camera, the data of the current sensor and the data of the ultrasonic sensor;
acquiring a robot obstacle avoidance decision according to the obstacle detection data;
and controlling the robot to avoid the obstacle according to the robot obstacle avoidance decision.
A second aspect of the embodiments of the present invention provides a robot obstacle avoidance control device, where the robot at least includes a first infrared sensor, a second infrared sensor, a camera, a current sensor and an ultrasonic sensor, the first infrared sensor is installed at the front left of the robot, and the second infrared sensor is installed at the front right of the robot; the device includes:
an obstacle detection data acquisition unit configured to acquire obstacle detection data including: the data of the first infrared sensor, the data of the second infrared sensor, the data of the camera, the data of the current sensor and the data of the ultrasonic sensor;
the robot obstacle avoidance decision acquisition unit is used for acquiring a robot obstacle avoidance decision according to the obstacle detection data;
and the robot obstacle avoidance control unit is used for controlling the robot to avoid the obstacle according to the robot obstacle avoidance decision.
A third aspect of an embodiment of the present invention provides a terminal device, including: the robot obstacle avoidance control system comprises a memory, a processor and a computer program stored in the memory and capable of running on the processor, wherein the processor realizes the steps of the robot obstacle avoidance control method when executing the computer program.
A fourth aspect of the embodiments of the present invention provides a computer-readable storage medium, where a computer program is stored, where the computer program, when executed by a processor, implements the steps of the robot obstacle avoidance control method.
Compared with the prior art, the embodiments of the invention have the following beneficial effects: obstacle detection data is obtained, the obstacle detection data including infrared sensor data, camera data, current sensor data and ultrasonic sensor data; a robot obstacle avoidance decision is obtained according to the obstacle detection data; and the robot is controlled to avoid the obstacle according to the robot obstacle avoidance decision. Because devices such as the infrared sensors, the camera, the current sensor and the ultrasonic sensor are inexpensive, implementing the robot obstacle avoidance control function from infrared sensor data, camera data, current sensor data and ultrasonic sensor data keeps production cost low while improving the accuracy of the obstacle avoidance control function. That is, high accuracy of the robot obstacle avoidance control function can be achieved at low production cost.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed for describing the embodiments or the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the present invention, and other drawings can be obtained from them by those skilled in the art without creative effort.
Fig. 1 is a schematic flow chart of a robot obstacle avoidance control method according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a pre-trained obstacle detection data fusion model according to an embodiment of the present invention;
fig. 3 is a schematic structural diagram of a robot obstacle avoidance control device according to an embodiment of the present invention;
fig. 4 is a schematic diagram of a terminal device according to an embodiment of the present invention.
Detailed Description
In the following description, for purposes of explanation and not limitation, specific details are set forth, such as particular system structures, techniques, etc. in order to provide a thorough understanding of the embodiments of the present application. It will be apparent, however, to one skilled in the art that the present application may be practiced in other embodiments that depart from these specific details. In other instances, detailed descriptions of well-known systems, devices, circuits, and methods are omitted so as not to obscure the description of the present application with unnecessary detail.
In order to explain the technical solution described in the present application, the following description will be given by way of specific examples.
It will be understood that the terms "comprises" and/or "comprising," when used in this specification and the appended claims, specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, components, and/or groups thereof.
It is also to be understood that the terminology used in the description of the present application herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application. As used in the specification of the present application and the appended claims, the singular forms "a," "an," and "the" are intended to include the plural forms as well, unless the context clearly indicates otherwise.
It should be further understood that the term "and/or" as used in this specification and the appended claims refers to and includes any and all possible combinations of one or more of the associated listed items.
As used in this specification and the appended claims, the term "if" may be interpreted contextually as "when", "upon" or "in response to a determination" or "in response to a detection". Similarly, the phrase "if it is determined" or "if a [ described condition or event ] is detected" may be interpreted contextually to mean "upon determining" or "in response to determining" or "upon detecting [ described condition or event ]" or "in response to detecting [ described condition or event ]".
In particular implementations, the terminals described in embodiments of the present application include, but are not limited to, other portable devices such as mobile phones, laptop computers, or tablet computers having touch sensitive surfaces (e.g., touch screen displays and/or touch pads). It should also be understood that in some embodiments, the devices described above are not portable communication devices, but rather are desktop computers having touch-sensitive surfaces (e.g., touch screen displays and/or touch pads).
In the discussion that follows, a terminal that includes a display and a touch-sensitive surface is described. However, it should be understood that the terminal may include one or more other physical user interface devices such as a physical keyboard, mouse, and/or joystick.
The terminal supports various applications, such as one or more of the following: a drawing application, a presentation application, a word processing application, a website creation application, a disc burning application, a spreadsheet application, a gaming application, a telephone application, a video conferencing application, an email application, an instant messaging application, an exercise support application, a photo management application, a digital camera application, a web browsing application, a digital music player application, and/or a digital video player application.
Various applications that may be executed on the terminal may use at least one common physical user interface device, such as a touch-sensitive surface. One or more functions of the touch-sensitive surface and corresponding information displayed on the terminal can be adjusted and/or changed between applications and/or within respective applications. In this way, a common physical architecture (e.g., touch-sensitive surface) of the terminal can support various applications with user interfaces that are intuitive and transparent to the user.
Embodiment one:
Fig. 1 shows a schematic flow chart of a robot obstacle avoidance control method provided in an embodiment of the present application, detailed as follows:
the robot includes at least: the robot comprises a first infrared sensor, a second infrared sensor, a camera, a current sensor and an ultrasonic sensor, wherein the first infrared sensor is installed in the front left of the robot, and the second infrared sensor is installed in the front right of the robot. Of course, if other infrared sensors exist, the infrared sensors can be installed at corresponding positions of the robot according to actual needs.
The first infrared sensor, the second infrared sensor, the camera, the current sensor and the ultrasonic sensor are all low-cost devices.
Step S11, obtaining obstacle detection data, the obstacle detection data including: first infrared sensor data, second infrared sensor data, camera data, current sensor data, ultrasonic sensor data.
Optionally, step S11 includes: if the first infrared sensor does not detect a target signal, setting the first infrared sensor data as preset first infrared sensor data, where the target signal is an infrared signal radiated by a substance other than the robot; and if the second infrared sensor does not detect the target signal, setting the second infrared sensor data as preset second infrared sensor data.
Specifically, if the target is on the right side of the robot, the first infrared sensor is likely not to detect a target signal corresponding to the target; similarly, if the target is on the left side of the robot, the second infrared sensor is likely not to detect a target signal corresponding to the target. Therefore, when the first infrared sensor and/or the second infrared sensor do not detect the target signal, the presence of an obstacle can still be analyzed according to the preset first infrared sensor data and/or the preset second infrared sensor data, which improves the accuracy of the robot obstacle avoidance control.
Optionally, the robot further comprises a motor. Step S11 includes: if the current sensor does not detect a target current signal, setting the current sensor data as preset current sensor data, where the target current signal is a motor current signal greater than a preset motor current threshold.
Therefore, when the robot is not blocked by an external force, the motor current is equal to or less than the preset motor current threshold and the current sensor does not detect a target current signal, so the presence of an obstacle can be analyzed according to the preset current sensor data, which improves the accuracy of the robot obstacle avoidance control.
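As an illustration of step S11, the following Python sketch shows one possible way to collect the five sensor readings and substitute preset values when no target signal is detected. The robot interface, the function names and the preset values are assumptions made for illustration and are not taken from the patent.

PRESET_FIRST_IR = 0.0          # preset first infrared sensor data (assumed value)
PRESET_SECOND_IR = 0.0         # preset second infrared sensor data (assumed value)
PRESET_CURRENT = 0.0           # preset current sensor data (assumed value)
MOTOR_CURRENT_THRESHOLD = 1.5  # preset motor current threshold in amperes (assumed value)

def get_obstacle_detection_data(robot):
    """Step S11: collect the obstacle detection data from the five sensors."""
    first_ir = robot.read_first_infrared()     # None if no infrared target signal
    second_ir = robot.read_second_infrared()   # None if no infrared target signal
    camera = robot.read_camera_frame()
    motor_current = robot.read_motor_current()
    ultrasonic = robot.read_ultrasonic_range()
    # No target signal from an infrared sensor: use the preset sensor data instead.
    if first_ir is None:
        first_ir = PRESET_FIRST_IR
    if second_ir is None:
        second_ir = PRESET_SECOND_IR
    # No target current signal (motor current not above the threshold): use preset data.
    if motor_current is None or motor_current <= MOTOR_CURRENT_THRESHOLD:
        motor_current = PRESET_CURRENT
    return {
        "first_ir": first_ir,
        "second_ir": second_ir,
        "camera": camera,
        "current": motor_current,
        "ultrasonic": ultrasonic,
    }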
Step S12, acquiring a robot obstacle avoidance decision according to the obstacle detection data.
Optionally, the step S12 includes:
and A1, acquiring obstacle information fusion data according to the obstacle detection data and a pre-trained obstacle information data fusion model.
And A2, acquiring a robot obstacle avoidance decision according to the obstacle information fusion data.
Optionally, the a1 includes extracting obstacle features of the obstacle detection data; acquiring obstacle information data according to the obstacle features, wherein the data format of the obstacle information data is a vector; and acquiring obstacle information fusion data according to the obstacle information data and a pre-trained obstacle detection data fusion model.
Specifically, obstacle features of the obstacle detection data are extracted, the obstacle features including at least one of: presence or absence of an obstacle, obstacle direction, obstacle distance, and obstacle shape. Obstacle information data is then acquired according to a preset obstacle information data determination rule and the obstacle features, and obstacle information fusion data is acquired according to the obstacle information data and a pre-trained obstacle detection data fusion model.
For example, obstacle features of the obstacle detection data are extracted, the obstacle features including: presence or absence of an obstacle, obstacle direction, obstacle distance, and obstacle shape. Assume that the preset obstacle information data determination rule is as follows. The obstacle information data is represented as X = (a, b, c, d). a indicates whether an obstacle exists: a is 1 if an obstacle exists and 0 otherwise. b represents the obstacle direction: b is 0 if there is no obstacle, 1 if the obstacle is at the front left, 2 if the obstacle is directly in front, and 3 if the obstacle is at the front right. c represents the obstacle distance: c is infinite if there is no obstacle, and otherwise c is the actual measured distance. d represents the obstacle shape: d is 0 if there is no obstacle, 1 if the obstacle has a regular shape, and 2 if the obstacle has an irregular shape. According to this preset obstacle information data determination rule, the obstacle information data corresponding to the first infrared sensor data, the second infrared sensor data, the camera data, the current sensor data and the ultrasonic sensor data are acquired. Taking the ultrasonic sensor data as an example, the corresponding obstacle features are extracted from the ultrasonic sensor data; assuming these features indicate an irregularly shaped obstacle 0.5 m away from the robot and directly in front of it, the obstacle information data corresponding to the ultrasonic sensor data is (1, 2, 0.5, 2) according to the preset obstacle information data determination rule.
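The encoding rule above can be written as a minimal Python sketch; the string labels used for the directions and shapes are assumptions introduced here for readability, not terms from the patent.

import math

# Map extracted obstacle features to the obstacle information data vector X = (a, b, c, d).
DIRECTION_CODES = {None: 0, "front-left": 1, "front": 2, "front-right": 3}
SHAPE_CODES = {None: 0, "regular": 1, "irregular": 2}

def encode_obstacle_info(present, direction=None, distance=None, shape=None):
    """Return the obstacle information data vector (a, b, c, d)."""
    a = 1 if present else 0
    b = DIRECTION_CODES[direction if present else None]
    c = distance if present else math.inf   # c is infinite when there is no obstacle
    d = SHAPE_CODES[shape if present else None]
    return (a, b, c, d)

# Ultrasonic example from the text: an irregular obstacle 0.5 m away, directly ahead.
print(encode_obstacle_info(True, "front", 0.5, "irregular"))   # -> (1, 2, 0.5, 2)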
Fig. 2 shows an example of a pre-trained obstacle detection data fusion model, which is a three-layer artificial neural network built with a multi-layer perceptron model trained by the back-propagation algorithm. As shown in Fig. 2, the obstacle detection data fusion model consists of an input layer, an intermediate layer and an output layer. The input layer consists of X1, X2, X3, X4, X5 with input layer weights Ai, i = {1, 2, 3, 4, 5}; the intermediate layer weights are denoted by Bj, j = {1, 2, 3, 4, 5}; and Y denotes the output result, i.e. Y is the obstacle information fusion data. Assume that X1, X2, X3, X4 and X5 respectively denote the obstacle information data corresponding to the first infrared sensor data, the second infrared sensor data, the camera data, the current sensor data and the ultrasonic sensor data. The obstacle information fusion data is then obtained from these inputs according to the functional relation of the network (given in the original only as a formula image, reference BDA0001908045330000071).
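Because the exact functional relation appears only as a figure, the following Python sketch assumes a standard fully connected multi-layer perceptron with a sigmoid activation; the layer sizes, activation function and weight shapes are assumptions and may differ from the formula in the patent figure.

import numpy as np

# Assumed three-layer fusion network: the five 4-dimensional obstacle information
# vectors X1..X5 are concatenated, passed through an intermediate layer with
# weights A and an output layer with weights B to produce the fusion data Y.
def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fuse_obstacle_info(x_vectors, A, B):
    """Compute the obstacle information fusion data Y from X1..X5."""
    x = np.concatenate([np.asarray(v, dtype=float) for v in x_vectors])  # 20-dim input
    hidden = sigmoid(A @ x)       # intermediate layer
    return sigmoid(B @ hidden)    # output layer: obstacle information fusion data Y

# Shape demonstration with random, untrained weights.
rng = np.random.default_rng(0)
A = rng.normal(size=(5, 20))   # input-layer weights (5 hidden units assumed)
B = rng.normal(size=(4, 5))    # output-layer weights (4-dimensional Y assumed)
x_vectors = [(1, 2, 0.5, 2)] * 5
print(fuse_obstacle_info(x_vectors, A, B))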
The obstacle information fusion data retains the relationships among the first infrared sensor data, the second infrared sensor data, the camera data, the current sensor data and the ultrasonic sensor data, which improves the accuracy of the obstacle information fusion data and hence the accuracy of the robot obstacle avoidance control function.
Optionally, before A2, the method includes: judging whether an obstacle exists according to the obstacle information fusion data. Correspondingly, acquiring a robot obstacle avoidance decision according to the obstacle information fusion data specifically comprises: if an obstacle exists, acquiring a robot obstacle avoidance decision according to the obstacle detection data.
Specifically, the obstacle information fusion data is analyzed to obtain an obstacle information fusion data analysis result, and whether an obstacle exists is judged according to the analysis result. Correspondingly, acquiring a robot obstacle avoidance decision according to the obstacle information fusion data specifically comprises: if an obstacle exists, acquiring a robot obstacle avoidance decision according to the obstacle detection data.
Because the robot obstacle avoidance decision is obtained from the obstacle detection data, the soundness of the decision is improved, which helps the robot avoid the obstacle accurately.
Optionally, if there is an obstacle, obtaining an obstacle avoidance decision of the robot according to the obstacle detection data includes: if an obstacle exists, acquiring the distance between the obstacle and the robot according to the obstacle information fusion data; and acquiring a robot obstacle avoidance decision according to the distance.
Optionally, the obtaining a robot obstacle avoidance decision according to the distance includes: judging whether the distance is smaller than a preset distance threshold value or not; if the distance is smaller than a preset distance threshold, generating a first robot obstacle avoidance decision, wherein the first robot obstacle avoidance decision comprises decision information of robot backward movement; and if the distance is not smaller than a preset distance threshold, generating a second robot obstacle avoidance decision according to the position of the obstacle, wherein the second robot obstacle avoidance decision comprises decision information indicating that the robot takes the direction far away from the obstacle as the movement direction.
For example, if the distance is 0.2 m and the preset distance threshold is 0.5 m, the distance is determined to be smaller than the preset distance threshold, so a first robot obstacle avoidance decision is generated, where the first robot obstacle avoidance decision includes decision information instructing the robot to move backward.
Step S13, controlling the robot to avoid the obstacle according to the robot obstacle avoidance decision.
For example, if the robot obstacle avoidance decision is the first robot obstacle avoidance decision, the whole robot is controlled to rotate 180 degrees according to that decision and then to move forward along the new heading.
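The decision and control steps can be illustrated with the short Python sketch below; the threshold value, the decision representation and the robot control methods (rotate, turn, move_forward) are assumptions for illustration only.

DISTANCE_THRESHOLD = 0.5  # preset distance threshold in metres (example value from the text)

def make_avoidance_decision(obstacle_present, distance=None, direction=None):
    """Step S12: derive a robot obstacle avoidance decision from the fused obstacle data."""
    if not obstacle_present:
        return {"action": "keep_course"}
    if distance < DISTANCE_THRESHOLD:
        # First robot obstacle avoidance decision: move backward.
        return {"action": "move_backward"}
    # Second robot obstacle avoidance decision: move in a direction away from the obstacle.
    away = "right" if direction == "front-left" else "left"
    return {"action": "steer_away", "direction": away}

def execute_decision(robot, decision):
    """Step S13: control the robot according to the obstacle avoidance decision."""
    if decision["action"] == "move_backward":
        robot.rotate(180)      # turn the whole robot around
        robot.move_forward()   # advance along the new heading
    elif decision["action"] == "steer_away":
        robot.turn(decision["direction"])
        robot.move_forward()
    else:
        robot.move_forward()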
In the embodiment of the present invention, obstacle detection data is obtained, the obstacle detection data including infrared sensor data, camera data, current sensor data and ultrasonic sensor data; a robot obstacle avoidance decision is obtained according to the obstacle detection data; and the robot is controlled to avoid the obstacle according to the robot obstacle avoidance decision. Because devices such as the infrared sensors, the camera, the current sensor and the ultrasonic sensor are inexpensive, implementing the robot obstacle avoidance control function from infrared sensor data, camera data, current sensor data and ultrasonic sensor data keeps production cost low while improving the accuracy of the obstacle avoidance control function. That is, high accuracy of the robot obstacle avoidance control function can be achieved at low production cost.
Embodiment two:
Fig. 3 is a schematic structural diagram of a robot obstacle avoidance control device according to an embodiment of the present application; for convenience of description, only the parts relevant to this embodiment are shown.
The robot includes at least: the robot comprises a first infrared sensor, a second infrared sensor, a camera, a current sensor and an ultrasonic sensor, wherein the first infrared sensor is installed in the front left of the robot, and the second infrared sensor is installed in the front right of the robot. Of course, if other infrared sensors exist, the infrared sensors can be installed at corresponding positions of the robot according to actual needs.
The robot obstacle avoidance control device includes: an obstacle detection data acquisition unit 31, a robot obstacle avoidance decision acquisition unit 32 and a robot obstacle avoidance control unit 33.
An obstacle detection data acquisition unit 31 configured to acquire obstacle detection data including: first infrared sensor data, second infrared sensor data, camera data, current sensor data, ultrasonic sensor data.
Optionally, the obstacle detection data acquisition unit 31 is configured to: if the first infrared sensor does not detect a target signal, set the first infrared sensor data as preset first infrared sensor data, where the target signal is an infrared signal radiated by a substance other than the robot; and if the second infrared sensor does not detect the target signal, set the second infrared sensor data as preset second infrared sensor data.
Optionally, the robot further comprises a motor. The obstacle detection data acquisition unit 31 is configured to: if the current sensor does not detect a target current signal, set the current sensor data as preset current sensor data, where the target current signal is a motor current signal greater than a preset motor current threshold.
And the robot obstacle avoidance decision obtaining unit 32 is configured to obtain a robot obstacle avoidance decision according to the obstacle detection data.
Optionally, the robot obstacle avoidance decision obtaining unit 32 includes: the system comprises an obstacle information fusion data acquisition module and a decision acquisition module.
And the obstacle information fusion data acquisition module is used for acquiring obstacle information fusion data according to the obstacle detection data and a pre-trained obstacle information data fusion model.
And the decision acquisition module is used for acquiring a robot obstacle avoidance decision according to the obstacle information fusion data.
Optionally, the obstacle information fusion data obtaining module is specifically configured to: extracting obstacle features of the obstacle detection data; acquiring obstacle information data according to the obstacle features, wherein the data format of the obstacle information data is a vector; and acquiring obstacle information fusion data according to the obstacle information data and a pre-trained obstacle detection data fusion model.
Optionally, the robot obstacle avoidance control device further includes an obstacle presence determination unit for determining whether an obstacle exists.
The obstacle presence determination unit is configured to: judge whether an obstacle exists according to the obstacle information fusion data before the decision acquisition module acquires the robot obstacle avoidance decision according to the obstacle information fusion data. Correspondingly, the decision acquisition module is specifically configured to: acquire a robot obstacle avoidance decision according to the obstacle detection data if an obstacle exists.
The obstacle presence determination unit is specifically configured to: before the decision acquisition module acquires the robot obstacle avoidance decision according to the obstacle information fusion data, analyze the obstacle information fusion data to obtain an obstacle information fusion data analysis result, and judge whether an obstacle exists according to the analysis result. Correspondingly, the decision acquisition module is specifically configured to: acquire a robot obstacle avoidance decision according to the obstacle detection data if an obstacle exists.
Because the robot obstacle avoidance decision is obtained from the obstacle detection data, the soundness of the decision is improved, which helps the robot avoid the obstacle accurately.
Optionally, the decision obtaining module includes: a distance obtaining submodule and a decision obtaining submodule.
And the distance acquisition submodule is used for acquiring the distance between the obstacle and the robot according to the obstacle information fusion data if the obstacle exists.
And the decision acquisition submodule is used for acquiring the obstacle avoidance decision of the robot according to the distance.
Optionally, the decision obtaining sub-module is specifically configured to: judging whether the distance is smaller than a preset distance threshold value or not; if the distance is smaller than a preset distance threshold, generating a first robot obstacle avoidance decision, wherein the first robot obstacle avoidance decision comprises decision information of robot backward movement; and if the distance is not smaller than a preset distance threshold, generating a second robot obstacle avoidance decision according to the position of the obstacle, wherein the second robot obstacle avoidance decision comprises decision information indicating that the robot takes the direction far away from the obstacle as the movement direction.
And the robot obstacle avoidance control unit 33 is used for controlling the robot to avoid the obstacle according to the robot obstacle avoidance decision.
In the embodiment of the present invention, obstacle detection data is obtained, the obstacle detection data including infrared sensor data, camera data, current sensor data and ultrasonic sensor data; a robot obstacle avoidance decision is obtained according to the obstacle detection data; and the robot is controlled to avoid the obstacle according to the robot obstacle avoidance decision. Because devices such as the infrared sensors, the camera, the current sensor and the ultrasonic sensor are inexpensive, implementing the robot obstacle avoidance control function from infrared sensor data, camera data, current sensor data and ultrasonic sensor data keeps production cost low while improving the accuracy of the obstacle avoidance control function. That is, high accuracy of the robot obstacle avoidance control function can be achieved at low production cost.
It should be understood that, the sequence numbers of the steps in the foregoing embodiments do not imply an execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Embodiment three:
Fig. 4 is a schematic diagram of a terminal device according to an embodiment of the present invention. As shown in Fig. 4, the terminal device 4 of this embodiment includes: a processor 40, a memory 41 and a computer program 42 stored in said memory 41 and executable on said processor 40. The processor 40, when executing the computer program 42, implements the steps in each of the above-mentioned robot obstacle avoidance control method embodiments, such as the steps S11 to S13 shown in Fig. 1. Alternatively, the processor 40, when executing the computer program 42, implements the functions of the units in the device embodiments described above, such as the functions of the units 31 to 33 shown in Fig. 3.
Illustratively, the computer program 42 may be partitioned into one or more modules/units that are stored in the memory 41 and executed by the processor 40 to implement the present invention. The one or more modules/units may be a series of computer program instruction segments capable of performing specific functions, which are used to describe the execution process of the computer program 42 in the terminal device 4. For example, the computer program 42 may be divided into an obstacle detection data acquisition unit, a robot obstacle avoidance decision acquisition unit, and a robot obstacle avoidance control unit, where the specific functions of each unit are as follows:
an obstacle detection data acquisition unit configured to acquire obstacle detection data including: the data of the first infrared sensor, the data of the second infrared sensor, the data of the camera, the data of the current sensor and the data of the ultrasonic sensor;
the robot obstacle avoidance decision acquisition unit is used for acquiring a robot obstacle avoidance decision according to the obstacle detection data;
and the robot obstacle avoidance control unit is used for controlling the robot to avoid the obstacle according to the robot obstacle avoidance decision.
The terminal device 4 may be a desktop computer, a notebook, a palm computer, a cloud server, or other computing devices. The terminal device may include, but is not limited to, a processor 40, a memory 41. Those skilled in the art will appreciate that fig. 4 is merely an example of a terminal device 4 and does not constitute a limitation of terminal device 4 and may include more or fewer components than shown, or some components may be combined, or different components, e.g., the terminal device may also include input-output devices, network access devices, buses, etc.
The processor 40 may be a Central Processing Unit (CPU), another general purpose processor, a Digital Signal Processor (DSP), an Application Specific Integrated Circuit (ASIC), a Field-Programmable Gate Array (FPGA) or other programmable logic device, discrete gate or transistor logic, discrete hardware components, etc. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like.
The memory 41 may be an internal storage unit of the terminal device 4, such as a hard disk or a memory of the terminal device 4. The memory 41 may also be an external storage device of the terminal device 4, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like, which are provided on the terminal device 4. Further, the memory 41 may also include both an internal storage unit and an external storage device of the terminal device 4. The memory 41 is used for storing the computer program and other programs and data required by the terminal device. The memory 41 may also be used to temporarily store data that has been output or is to be output.
It will be apparent to those skilled in the art that, for convenience and brevity of description, only the above-mentioned division of the functional units and modules is illustrated, and in practical applications, the above-mentioned function distribution may be performed by different functional units and modules according to needs, that is, the internal structure of the apparatus is divided into different functional units or modules to perform all or part of the above-mentioned functions. Each functional unit and module in the embodiments may be integrated in one processing unit, or each unit may exist alone physically, or two or more units are integrated in one unit, and the integrated unit may be implemented in a form of hardware, or in a form of software functional unit. In addition, specific names of the functional units and modules are only for convenience of distinguishing from each other, and are not used for limiting the protection scope of the present application. The specific working processes of the units and modules in the system may refer to the corresponding processes in the foregoing method embodiments, and are not described herein again.
In the above embodiments, the descriptions of the respective embodiments have respective emphasis, and reference may be made to the related descriptions of other embodiments for parts that are not described or illustrated in a certain embodiment.
Those of ordinary skill in the art will appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware or combinations of computer software and electronic hardware. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the implementation. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present invention.
In the embodiments provided in the present invention, it should be understood that the disclosed apparatus/terminal device and method may be implemented in other ways. For example, the above-described embodiments of the apparatus/terminal device are merely illustrative, and for example, the division of the modules or units is only one logical division, and there may be other divisions when actually implemented, for example, a plurality of units or components may be combined or integrated into another system, or some features may be omitted, or not executed. In addition, the shown or discussed mutual coupling or direct coupling or communication connection may be an indirect coupling or communication connection through some interfaces, devices or units, and may be in an electrical, mechanical or other form.
The units described as separate parts may or may not be physically separate, and parts displayed as units may or may not be physical units, may be located in one place, or may be distributed on a plurality of network units. Some or all of the units can be selected according to actual needs to achieve the purpose of the solution of the embodiment.
In addition, functional units in the embodiments of the present invention may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated modules/units, if implemented in the form of software functional units and sold or used as independent products, may be stored in a computer-readable storage medium. Based on such understanding, all or part of the flow of the methods in the above embodiments may also be implemented by a computer program, which may be stored in a computer-readable storage medium; when the computer program is executed by a processor, the steps of the method embodiments can be implemented. The computer program comprises computer program code, which may be in the form of source code, object code, an executable file, some intermediate form, or the like. The computer-readable medium may include: any entity or device capable of carrying the computer program code, a recording medium, a USB flash disk, a removable hard disk, a magnetic disk, an optical disk, a computer memory, a Read-Only Memory (ROM), a Random Access Memory (RAM), an electrical carrier signal, a telecommunications signal, a software distribution medium, and the like. It should be noted that the content contained in the computer-readable medium may be increased or decreased as appropriate according to the requirements of legislation and patent practice in a given jurisdiction; for example, in some jurisdictions, computer-readable media do not include electrical carrier signals and telecommunications signals according to legislation and patent practice.
The above-mentioned embodiments are only used for illustrating the technical solutions of the present invention, and not for limiting the same; although the present invention has been described in detail with reference to the foregoing embodiments, it will be understood by those of ordinary skill in the art that: the technical solutions described in the foregoing embodiments may still be modified, or some technical features may be equivalently replaced; such modifications and substitutions do not substantially depart from the spirit and scope of the embodiments of the present invention, and are intended to be included within the scope of the present invention.

Claims (9)

1. A robot obstacle avoidance control method, characterized in that the robot at least comprises: a first infrared sensor, a second infrared sensor, a camera, a current sensor and an ultrasonic sensor, the first infrared sensor is installed at the front left of the robot, and the second infrared sensor is installed at the front right of the robot; the method comprises:
obtaining obstacle detection data, the obstacle detection data comprising: the data of the first infrared sensor, the data of the second infrared sensor, the data of the camera, the data of the current sensor and the data of the ultrasonic sensor;
if the first infrared sensor or the second infrared sensor does not detect a target signal, setting the first infrared sensor data or the second infrared sensor data as preset infrared sensor data, and if the current sensor does not detect a target current signal, setting the current sensor data as preset current sensor data; wherein the target signal is an infrared signal radiated by a substance other than the robot, and the target current signal is a motor current signal greater than a preset motor current threshold;
acquiring a robot obstacle avoidance decision according to the obstacle detection data;
and controlling the robot to avoid the obstacle according to the robot obstacle avoidance decision.
2. The robot obstacle avoidance control method of claim 1, wherein the obtaining a robot obstacle avoidance decision from the obstacle detection data comprises:
acquiring obstacle information fusion data according to the obstacle detection data and a pre-trained obstacle information data fusion model;
and acquiring a robot obstacle avoidance decision according to the obstacle information fusion data.
3. The robot obstacle avoidance control method according to claim 2, wherein the acquiring obstacle information fusion data according to the obstacle detection data and a pre-trained obstacle information data fusion model includes:
extracting obstacle features of the obstacle detection data;
acquiring obstacle information data according to the obstacle features, wherein the data format of the obstacle information data is a vector;
and acquiring obstacle information fusion data according to the obstacle information data and a pre-trained obstacle detection data fusion model.
4. The robot obstacle avoidance control method of claim 2, wherein before the acquiring of the robot obstacle avoidance decision according to the obstacle information fusion data, comprising:
judging whether obstacles exist according to the obstacle information fusion data;
correspondingly, the acquiring of the robot obstacle avoidance decision according to the obstacle information fusion data specifically comprises:
and if the obstacle exists, acquiring a robot obstacle avoidance decision according to the obstacle detection data.
5. The robot obstacle avoidance control method of claim 4, wherein, if an obstacle exists, acquiring a robot obstacle avoidance decision according to the obstacle information fusion data comprises:
if an obstacle exists, acquiring the distance between the obstacle and the robot according to the obstacle information fusion data;
and acquiring a robot obstacle avoidance decision according to the distance.
6. The robot obstacle avoidance control method of claim 5, wherein the obtaining a robot obstacle avoidance decision according to the distance comprises:
judging whether the distance is smaller than a preset distance threshold value or not;
if the distance is smaller than a preset distance threshold, generating a first robot obstacle avoidance decision, wherein the first robot obstacle avoidance decision comprises decision information of robot backward movement;
and if the distance is not smaller than a preset distance threshold, generating a second robot obstacle avoidance decision according to the position of the obstacle, wherein the second robot obstacle avoidance decision comprises decision information indicating that the robot takes the direction far away from the obstacle as the movement direction.
7. A robot obstacle avoidance control device, characterized in that the robot at least comprises: a first infrared sensor, a second infrared sensor, a camera, a current sensor and an ultrasonic sensor, the first infrared sensor is installed at the front left of the robot, and the second infrared sensor is installed at the front right of the robot; the device comprises:
an obstacle detection data acquisition unit configured to acquire obstacle detection data, the obstacle detection data comprising: the data of the first infrared sensor, the data of the second infrared sensor, the data of the camera, the data of the current sensor and the data of the ultrasonic sensor; and configured to, if the first infrared sensor or the second infrared sensor does not detect a target signal, set the first infrared sensor data or the second infrared sensor data as preset infrared sensor data, and, if the current sensor does not detect a target current signal, set the current sensor data as preset current sensor data; wherein the target signal is an infrared signal radiated by a substance other than the robot, and the target current signal is a motor current signal greater than a preset motor current threshold;
the robot obstacle avoidance decision acquisition unit is used for acquiring a robot obstacle avoidance decision according to the obstacle detection data;
and the robot obstacle avoidance control unit is used for controlling the robot to avoid the obstacle according to the robot obstacle avoidance decision.
8. A terminal device comprising a memory, a processor and a computer program stored in the memory and executable on the processor, characterized in that the processor implements the steps of the method according to any of claims 1 to 6 when executing the computer program.
9. A computer-readable storage medium, in which a computer program is stored which, when being executed by a processor, carries out the steps of the method according to any one of claims 1 to 6.
CN201811540871.7A 2018-12-17 2018-12-17 Robot obstacle avoidance control method and device, terminal device and storage medium Active CN109739223B (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CN201811540871.7A CN109739223B (en) 2018-12-17 2018-12-17 Robot obstacle avoidance control method and device, terminal device and storage medium
PCT/CN2019/124353 WO2020125500A1 (en) 2018-12-17 2019-12-10 Control method and apparatus for obstacle avoidance of robot, and terminal device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201811540871.7A CN109739223B (en) 2018-12-17 2018-12-17 Robot obstacle avoidance control method and device, terminal device and storage medium

Publications (2)

Publication Number Publication Date
CN109739223A CN109739223A (en) 2019-05-10
CN109739223B true CN109739223B (en) 2020-07-03

Family

ID=66359802

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201811540871.7A Active CN109739223B (en) 2018-12-17 2018-12-17 Robot obstacle avoidance control method and device, terminal device and storage medium

Country Status (2)

Country Link
CN (1) CN109739223B (en)
WO (1) WO2020125500A1 (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109739223B (en) * 2018-12-17 2020-07-03 中国科学院深圳先进技术研究院 Robot obstacle avoidance control method and device, terminal device and storage medium
CN111487964A (en) * 2020-04-03 2020-08-04 北京理工大学 Robot trolley and autonomous obstacle avoidance method and device thereof
CN111516777A (en) * 2020-04-03 2020-08-11 北京理工大学 Robot trolley and obstacle identification method thereof
CN112401752B (en) * 2020-11-04 2022-05-17 北京石头创新科技有限公司 Method, device, medium and electronic equipment for detecting unknown obstacles
CN113433935A (en) * 2021-05-31 2021-09-24 惠州市宇林源科技有限公司 Automatic robot path finding method, robot, equipment and medium
CN113885500A (en) * 2021-10-08 2022-01-04 深圳市云鼠科技开发有限公司 LDS and current-based state detection method and device
CN114211486B (en) * 2021-12-13 2024-03-22 中国科学院深圳先进技术研究院 Robot control method, robot and storage medium
CN114275134B (en) * 2021-12-28 2023-06-30 上海海事大学 Unmanned ship propeller waterproof bottom aquatic weed winding method

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN203012510U (en) * 2013-01-07 2013-06-19 西北农林科技大学 Mountainous region agricultural robot obstacle-avoiding system based on multi-sensor information fusion
CN105881555A (en) * 2016-06-17 2016-08-24 南京仁义机器人有限公司 Photovoltaic power station cleaning robot achieving direct dust removal on basis of draught fan and working method of photovoltaic power station cleaning robot
US9436185B2 (en) * 2010-12-30 2016-09-06 Irobot Corporation Coverage robot navigating
CN106182037A (en) * 2016-09-06 2016-12-07 广东工业大学 A kind of domestic monitoring robot and system thereof
CN108733044A (en) * 2017-09-29 2018-11-02 北京猎户星空科技有限公司 Barrier-avoiding method, device, robot and computer readable storage medium

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6317953B1 (en) * 1981-05-11 2001-11-20 Lmi-Diffracto Vision target based assembly
CN103019245A (en) * 2013-01-07 2013-04-03 西北农林科技大学 Obstacle avoidance system of mountain farming robot on basis of multi-sensor information fusion
CN105629970A (en) * 2014-11-03 2016-06-01 贵州亿丰升华科技机器人有限公司 Robot positioning obstacle-avoiding method based on supersonic wave
TWI558525B (en) * 2014-12-26 2016-11-21 國立交通大學 Robot and control method thereof
CN106383515A (en) * 2016-09-21 2017-02-08 哈尔滨理工大学 Wheel-type moving robot obstacle-avoiding control system based on multi-sensor information fusion
CN106647768A (en) * 2017-01-18 2017-05-10 成都黑盒子电子技术有限公司 Spontaneous movement obstacle avoidance method of service robot
CN206601623U (en) * 2017-02-28 2017-10-31 中原工学院 A kind of big barrier obstruction-avoiding control system of intelligent carriage based on Multi-sensor Fusion
CN107621641B (en) * 2017-09-20 2019-06-21 歌尔股份有限公司 Infrared barrier detection method, apparatus and robot
CN107650151A (en) * 2017-11-08 2018-02-02 徐国聪 A kind of intelligent barrier avoiding robot
CN107976999B (en) * 2017-11-21 2020-11-06 深圳市远弗科技有限公司 Mobile robot and obstacle avoidance and path planning method and system thereof
CN107997691A (en) * 2017-12-05 2018-05-08 北京奇虎科技有限公司 Stall processing method, device and clean robot
CN108784540A (en) * 2018-06-29 2018-11-13 炬大科技有限公司 A kind of sweeping robot automatic obstacle-avoiding moving device and mode of progression
CN109739223B (en) * 2018-12-17 2020-07-03 中国科学院深圳先进技术研究院 Robot obstacle avoidance control method and device, terminal device and storage medium

Also Published As

Publication number Publication date
WO2020125500A1 (en) 2020-06-25
CN109739223A (en) 2019-05-10

Similar Documents

Publication Publication Date Title
CN109739223B (en) Robot obstacle avoidance control method and device, terminal device and storage medium
CN108733342B (en) Volume adjusting method, mobile terminal and computer readable storage medium
CN108900770B (en) Method and device for controlling rotation of camera, smart watch and mobile terminal
CN108765340B (en) Blurred image processing method and device and terminal equipment
CN108596955B (en) Image detection method, image detection device and mobile terminal
CN109215037B (en) Target image segmentation method and device and terminal equipment
CN110410353B (en) Fan control method and device and terminal equipment
CN110377215B (en) Model display method and device and terminal equipment
CN110288710B (en) Three-dimensional map processing method and device and terminal equipment
US10732719B2 (en) Performing actions responsive to hovering over an input surface
CN109873980B (en) Video monitoring method and device and terminal equipment
CN111381224B (en) Laser data calibration method and device and mobile terminal
CN111597009B (en) Application program display method and device and terminal equipment
EP2771766B1 (en) Pressure-based interaction for indirect touch input devices
CN107679222B (en) Picture processing method, mobile terminal and computer readable storage medium
CN109444905B (en) Dynamic object detection method and device based on laser and terminal equipment
CN113192639A (en) Training method, device and equipment of information prediction model and storage medium
CN109492249B (en) Rapid generation method and device of design drawing and terminal equipment
CN109165648B (en) Image processing method, image processing device and mobile terminal
CN110874729B (en) Switching method and switching device for electronic red packet identification strategy and mobile terminal
CN108815840B (en) Method and device for controlling application program, mobile terminal and storage medium
CN112203131B (en) Prompting method and device based on display equipment and storage medium
CN114629800A (en) Visual generation method, device, terminal and storage medium for industrial control network target range
CN109005357B (en) Photographing method, photographing device and terminal equipment
CN115096601A (en) ADAS target adjusting method, ADAS target adjusting device, ADAS target adjusting terminal and storage medium

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant