WO2021212926A1 - Obstacle avoidance method and apparatus for self-propelled robot, robot, and storage medium - Google Patents

Obstacle avoidance method and apparatus for self-propelled robot, robot, and storage medium

Info

Publication number
WO2021212926A1
WO2021212926A1 PCT/CN2021/070912 CN2021070912W WO2021212926A1 WO 2021212926 A1 WO2021212926 A1 WO 2021212926A1 CN 2021070912 W CN2021070912 W CN 2021070912W WO 2021212926 A1 WO2021212926 A1 WO 2021212926A1
Authority
WO
WIPO (PCT)
Prior art keywords
obstacle
designated
type
robot
information
Prior art date
Application number
PCT/CN2021/070912
Other languages
English (en)
French (fr)
Inventor
吴震
谢濠键
彭松
王逸星
胡志颖
Original Assignee
北京石头世纪科技股份有限公司
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 北京石头世纪科技股份有限公司
Priority to EP21791604.8A (published as EP4140381A1)
Priority to US17/996,699 (published as US20230225576A1)
Publication of WO2021212926A1

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00: Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/009: Carrying-vehicles; Arrangements of trollies or wheels; Means for avoiding mechanical obstacles
    • A47L9/28: Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2805: Parameters or conditions being sensed
    • A47L9/2826: Parameters or conditions being sensed: the condition of the floor
    • A47L9/2836: Installation of the electric equipment characterised by the parts which are controlled
    • A47L9/2852: Elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011: Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor
    • A47L11/4061: Steering means; Means for avoiding obstacles; Details related to the place where the driver is accommodated
    • A47L2201/00: Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04: Automatic control of the travelling movement; Automatic obstacle detection
    • A47L2201/06: Control of the cleaning action for autonomous devices; Automatic detection of the surface condition before, during or after cleaning
    • G: PHYSICS
    • G05: CONTROLLING; REGULATING
    • G05D: SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00: Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02: Control of position or course in two dimensions
    • G05D1/021: Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214: Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/2246; G05D1/2247; G05D1/243; G05D1/622
    • G05D2101/20; G05D2105/10; G05D2107/40; G05D2109/10; G05D2111/10

Definitions

  • the present disclosure relates to the technical field of robot control, and in particular to an obstacle avoidance method, device, robot, and storage medium for a self-propelled robot.
  • some sweeping robots equipped with cameras have the ability to recognize the types of obstacles, and can adopt different obstacle avoidance strategies according to the types of obstacles.
  • however, because the recognition capability is uncertain, the recognition results are often wrong; the robot automatically avoids obstacles at points on the cleaning route where no avoidance is needed, so some areas are left uncleaned and the user experience is poor.
  • the embodiment of the present disclosure provides an obstacle avoidance method for a self-propelled robot, which is applied to the robot side, and includes: acquiring and recording information about obstacles encountered during traveling, wherein the information about the obstacle includes the type information and location information of the obstacle; and receiving an operation instruction for a designated obstacle, the operation instruction indicating that, when an obstacle of the same type as the designated obstacle is detected within the area marked with the designated obstacle, no obstacle avoidance operation is performed on the obstacle of the same type.
  • detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle includes: matching the detected obstacle type against the type of the designated obstacle; if the match succeeds, it is confirmed that an obstacle of the same type as the designated obstacle has been detected.
  • detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle includes: matching the detected obstacle type against the type of the designated obstacle; if the match succeeds, determining whether the location of the obstacle is within the area of the designated obstacle; and if so, confirming that an obstacle of the same type as the designated obstacle has been detected within the area marked with the designated obstacle.
  • receiving an operation instruction for a designated obstacle specifically includes: receiving an operation instruction sent by a user through a human-computer interaction interface to ignore the designated obstacle.
  • the operation instruction includes: type information and location information of the designated obstacle.
  • the method further includes: assigning an identifier to the obstacle after encountering an obstacle during the traveling process, wherein the operation instruction includes the identifier of the designated obstacle; and, after receiving the operation instruction for the designated obstacle, looking up the corresponding type information and location information according to the identifier of the designated obstacle.
  • the area range includes: an area of fixed size and shape with the coordinate position of the designated obstacle as its geometric center; or the partition where the obstacle is located, wherein the partition is an area divided automatically by the self-propelled robot or set manually by the user; or all areas that the self-propelled robot can reach.
  • the embodiment of the present disclosure provides an obstacle avoidance device for a self-propelled robot, which is applied to the robot side, and includes: an acquisition unit for acquiring and recording information about obstacles encountered during travel, wherein the information about the obstacle includes the type information and location information of the obstacle; and a receiving unit for receiving an operation instruction for a designated obstacle, the operation instruction indicating that, when an obstacle of the same type as the designated obstacle is detected within the area marked with the designated obstacle, no obstacle avoidance operation is performed on the obstacle of the same type.
  • detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle includes: matching the detected obstacle type against the type of the designated obstacle; if the match succeeds, it is confirmed that an obstacle of the same type as the designated obstacle has been detected.
  • detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle includes: matching the detected obstacle type against the type of the designated obstacle; if the match succeeds, determining whether the location of the obstacle is within the area marked with the designated obstacle; and if so, confirming that an obstacle of the same type as the designated obstacle has been detected within the area marked with the designated obstacle.
  • receiving an operation instruction for a designated obstacle specifically includes: receiving an operation instruction sent by a user through a human-computer interaction interface to ignore the designated obstacle.
  • the operation instruction includes: type information and location information of the designated obstacle.
  • the device further includes: a unit for assigning an identifier to the obstacle after encountering an obstacle during the travel, wherein the operation instruction includes the identifier of the designated obstacle; and a unit for looking up the corresponding type information and location information according to the identifier of the designated obstacle after the operation instruction for the designated obstacle is received.
  • the area range includes: an area of fixed size and shape with the coordinate position of the designated obstacle as its geometric center; or the partition where the obstacle is located, wherein the partition is an area divided automatically by the self-propelled robot or set manually by the user; or all areas that the self-propelled robot can reach.
  • An embodiment of the present disclosure provides a robot, including a processor and a memory, the memory stores computer program instructions that can be executed by the processor, and when the computer program instructions are executed by the processor, the method steps of any one of the above are implemented.
  • the embodiments of the present disclosure provide a non-transitory computer-readable storage medium that stores computer program instructions that, when called and executed by a processor, implement any of the method steps described above.
  • FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the disclosure
  • FIG. 2 is a perspective view of the structure of a self-propelled robot provided by an embodiment of the disclosure
  • FIG. 3 is a top view of the structure of a self-propelled robot provided by an embodiment of the disclosure
  • FIG. 4 is a bottom view of the structure of the self-propelled robot provided by the embodiment of the disclosure.
  • FIG. 5 is a schematic flowchart of an obstacle avoidance method for a self-propelled robot provided by an embodiment of the disclosure
  • FIG. 6 is a schematic diagram of setting up an obstacle avoidance APP for a self-propelled robot provided by an embodiment of the disclosure
  • FIG. 7 is a schematic structural diagram of an obstacle avoidance device for a self-propelled robot provided by an embodiment of the disclosure.
  • FIG. 8 is a schematic diagram of the electronic structure of a robot provided by an embodiment of the disclosure.
  • although the terms first, second, third, etc. may be used in the embodiments of the present disclosure to describe ..., these ... should not be limited by these terms. These terms are only used to distinguish ... from each other.
  • for example, without departing from the scope of the embodiments of the present disclosure, a first ... can also be referred to as a second ..., and similarly, a second ... can also be referred to as a first ....
  • the embodiments of the present disclosure provide a possible application scenario, and the application scenario includes an automatic cleaning device 100, such as a sweeping robot, a mopping robot, a vacuum cleaner, a lawnmower, and so on.
  • a household sweeping robot is taken as an example for description.
  • during cleaning, the sweeping robot may get stuck at an obstacle such as a chair 200 or a table; it can recognize the obstacle through the cloud server 300, a local server, or its own storage system, mark that position as an obstacle location, and automatically avoid the obstacle the next time it travels to that location.
  • the robot may be provided with a touch-sensitive display or controlled by a mobile terminal to receive operation instructions input by the user.
  • the automatic cleaning equipment can be equipped with various sensing devices, such as buffers, cliff sensors, ultrasonic sensors, infrared sensors, magnetometers, accelerometers, gyroscopes, and odometers (the specific structure of each sensor is not described in detail; any of the above-mentioned sensors may be applied to this automatic cleaning equipment). The robot can also be equipped with wireless communication modules, such as a WIFI module and a Bluetooth module, to connect with a smart terminal or a server, and receive, through the wireless communication module, operation instructions transmitted by the smart terminal or the server.
  • the automatic cleaning device 100 can travel on the ground through various combinations of movement relative to the following three mutually perpendicular axes defined by the main body 110: the front and rear axis X, the lateral axis Y, and the central vertical axis Z.
  • the forward driving direction along the front-rear axis X is denoted as “forward”
  • the backward driving direction along the front-rear axis X is denoted as “rearward”.
  • the direction of the lateral axis Y is essentially a direction extending between the right and left wheels of the robot along the axis defined by the center point of the driving wheel module 141.
  • the automatic cleaning device 100 can rotate around the Y axis.
  • the robot 100 can rotate around the Z axis. In the forward direction of the automatic cleaning device 100, when the automatic cleaning device 100 is tilted to the right of the X axis, it is “turn right”, and when the automatic cleaning device 100 is tilted to the left of the X axis, it is “turn left”.
  • the automatic cleaning equipment 100 includes a machine body 110, a sensing system 120, a control system, a driving system 140, a cleaning system, an energy system, and a human-computer interaction system 180.
  • the machine body 110 includes a front part 111 and a rear part 112 and has an approximately circular shape (circular at both front and rear); it can also have other shapes, including but not limited to an approximate D-shape with a squared front and rounded rear, and a rectangular or square shape at the front and rear.
  • the sensing system 120 includes a position determining device 121 located on the machine body 110, a collision sensor and a proximity sensor arranged on the buffer 122 of the forward part 111 of the machine body 110, cliff sensors arranged on the lower part of the machine body, and sensing devices such as a magnetometer, an accelerometer, a gyroscope (Gyro), and an odometer (ODO) installed inside the machine body, which are used to provide the control system 130 with various position information and motion state information of the machine.
  • the position determining device 121 includes, but is not limited to, a camera and a laser ranging device (LDS).
  • the forward part 111 of the machine body 110 can carry a buffer 122.
  • during cleaning, when the driving wheel module 141 propels the robot across the floor, the buffer 122 detects, via a sensor system provided on it such as an infrared sensor, one or more events in the travel path of the automatic cleaning device 100; the automatic cleaning device 100 can then control the driving wheel module 141 to respond to the events detected by the buffer 122, such as obstacles and walls, for example by moving away from the obstacle.
  • the control system 130 is arranged on the main circuit board in the machine body 110 and includes a computing processor, such as a central processing unit or an application processor, that communicates with non-transitory memory such as a hard disk, flash memory, or random access memory. Based on the obstacle information fed back by the laser ranging device, the application processor uses a positioning algorithm, such as simultaneous localization and mapping (SLAM), to draw a real-time map of the environment where the robot is located.
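To make the mapping idea concrete, the snippet below is a minimal sketch and not the disclosure's implementation: it assumes a simple 2D occupancy grid and a list of laser range returns (bearing, distance) taken from a known robot pose, and marks the grid cells hit by each return as obstacles. All class and function names are hypothetical.

```python
import math

class OccupancyGrid:
    """Toy 2D occupancy grid: 0 = free/unknown, 1 = obstacle."""

    def __init__(self, width_m, height_m, resolution_m):
        self.res = resolution_m
        self.cols = int(width_m / resolution_m)
        self.rows = int(height_m / resolution_m)
        self.cells = [[0] * self.cols for _ in range(self.rows)]

    def mark_obstacle(self, x_m, y_m):
        col = int(x_m / self.res)
        row = int(y_m / self.res)
        if 0 <= row < self.rows and 0 <= col < self.cols:
            self.cells[row][col] = 1

def update_map(grid, robot_x, robot_y, robot_heading, laser_returns):
    """Project each (bearing, distance) return into world coordinates
    and mark the corresponding grid cell as an obstacle."""
    for bearing, distance in laser_returns:
        angle = robot_heading + bearing
        grid.mark_obstacle(robot_x + distance * math.cos(angle),
                           robot_y + distance * math.sin(angle))

# Example: a 10 m x 10 m map at 5 cm resolution, one scan from the map centre.
grid = OccupancyGrid(10.0, 10.0, 0.05)
update_map(grid, robot_x=5.0, robot_y=5.0, robot_heading=0.0,
           laser_returns=[(0.0, 1.2), (math.pi / 2, 0.8)])
```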
  • the driving system 140 can manipulate the robot 100 to travel across the ground based on a driving command having distance and angle information (for example, x, y, and ⁇ components).
  • the driving system 140 includes a driving wheel module 141.
  • the driving wheel module 141 can control the left wheel and the right wheel at the same time.
  • the driving wheel module 141 preferably includes a left driving wheel module and a right driving wheel module, respectively.
  • the left and right driving wheel modules are opposed to each other along a transverse axis defined by the main body 110.
  • the robot may include one or more driven wheels 142, and the driven wheels include, but are not limited to, universal wheels.
  • the driving wheel module includes a walking wheel, a driving motor, and a control circuit for controlling the driving motor.
  • the driving wheel module can also be connected to a circuit for measuring the driving current and an odometer.
  • the driving wheel module 141 can be detachably connected to the main body 110 to facilitate disassembly, assembly and maintenance.
  • the driving wheel may have a biased drop suspension system, fastened in a movable manner, for example, attached in a rotatable manner, to the robot body 110, and receives a spring bias that is biased downward and away from the robot body 110.
  • the spring bias allows the driving wheel to maintain contact and traction with the ground with a certain ground force, and at the same time, the cleaning element of the automatic cleaning device 100 also contacts the ground 10 with a certain pressure.
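As an illustration of how a drive command with distance and angle information might be turned into wheel motion for the left and right driving wheel modules, the hedged sketch below converts a commanded forward speed and turn rate into left and right wheel speeds for a differential drive. The function name and the wheel-track value are assumptions for the example, not figures from the disclosure.

```python
def differential_wheel_speeds(linear_mps, angular_rps, wheel_track_m=0.23):
    """Convert a body velocity command (v, w) into left/right wheel speeds.

    linear_mps:  desired forward speed along the X axis, in m/s
    angular_rps: desired rotation rate about the Z axis, in rad/s
                 (positive = turn left, negative = turn right)
    """
    half_track = wheel_track_m / 2.0
    left = linear_mps - angular_rps * half_track
    right = linear_mps + angular_rps * half_track
    return left, right

# "Turn right" while moving forward: the right wheel runs slower than the left.
print(differential_wheel_speeds(0.3, -0.5))   # (0.3575, 0.2425)
```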
  • the cleaning system may be a dry cleaning system and/or a wet cleaning system.
  • the main cleaning function comes from the cleaning system 151 composed of roller brushes, dust boxes, fans, air outlets, and connecting parts between the four.
  • the rolling brush, which has a certain interference with the ground, sweeps the garbage on the ground and rolls it to the front of the dust suction port between the rolling brush and the dust box, where it is sucked into the dust box by the suction airflow that is generated by the fan and passes through the dust box.
  • the dry cleaning system may also include a side brush 152 with a rotating shaft, which is at an angle with respect to the ground for moving debris to the rolling brush area of the cleaning system.
  • the energy system includes rechargeable batteries, such as nickel-metal hydride batteries and lithium batteries.
  • the rechargeable battery can be connected with a charging control circuit, a battery pack charging temperature detection circuit, and a battery undervoltage monitoring circuit.
  • the charging control circuit, battery pack charging temperature detection circuit, and battery undervoltage monitoring circuit are then connected to the single-chip control circuit.
  • the main unit is connected to the charging pile through charging electrodes arranged on the side of or below the machine body for charging. If dust adheres to the exposed charging electrodes, the plastic body around the electrodes can melt and deform due to the accumulation of charge during charging, and even the electrodes themselves can deform, so that normal charging cannot continue.
  • the human-computer interaction system 180 includes buttons on the host panel for users to select functions; it may also include a display screen and/or indicator lights and/or a speaker, which show the user the current state of the machine or the function options; and it may also include a mobile phone client program. For a route-navigation type automatic cleaning device, the mobile phone client can show the user a map of the environment where the device is located and the location of the machine, which can provide users with richer and more user-friendly function items.
  • the obstacle avoidance method provided by the embodiments of the present disclosure enables the robot, while traveling, to obtain in real time information about obstacles on the travel route from its own sensors or image acquisition device and to mark the obstacle information on the operation map. When a user later judges that a certain type of obstacle does not need to be avoided, that obstacle can be ignored, and subsequent cleaning runs will no longer perform obstacle avoidance operations for that type of obstacle.
  • an embodiment of the present disclosure provides an obstacle avoidance method for a self-propelled robot, which is applied to the robot side, and includes the following method steps.
  • Step S502: Obtain and record information about obstacles encountered during travel, where the information about the obstacle includes type information and location information of the obstacle.
  • an obstacle generally refers to any object that can hinder the robot's travel.
  • as described above, the automatic cleaning equipment includes collision sensors, proximity sensors, cliff sensors, and sensing devices such as magnetometers, accelerometers, gyroscopes (Gyro), and odometers (ODO) installed inside the machine body. These sensors provide the control system 130 with various position information and motion state information of the machine at their natural frequency. The control system 130 obtains the various sensor parameters of the automatic cleaning equipment in real time during its travel, and then determines and analyzes the current position and working state of the automatic cleaning equipment.
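One simple way to picture how such readings feed a position estimate is the dead-reckoning sketch below, which advances an (x, y, heading) pose from an odometer distance increment and a gyroscope heading increment. It is an illustration under simple assumptions, not the control system's actual algorithm.

```python
import math

def update_pose(x, y, heading, odo_delta_m, gyro_delta_rad):
    """Advance the pose by one sensor sample.

    odo_delta_m:    distance travelled since the last sample (odometer)
    gyro_delta_rad: heading change since the last sample (gyroscope)
    """
    heading = (heading + gyro_delta_rad) % (2 * math.pi)
    x += odo_delta_m * math.cos(heading)
    y += odo_delta_m * math.sin(heading)
    return x, y, heading

pose = (0.0, 0.0, 0.0)
for odo, gyro in [(0.05, 0.0), (0.05, 0.1), (0.05, 0.1)]:
    pose = update_pose(*pose, odo, gyro)
print(pose)   # estimated (x, y, heading) after three samples
```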
  • the automatic cleaning equipment uses its own camera to capture images in real time during travel, where the image information can be a complete photo, or feature information, extracted from the photo, that reflects the objects in the image, for example the image information after feature points are extracted.
  • the feature information is abstract features of the picture, such as corner points, object boundary lines, color gradients, optical flow gradients, and the like, and the positional relationship between the features.
  • the control system 130 will obtain the image information parameters of the automatic cleaning equipment during the travel in real time, and then determine and analyze the current position and working status of the automatic cleaning equipment.
  • the obstacle information includes type information and location information of the obstacle.
  • the type information of the obstacle means that, based on the recognition result of the sensor or the image recognition device and according to a preset recognition classification, the obstacle is automatically recognized as a corresponding type; the types include but are not limited to power strips, bases, excrement, weight scales, thread balls, shoes, and fixed objects (such as sofas, beds, tables, chairs, etc.).
  • the system classifies obstacles according to the different characteristics of the obstacles acquired by the sensor or the image recognition device. For example, an image captured by the image recognition device is automatically categorized according to its obstacle tag, and an obstacle can be identified as a fixed object according to the collision force and collision position registered by the collision sensor.
  • the position information of the obstacle is based on the specific position of the recognized obstacle, such as the coordinate position, and the specific position information is marked in the operation map of the robot.
  • the job map is area information formed by the robot according to the work area.
  • the job map can be obtained by external input after the robot reaches the work area, or it can be automatically obtained by the robot during multiple work processes.
  • the job map usually includes partitions formed according to the architectural characteristics of the job area, such as the family's bedroom area, living room area, and kitchen area; the office area's corridor area, work area, and rest area.
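A minimal data-structure sketch of how an obstacle's type and position could be recorded against a partitioned job map is shown below. The class and field names, and the partition rectangles, are hypothetical; the disclosure only requires that type information and location information be stored and marked on the map.

```python
from dataclasses import dataclass, field

@dataclass
class ObstacleRecord:
    obstacle_id: int
    obstacle_type: str      # e.g. "power_strip", "excrement", "shoes"
    x: float                # coordinates in the job map frame
    y: float

@dataclass
class JobMap:
    # partition name -> (x_min, y_min, x_max, y_max) in map coordinates
    partitions: dict
    obstacles: list = field(default_factory=list)

    def mark_obstacle(self, record: ObstacleRecord):
        self.obstacles.append(record)

    def partition_of(self, x, y):
        for name, (x0, y0, x1, y1) in self.partitions.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                return name
        return None

job_map = JobMap(partitions={"bedroom": (0, 0, 4, 3), "living_room": (4, 0, 10, 6)})
job_map.mark_obstacle(ObstacleRecord(1, "shoes", 1.2, 2.0))
print(job_map.partition_of(1.2, 2.0))   # -> "bedroom"
```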
  • obstacle information can be acquired through an image acquisition device: image information is acquired in real time during travel, and when the image information satisfies a preset model condition, it is confirmed that an obstacle exists at the current position and a category label is assigned according to the preset model.
  • the identification specifically includes: the robot acquires image information in real time during travel; when an obstacle image is present, it compares the obstacle image with the image models trained by the robot and classifies the recognized obstacle according to the comparison result. For example, when an image of "shoes" is captured, the image is matched against the multiple categories of models stored by the robot, and when the matching proportion for "shoes" is the highest, it is classified as footwear.
  • of course, if recognition from the obstacle image is inconclusive, the obstacle can be recognized from multiple angles. For example, if the clustered object in front is recognized as a thread ball with a probability of 80%, as excrement with a probability of 70%, and as a ball with a probability of 75%, the three probabilities are too close for a confident classification; the robot can then choose to obtain image information from another angle and perform a second recognition, until the object can be recognized as a certain category with a sufficiently large probability margin, for example as a thread ball with a probability of 80% while the probabilities for all other obstacle types are below 50%, in which case the obstacle is classified as a thread ball.
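The decision rule just described can be sketched as follows: accept a category only when its match score is high and clearly separated from every other category, and otherwise ask for another image from a different angle. The thresholds (0.7 and 0.5) echo the probabilities in the example above but are illustrative assumptions, as are the function and variable names.

```python
def classify_or_retry(scores, accept=0.7, reject_others=0.5):
    """scores: dict mapping category -> match probability in [0, 1].

    Returns (category, True) when one category is accepted, or
    (None, False) when the robot should re-capture from another angle."""
    best_cat = max(scores, key=scores.get)
    best = scores[best_cat]
    others = [p for c, p in scores.items() if c != best_cat]
    if best >= accept and all(p < reject_others for p in others):
        return best_cat, True
    return None, False

# Ambiguous first view: thread ball 0.80, excrement 0.70, ball 0.75 -> retry.
print(classify_or_retry({"thread_ball": 0.80, "excrement": 0.70, "ball": 0.75}))
# Second view from another angle: thread ball clearly wins -> classified.
print(classify_or_retry({"thread_ball": 0.80, "excrement": 0.40, "ball": 0.45}))
```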
  • obstacle image information generally refers to information, judged from images, about objects that could cause the robot to become stuck, including but not limited to image information acquired by the image acquisition device of the cleaning robot, radar information acquired by a radar system, and so on.
  • the automatic cleaning equipment uses its own camera to capture images during the journey in real time, where the image information can be complete photo information, or feature information that reflects the image extracted from the photo, such as image information after feature points are extracted.
  • the feature information is abstract features of the picture, such as corner points, object boundary lines, color gradients, optical flow gradients, and the like, and the positional relationship between the features.
  • after the coordinate point of the obstacle position is marked as an obstacle, a region of a certain area and shape around the obstacle position is selected as the obstacle region, with the obstacle position lying inside the obstacle region. In a specific implementation, this region can be a circular area centered on the coordinate point marked by the obstacle, with a preset length as the radius; or a quadrilateral centered on the coordinate point marked by the obstacle, with a preset length as the side length. The radius and the side length can be fixed, or can be set to different values for different types of obstacles.
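The sketch below shows one way such a region could be built and queried: a circle of preset radius or an axis-aligned square of preset side length centered on the marked coordinate, with per-type sizes. The specific sizes and names are placeholder assumptions, not values from the disclosure.

```python
import math

# Per-type region sizes in metres (illustrative values only).
REGION_SIZE = {"shoes": 0.5, "excrement": 0.8, "default": 0.5}

def obstacle_region(x, y, obstacle_type, shape="circle"):
    size = REGION_SIZE.get(obstacle_type, REGION_SIZE["default"])
    return {"cx": x, "cy": y, "size": size, "shape": shape}

def point_in_region(px, py, region):
    dx, dy = px - region["cx"], py - region["cy"]
    if region["shape"] == "circle":
        return math.hypot(dx, dy) <= region["size"]
    # square: preset side length centred on the marked coordinate
    half = region["size"] / 2.0
    return abs(dx) <= half and abs(dy) <= half

region = obstacle_region(1.2, 2.0, "shoes")
print(point_in_region(1.4, 2.1, region))   # True: within 0.5 m of the mark
```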
  • while the robot travels, the above-mentioned image sensor can determine whether there is an obstacle at the current position. When there is an obstacle, its position coordinates are marked on the job map, and its type information is recorded in combination with the imaging device; the type information includes but is not limited to power strips, bases, excrement, weight scales, thread balls, shoes, and fixed objects (such as sofas, beds, tables, chairs, etc.).
  • Step S504: Receive an operation instruction for a designated obstacle, the operation instruction causing that, when an obstacle of the same type as the designated obstacle is detected within the area marked with the designated obstacle, no obstacle avoidance operation is performed on the obstacle of the same type.
  • the receiving an operation instruction for a designated obstacle specifically includes: receiving an operation instruction for ignoring the designated obstacle from a user through a human-computer interaction interface.
  • the human-computer interaction interface includes, but is not limited to, the touch operation of the mobile phone APP interface, the human-computer interaction interface of the robot itself, the voice interaction interface, and the like.
  • the designated obstacles include one or more types, for example, one or more designations of power strips, bases, excrement, weight scales, coils, shoes, and fixtures.
  • the operation instruction includes the type information and location information of the designated obstacle. Specifically, it may be an instruction for a certain type of obstacle at a certain location, for example an instruction for excrement in the living room; it may be an instruction for several types of obstacles at a certain location, for example an instruction for excrement and power strips in the living room; or it may be an instruction for a certain type of obstacle at several locations, for example an instruction for excrement in the living room and the bedroom.
  • after the robot encounters an obstacle during travel, it assigns an identifier to that obstacle; for example, different obstacles such as power strips, bases, excrement, weight scales, thread balls, shoes, and fixed objects are assigned different identifiers, and the identifiers can be displayed on the mobile phone operation interface. The operation instruction includes the identifier of the designated obstacle; after the operation instruction for a designated obstacle is received, the corresponding type information and location information are looked up according to the identifier of the designated obstacle.
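One possible shape for such an operation instruction, and for resolving an obstacle identifier back to its recorded type and location, is sketched below. The field names and the registry layout are assumptions made for the example only.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class IgnoreInstruction:
    """Operation instruction: ignore a designated obstacle.

    Either (obstacle_type, location) is given directly, or obstacle_id
    refers to an identifier previously assigned by the robot."""
    obstacle_type: Optional[str] = None
    location: Optional[tuple] = None
    obstacle_id: Optional[int] = None

# Identifier registry filled in while the robot travels.
registry = {7: {"type": "slippers", "location": (1.2, 2.0)}}

def resolve(instruction, registry):
    if instruction.obstacle_id is not None:
        entry = registry[instruction.obstacle_id]
        return entry["type"], entry["location"]
    return instruction.obstacle_type, instruction.location

print(resolve(IgnoreInstruction(obstacle_id=7), registry))
print(resolve(IgnoreInstruction(obstacle_type="excrement", location=(5.0, 1.5)), registry))
```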
  • the instruction content may be that when an obstacle of the same type as the designated obstacle is detected within the area marked with the designated obstacle, no obstacle avoidance operation is performed on the obstacle of the same type.
  • the area range may be a range within a certain distance of the marked designated obstacle, or the partition in which the designated obstacle is marked, for example, all markers of excrement obstacles detected in the bedroom point to the ignore instruction; or it may be all areas that the robot can reach, that is, the obstacle avoidance operation is not performed on obstacles of the same type anywhere in the space.
  • detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle specifically includes: matching the detected obstacle type against the type of the designated obstacle; if the match succeeds, it is confirmed that an obstacle of the same type as the designated obstacle has been detected.
  • the area range includes: an area of fixed size and shape with the coordinate position of the designated obstacle as its geometric center; or the partition where the obstacle is located, wherein the partition is an area divided automatically by the self-propelled robot or set manually by the user.
  • the area range includes: a circle centered on the coordinate position marked with the obstacle information, with a fixed length as its radius; or the partition in which the coordinate position marked with the obstacle information is located, wherein the partition is an area divided according to the job map.
  • in a specific implementation, the ignored range can be a circle centered on that coordinate point with a fixed length as the radius, preferably 50 cm; for a sweeping robot with partitioning capability, the ignored range can be the partition where the coordinate point is located, for example ignoring the slipper marker in the bedroom; or shoes can be ignored in the entire cleaning space.
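The three kinds of ignore scope can be expressed as a single check, as in the hedged sketch below: a circle of fixed radius around the marked point, the partition containing the marked point, or the whole cleaning space. The 0.5 m radius mirrors the 50 cm preference mentioned above; the function names and the JobMap helper are carried over from the earlier hypothetical sketch.

```python
import math

def within_ignore_scope(det_xy, mark_xy, scope, job_map=None, radius_m=0.5):
    """scope: 'radius', 'partition', or 'everywhere'."""
    if scope == "everywhere":
        return True
    if scope == "radius":
        return math.dist(det_xy, mark_xy) <= radius_m
    if scope == "partition":
        # job_map.partition_of() as sketched earlier for the JobMap class
        return job_map.partition_of(*det_xy) == job_map.partition_of(*mark_xy)
    raise ValueError(f"unknown scope: {scope}")

print(within_ignore_scope((1.3, 2.1), (1.2, 2.0), "radius"))      # True
print(within_ignore_scope((9.0, 5.0), (1.2, 2.0), "everywhere"))  # True
```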
  • when walking within an area marked with the designated obstacle (such as shoes), for example a circle with a radius of 50 cm, the walking robot matches the obstacle type detected in real time against the type of the designated obstacle, for example judging whether the currently detected obstacle type is a shoe; if the match succeeds, it is confirmed that an obstacle of the same type as the designated obstacle has been detected, and the ignore operation is performed.
  • detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle specifically includes: matching the detected obstacle type against the type of the designated obstacle; if the match succeeds, determining whether the position of the obstacle is within the area marked with the designated obstacle; and if so, confirming that an obstacle of the same type as the designated obstacle has been detected within the area marked with the designated obstacle.
  • the walking robot obtains in real time, during walking, the obstacle information within its recognition range and matches the detected obstacle type against the designated obstacle type (such as shoes). If the match succeeds, the obstacle type is shoes and belongs to the designated obstacle type; the robot then determines whether the position of the obstacle is within the area marked with the designated obstacle, and if so, it confirms that an obstacle of the same type as the designated obstacle has been detected within the area marked with the designated obstacle and performs the ignore operation.
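Both matching variants can be written compactly, as in the sketch below: the first confirms on a type match alone, the second additionally checks that the detected position falls inside the marked area. Function and parameter names are assumptions for illustration, not identifiers from the disclosure.

```python
import math

def same_type(detected_type, designated_type):
    """Variant 1: confirm as soon as the detected type matches."""
    return detected_type == designated_type

def same_type_in_area(detected_type, detected_xy, designated_type, area_contains):
    """Variant 2: the type must match AND the position must lie in the marked area.

    area_contains: callable taking (x, y) and returning True inside the area."""
    if detected_type != designated_type:
        return False
    return area_contains(*detected_xy)

# Example: a 0.5 m circle around a shoe marked at (1.2, 2.0).
inside = lambda x, y: math.hypot(x - 1.2, y - 2.0) <= 0.5
print(same_type("shoes", "shoes"))                               # True
print(same_type_in_area("shoes", (1.4, 2.1), "shoes", inside))   # True
print(same_type_in_area("shoes", (3.0, 2.0), "shoes", inside))   # False
```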
  • during cleaning, or after cleaning is finished, the corresponding positions on the cleaning map of the APP display the obstacle types recognized by the cleaning robot during the cleaning run. The obstacle types include but are not limited to power strips, bases, excrement, weight scales, thread balls, shoes, and fixed objects (such as sofas, beds, tables, chairs, etc.).
  • the user can manually operate on the above-mentioned obstacles.
  • the user can choose to ignore at least one of the obstacles, and mark the operation instruction for ignoring the obstacle in the work map.
  • after the robot detects the user's operation instruction to ignore an obstacle at a certain position, it records the type of that obstacle and its coordinates.
  • in subsequent cleaning runs, if the sweeper detects an obstacle of that type within a certain range of that coordinate, the obstacle marker is ignored, that is, the robot does not consider there to be such an obstacle at that location and does not perform an obstacle avoidance operation on it.
  • for example, the user chooses to ignore the slipper marker at a certain position in the bedroom. After that, when the cleaning robot moves to the vicinity of the marker, it continues to clean without avoiding the obstacle; or, when the cleaning robot travels into the bedroom area where the marker is located, it does not need to avoid the slipper marker at all.
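Putting the pieces together, the per-detection decision during cleaning might look like the sketch below: consult the recorded ignore instructions first, and only perform an avoidance manoeuvre when none of them applies. This is a hedged outline under the assumptions used in the earlier sketches, not the product's control code.

```python
import math

def should_avoid(detected_type, detected_xy, ignore_marks):
    """ignore_marks: list of dicts with 'type', 'xy', and an 'in_scope' callable,
    one entry per obstacle the user chose to ignore."""
    for mark in ignore_marks:
        if detected_type == mark["type"] and mark["in_scope"](*detected_xy):
            return False      # ignored: keep cleaning straight through
    return True               # no matching ignore mark: avoid as usual

ignore_marks = [{
    "type": "slippers",
    "xy": (1.2, 2.0),
    "in_scope": lambda x, y: math.hypot(x - 1.2, y - 2.0) <= 0.5,
}]
print(should_avoid("slippers", (1.3, 2.1), ignore_marks))   # False: cleaned over
print(should_avoid("excrement", (1.3, 2.1), ignore_marks))  # True: avoided
```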
  • the robot obstacle avoidance method provided by the embodiments of the present disclosure enables the robot, during travel, to automatically obtain obstacle information from its own sensors or image acquisition device and to mark the obstacle type and location. After manual confirmation, when this type of obstacle information is encountered again, the robot avoids what should be avoided and cleans what should be cleaned, which avoids the missed cleaning of clean areas caused by blind obstacle avoidance, improves the accuracy of cleaning, and enhances the user experience.
  • an embodiment of the present disclosure provides a robot obstacle avoidance device, which is applied to the robot side, and includes:
  • the acquiring unit 702 is configured to acquire and record information of obstacles encountered during travel; wherein the information of the obstacles includes type information and location information of the obstacles.
  • the receiving unit 704 is configured to receive an operation instruction for a designated obstacle, the operation instruction causing that, when an obstacle of the same type as the designated obstacle is detected within the area marked with the designated obstacle, no obstacle avoidance operation is performed on the obstacle of the same type.
  • detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle includes: matching the detected obstacle type against the type of the designated obstacle; if the match succeeds, it is confirmed that an obstacle of the same type as the designated obstacle has been detected.
  • detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle includes: matching the detected obstacle type against the type of the designated obstacle; if the match succeeds, determining whether the location of the obstacle is within the area marked with the designated obstacle; and if so, confirming that an obstacle of the same type as the designated obstacle has been detected within the area marked with the designated obstacle.
  • the receiving an operation instruction for a designated obstacle specifically includes: receiving an operation instruction for ignoring the designated obstacle from a user through a human-computer interaction interface.
  • the operation instruction includes: type information and location information of the designated obstacle.
  • the device further includes: a unit for assigning an identifier to the obstacle after encountering an obstacle during the travel, wherein the operation instruction includes the identifier of the designated obstacle; and a unit for looking up the corresponding type information and location information according to the identifier of the designated obstacle after the operation instruction for the designated obstacle is received.
  • the area range includes: an area of fixed size and shape with the coordinate position of the designated obstacle as its geometric center; or the partition where the obstacle is located, wherein the partition is an area divided automatically by the self-propelled robot or set manually by the user; or all areas that the self-propelled robot can reach.
  • the robot obstacle avoidance device provided by the embodiments of the present disclosure enables the robot, during travel, to automatically obtain obstacle information from its own sensors or image acquisition device and to mark the obstacle type and location. After manual confirmation, when this type of obstacle information is encountered again, the robot avoids what should be avoided and cleans what should be cleaned, which avoids the missed cleaning of clean areas caused by blind obstacle avoidance, improves the accuracy of cleaning, and enhances the user experience.
  • the embodiments of the present disclosure provide a non-transitory computer-readable storage medium that stores computer program instructions that, when called and executed by a processor, implement any of the method steps described above.
  • An embodiment of the present disclosure provides a robot, including a processor and a memory, the memory stores computer program instructions that can be executed by the processor, and when the processor executes the computer program instructions, the method steps of any one of the foregoing embodiments are implemented.
  • the robot may include a processing device (such as a central processing unit, a graphics processor, etc.) 801, which can execute various appropriate actions and processing according to a program stored in a read-only memory (ROM) 802 or a program loaded from a storage device 808 into a random access memory (RAM) 803.
  • in the RAM 803, various programs and data required for the operation of the electronic robot 800 are also stored.
  • the processing device 801, the ROM 802, and the RAM 803 are connected to each other through a bus 804.
  • An input/output (I/O) interface 805 is also connected to the bus 804.
  • the following devices can be connected to the I/O interface 805: an input device 806 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, and a gyroscope; an output device 807 including, for example, a liquid crystal display (LCD), a speaker, and a vibrator; a storage device 808 such as a hard disk; and a communication device 809.
  • the communication device 809 may allow the electronic robot to perform wireless or wired communication with other robots to exchange data.
  • although FIG. 8 shows an electronic robot with various devices, it should be understood that it is not required to implement or include all of the illustrated devices; more or fewer devices may alternatively be implemented or provided.
  • the process described above with reference to the flowchart can be implemented as a robot software program.
  • the embodiments of the present disclosure include a robot software program product, which includes a computer program carried on a readable medium, and the computer program contains program code for executing the method shown in the flowchart.
  • the computer program may be downloaded and installed from the network through the communication device 809, or installed from the storage device 808, or installed from the ROM 802.
  • when the computer program is executed by the processing device 801, the above-mentioned functions defined in the method of the embodiment of the present disclosure are executed.
  • the above-mentioned computer-readable medium in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two.
  • the computer-readable storage medium may be, for example, but not limited to, an electrical, magnetic, optical, electromagnetic, infrared, or semiconductor system, device, or device, or a combination of any of the above. More specific examples of computer-readable storage media may include, but are not limited to: electrical connections with one or more wires, portable computer disks, hard disks, random access memory (RAM), read-only memory (ROM), erasable Programmable read only memory (EPROM or flash memory), optical fiber, portable compact disk read only memory (CD-ROM), optical storage device, magnetic storage device, or any suitable combination of the above.
  • a computer-readable storage medium may be any tangible medium that contains or stores a program, and the program may be used by or in combination with an instruction execution system, apparatus, or device.
  • a computer-readable signal medium may include a data signal propagated in a baseband or as a part of a carrier wave, and a computer-readable program code is carried therein. This propagated data signal can take many forms, including but not limited to electromagnetic signals, optical signals, or any suitable combination of the foregoing.
  • the computer-readable signal medium may also be any computer-readable medium other than the computer-readable storage medium.
  • the computer-readable signal medium may send, propagate or transmit the program for use by or in combination with the instruction execution system, apparatus, or device .
  • the program code contained on the computer-readable medium can be transmitted by any suitable medium, including but not limited to: wire, optical cable, RF (Radio Frequency), etc., or any suitable combination of the above.
  • the above-mentioned computer-readable medium may be included in the above-mentioned robot; or it may exist alone without being assembled into the robot.
  • the computer program code used to perform the operations of the present disclosure may be written in one or more programming languages or a combination thereof.
  • the above-mentioned programming languages include object-oriented programming languages, such as Java, Smalltalk, and C++, as well as conventional procedural programming languages, such as the "C" language or similar programming languages.
  • the program code can be executed entirely on the user's computer, partly on the user's computer, executed as an independent software package, partly on the user's computer and partly executed on a remote computer, or entirely executed on the remote computer or server.
  • the remote computer can be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it can be connected to an external computer (for example, through the Internet using an Internet service provider).
  • each block in the flowchart or block diagram can represent a module, program segment, or part of code, and the module, program segment, or part of code contains one or more executable instructions for realizing the specified logic function.
  • it should also be noted that, in some alternative implementations, the functions noted in the blocks can occur in a different order from the order marked in the drawings. For example, two blocks shown one after another can actually be executed substantially in parallel, and they can sometimes be executed in the reverse order, depending on the functions involved.
  • each block in the block diagram and/or flowchart, and the combination of blocks in the block diagram and/or flowchart, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
  • the device embodiments described above are merely illustrative.
  • the units described as separate components may or may not be physically separated, and the components displayed as units may or may not be physical units, that is, they may be located in one place or distributed across multiple network units.
  • Some or all of the modules can be selected according to actual needs to achieve the objectives of the solutions of the embodiments. Those of ordinary skill in the art can understand and implement it without creative work.

Abstract

An obstacle avoidance method and apparatus for a self-propelled robot, a robot, and a storage medium. The obstacle avoidance method includes: acquiring and recording information about obstacles encountered while traveling, wherein the obstacle information includes the type information and position information of the obstacle (S502); and receiving an operation instruction for a designated obstacle, the operation instruction indicating that, when an obstacle of the same type as the designated obstacle is detected within the area marked with the designated obstacle, no obstacle avoidance operation is performed on the obstacle of the same type (S504).

Description

Obstacle avoidance method and apparatus for self-propelled robot, robot, and storage medium
Cross-reference to related applications
This application claims priority to Chinese Patent Application No. 202010311652.2, filed on April 20, 2020, which is incorporated herein by reference in its entirety.
Technical field
The present disclosure relates to the technical field of robot control, and in particular to an obstacle avoidance method and apparatus for a self-propelled robot, a robot, and a storage medium.
Background
With the development of artificial intelligence technology, a wide variety of intelligent robots have appeared, such as sweeping robots, mopping robots, vacuum cleaners, and lawn mowers. These cleaning robots can automatically recognize the cleaning route and can also recognize and record obstacle information during cleaning, which improves route optimization for subsequent cleaning runs; this not only frees up labor and saves labor costs, but also improves cleaning efficiency.
In particular, some sweeping robots equipped with cameras have the ability to recognize obstacle types and can adopt different obstacle avoidance strategies for different kinds of obstacles. However, because the recognition capability is uncertain, the recognition results often go wrong: the robot automatically avoids obstacles at places on the cleaning route where no avoidance is needed, so some areas are missed and the user experience is poor.
Summary
Embodiments of the present disclosure provide an obstacle avoidance method for a self-propelled robot, applied on the robot side, including: acquiring and recording information about obstacles encountered while traveling, wherein the obstacle information includes the type information and position information of the obstacle; and receiving an operation instruction for a designated obstacle, the operation instruction indicating that, when an obstacle of the same type as the designated obstacle is detected within the area marked with the designated obstacle, no obstacle avoidance operation is performed on the obstacle of the same type.
Optionally, detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle specifically includes: matching the detected obstacle type against the type of the designated obstacle; and, if the match succeeds, confirming that an obstacle of the same type as the designated obstacle has been detected.
Optionally, detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle specifically includes: matching the detected obstacle type against the type of the designated obstacle; if the match succeeds, determining whether the position of the obstacle is within the area of the designated obstacle; and, if so, confirming that an obstacle of the same type as the designated obstacle has been detected within the area marked with the designated obstacle.
Optionally, receiving an operation instruction for a designated obstacle specifically includes: receiving an operation instruction, sent by a user through a human-computer interaction interface, to ignore the designated obstacle.
Optionally, the operation instruction includes the type information and position information of the designated obstacle.
Optionally, the method further includes: after an obstacle is encountered while traveling, assigning an identifier to the obstacle, wherein the operation instruction includes the identifier of the designated obstacle; and, after the operation instruction for a designated obstacle is received, looking up the corresponding type information and position information according to the identifier of the designated obstacle.
Optionally, the area range includes: an area of fixed size and shape with the coordinate position of the designated obstacle as its geometric center; or the partition where the obstacle is located, wherein the partition is an area divided automatically by the self-propelled robot or set manually by the user; or all areas that the self-propelled robot can reach.
Embodiments of the present disclosure provide an obstacle avoidance apparatus for a self-propelled robot, applied on the robot side and including: an acquisition unit configured to acquire and record information about obstacles encountered while traveling, wherein the obstacle information includes the type information and position information of the obstacle; and a receiving unit configured to receive an operation instruction for a designated obstacle, the operation instruction indicating that, when an obstacle of the same type as the designated obstacle is detected within the area marked with the designated obstacle, no obstacle avoidance operation is performed on the obstacle of the same type.
Optionally, detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle specifically includes: matching the detected obstacle type against the type of the designated obstacle; and, if the match succeeds, confirming that an obstacle of the same type as the designated obstacle has been detected.
Optionally, detecting an obstacle of the same type as the designated obstacle within the area marked with the designated obstacle specifically includes: matching the detected obstacle type against the type of the designated obstacle; if the match succeeds, determining whether the position of the obstacle is within the area of the designated obstacle; and, if so, confirming that an obstacle of the same type as the designated obstacle has been detected within the area marked with the designated obstacle.
Optionally, receiving an operation instruction for a designated obstacle specifically includes: receiving an operation instruction, sent by a user through a human-computer interaction interface, to ignore the designated obstacle.
Optionally, the operation instruction includes the type information and position information of the designated obstacle.
Optionally, the apparatus further includes: a unit configured to assign an identifier to an obstacle after the obstacle is encountered while traveling, wherein the operation instruction includes the identifier of the designated obstacle; and a unit configured to look up the corresponding type information and position information according to the identifier of the designated obstacle after the operation instruction for a designated obstacle is received.
Optionally, the area range includes: an area of fixed size and shape with the coordinate position of the designated obstacle as its geometric center; or the partition where the obstacle is located, wherein the partition is an area divided automatically by the self-propelled robot or set manually by the user; or all areas that the self-propelled robot can reach.
Embodiments of the present disclosure provide a robot, including a processor and a memory, the memory storing computer program instructions executable by the processor, wherein the computer program instructions, when executed by the processor, implement the method steps of any one of the above.
Embodiments of the present disclosure provide a non-transitory computer-readable storage medium storing computer program instructions which, when called and executed by a processor, implement the method steps of any one of the above.
Brief description of the drawings
To describe the technical solutions in the embodiments of the present disclosure or in the prior art more clearly, the drawings required for the description of the embodiments or the prior art are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present disclosure, and a person of ordinary skill in the art can obtain other drawings from them without creative effort.
FIG. 1 is a schematic diagram of an application scenario provided by an embodiment of the present disclosure;
FIG. 2 is a perspective view of the structure of a self-propelled robot provided by an embodiment of the present disclosure;
FIG. 3 is a top view of the structure of a self-propelled robot provided by an embodiment of the present disclosure;
FIG. 4 is a bottom view of the structure of a self-propelled robot provided by an embodiment of the present disclosure;
FIG. 5 is a schematic flowchart of an obstacle avoidance method for a self-propelled robot provided by an embodiment of the present disclosure;
FIG. 6 is a schematic diagram of the settings of an obstacle avoidance APP for a self-propelled robot provided by an embodiment of the present disclosure;
FIG. 7 is a schematic structural diagram of an obstacle avoidance apparatus for a self-propelled robot provided by an embodiment of the present disclosure;
FIG. 8 is a schematic diagram of the electronic structure of a robot provided by an embodiment of the present disclosure.
Detailed description
To make the objectives, technical solutions, and advantages of the embodiments of the present disclosure clearer, the technical solutions in the embodiments of the present disclosure are described clearly and completely below with reference to the accompanying drawings. Obviously, the described embodiments are only some of the embodiments of the present disclosure, not all of them. All other embodiments obtained by a person of ordinary skill in the art based on the embodiments of the present disclosure without creative effort fall within the protection scope of the present disclosure.
It should be understood that although the terms first, second, third, etc. may be used in the embodiments of the present disclosure to describe ..., these ... should not be limited by these terms; the terms are only used to distinguish ... from one another. For example, without departing from the scope of the embodiments of the present disclosure, a first ... may also be referred to as a second ..., and similarly, a second ... may also be referred to as a first ....
Embodiments of the present disclosure provide a possible application scenario that includes an automatic cleaning device 100, such as a sweeping robot, a mopping robot, a vacuum cleaner, or a lawn mower. In this embodiment, as shown in FIG. 1, a household sweeping robot is taken as an example. While working, the sweeping robot cleans along a preset or automatically planned route, but inevitably gets stuck at certain places and cannot proceed, for example at a chair 200 or a table. In that case the sweeping robot can recognize the obstacle through a cloud server 300, a local server, or its own storage system, mark the position as an obstacle location, and automatically avoid the obstacle the next time it travels to that location. In this embodiment, the robot may be provided with a touch-sensitive display, or be controlled by a mobile terminal, to receive operation instructions input by the user. The automatic cleaning device may be provided with various sensors, such as a buffer, cliff sensors, ultrasonic sensors, infrared sensors, a magnetometer, an accelerometer, a gyroscope, and an odometer (the specific structure of each sensor is not described in detail; any of the above sensors may be applied to this automatic cleaning device). The robot may also be provided with wireless communication modules such as a WIFI module and a Bluetooth module to connect with a smart terminal or a server, and receive, through the wireless communication module, operation instructions transmitted by the smart terminal or the server.
As shown in FIG. 2, the automatic cleaning device 100 can travel over the ground through various combinations of movement relative to the following three mutually perpendicular axes defined by the main body 110: the front-rear axis X, the lateral axis Y, and the central vertical axis Z. The forward driving direction along the front-rear axis X is denoted "forward", and the backward driving direction along the front-rear axis X is denoted "rearward". The direction of the lateral axis Y is essentially the direction extending between the right and left wheels of the robot along the axis defined by the center points of the driving wheel modules 141.
The automatic cleaning device 100 can rotate about the Y axis: it "pitches up" when the forward part of the automatic cleaning device 100 tilts upward and the rearward part tilts downward, and "pitches down" when the forward part tilts downward and the rearward part tilts upward. In addition, the robot 100 can rotate about the Z axis: in the forward direction of the automatic cleaning device 100, tilting to the right of the X axis is "turning right", and tilting to the left of the X axis is "turning left".
As shown in FIG. 3, the automatic cleaning device 100 includes a machine body 110, a sensing system 120, a control system, a driving system 140, a cleaning system, an energy system, and a human-computer interaction system 180.
The machine body 110 includes a forward part 111 and a rearward part 112 and has an approximately circular shape (circular at both front and rear); it may also have other shapes, including but not limited to an approximate D-shape with a squared front and rounded rear, and a rectangular or square shape at the front and rear.
As shown in FIG. 3, the sensing system 120 includes a position determining device 121 located on the machine body 110, a collision sensor and a proximity sensor arranged on the buffer 122 of the forward part 111 of the machine body 110, cliff sensors arranged on the lower part of the machine body, and sensing devices such as a magnetometer, an accelerometer, a gyroscope (Gyro), and an odometer (ODO) arranged inside the machine body, which provide the control system 130 with various position information and motion state information of the machine. The position determining device 121 includes, but is not limited to, a camera and a laser ranging device (LDS).
As shown in FIG. 3, the forward part 111 of the machine body 110 may carry a buffer 122. When the driving wheel modules 141 propel the robot across the floor during cleaning, the buffer 122 detects, via a sensor system provided on it such as an infrared sensor, one or more events in the travel path of the automatic cleaning device 100; the automatic cleaning device 100 can then control the driving wheel modules 141 to respond to the events detected by the buffer 122, such as obstacles and walls, for example by moving away from the obstacle.
The control system 130 is arranged on the main circuit board inside the machine body 110 and includes a computing processor, such as a central processing unit or an application processor, that communicates with non-transitory memory such as a hard disk, flash memory, or random access memory. Based on the obstacle information fed back by the laser ranging device, the application processor uses a positioning algorithm, such as simultaneous localization and mapping (SLAM), to draw a real-time map of the environment in which the robot is located. In combination with the distance and speed information fed back by the sensors arranged on the buffer 122, the cliff sensors, the magnetometer, the accelerometer, the gyroscope, the odometer, and other sensing devices, it comprehensively determines the sweeper's current working state, position, and pose, such as crossing a threshold, climbing onto a carpet, being at a cliff, being stuck above or below, having a full dust box, or being picked up, and gives specific next-step action strategies for the different situations, so that the robot's work better meets the owner's requirements and provides a better user experience.
As shown in FIG. 4, the driving system 140 can maneuver the robot 100 across the ground based on a driving command having distance and angle information (for example x, y, and θ components). The driving system 140 includes a driving wheel module 141, which can control the left wheel and the right wheel at the same time; to control the machine's movement more precisely, the driving wheel module 141 preferably includes a left driving wheel module and a right driving wheel module. The left and right driving wheel modules are opposed to each other along a lateral axis defined by the main body 110. To enable the robot to move more stably on the ground or with stronger mobility, the robot may include one or more driven wheels 142, which include but are not limited to universal wheels. Each driving wheel module includes a travel wheel, a driving motor, and a control circuit for controlling the driving motor; the driving wheel module may also be connected with a circuit for measuring the driving current and with an odometer. The driving wheel module 141 may be detachably connected to the main body 110 for easy disassembly, assembly, and maintenance. The driving wheel may have a biased drop-type suspension system, fastened in a movable manner, for example rotatably attached, to the robot body 110, and receive a spring bias biased downward and away from the robot body 110. The spring bias allows the driving wheel to maintain contact and traction with the ground with a certain grounding force, while the cleaning elements of the automatic cleaning device 100 also contact the ground 10 with a certain pressure.
The cleaning system may be a dry cleaning system and/or a wet cleaning system. For a dry cleaning system, the main cleaning function comes from the sweeping system 151 formed by the rolling brush, the dust box, the fan, the air outlet, and the connecting parts between them. The rolling brush, which has a certain interference with the ground, sweeps up the garbage on the ground and carries it to the front of the dust suction port between the rolling brush and the dust box, where it is sucked into the dust box by the suction airflow that is generated by the fan and passes through the dust box. The dry cleaning system may also include a side brush 152 with a rotating shaft that is at an angle to the ground, for moving debris into the rolling brush area of the cleaning system.
The energy system includes a rechargeable battery, such as a nickel-metal hydride battery or a lithium battery. The rechargeable battery may be connected with a charging control circuit, a battery pack charging temperature detection circuit, and a battery undervoltage monitoring circuit, which are in turn connected to a single-chip microcomputer control circuit. The main unit is charged by connecting to a charging pile through charging electrodes arranged on the side of or below the body. If dust adheres to the exposed charging electrodes, the accumulation of charge during charging can cause the plastic body around the electrodes to melt and deform, and even deform the electrodes themselves, so that normal charging can no longer continue.
The human-computer interaction system 180 includes buttons on the host panel for the user to select functions; it may also include a display screen and/or indicator lights and/or a speaker that show the user the current state of the machine or the function options; and it may also include a mobile phone client program. For a route-navigation type automatic cleaning device, the mobile phone client can show the user a map of the environment in which the device is located and the location of the machine, and can provide the user with richer and more user-friendly function items.
The robot obstacle avoidance method provided by the embodiments of the present disclosure enables the robot, while traveling, to obtain in real time information about obstacles on the travel route from its own sensors or image acquisition device and to mark the obstacle information on the job map. When a user later judges that a certain type of obstacle does not need to be avoided, that obstacle can be ignored, and subsequent cleaning runs will no longer perform obstacle avoidance operations for that type of obstacle.
As one implementation, as shown in FIG. 5, an embodiment of the present disclosure provides an obstacle avoidance method for a self-propelled robot, applied on the robot side and including the following method steps.
Step S502: Acquire and record information about obstacles encountered while traveling, wherein the obstacle information includes the type information and position information of the obstacle.
An obstacle here refers broadly to any object that can hinder the robot's travel. As described above, the automatic cleaning device includes collision sensors, proximity sensors, cliff sensors, and sensing devices such as a magnetometer, an accelerometer, a gyroscope (Gyro), and an odometer (ODO) arranged inside the machine body; these sensors provide the control system 130 with various position information and motion state information of the machine at their natural frequency. The control system 130 obtains the various sensor parameters of the automatic cleaning device in real time during travel, and then determines and analyzes the current position and working state of the automatic cleaning device.
The automatic cleaning device uses its own camera to capture images in real time during travel, where the image information may be a complete photo, or feature information, extracted from the photo, that reflects the objects in the image, for example the image information after feature points have been extracted. Specifically, the feature information consists of abstract image features such as corner points, object boundary lines, color gradients, and optical flow gradients, together with the positional relationships between these features. The control system 130 obtains the image information parameters of the automatic cleaning device in real time during travel, and then determines and analyzes the current position and working state of the automatic cleaning device.
The obstacle information includes the type information and position information of the obstacle. The type information of the obstacle means that, based on the recognition result of the sensors or the image recognition device and according to a preset recognition classification, the obstacle is automatically recognized as a corresponding type; the types include but are not limited to power strips, bases, excrement, weight scales, thread balls, shoes, and fixed objects (such as sofas, beds, tables, chairs, etc.). The system classifies obstacles according to the different characteristics of the obstacles acquired by the sensors or the image recognition device: for example, an image captured by the image recognition device is automatically categorized according to its obstacle tag, and an obstacle can be identified as a fixed object according to the collision force and collision position registered by the collision sensor.
The position information of the obstacle is the specific position of the recognized obstacle, such as its coordinate position, and this specific position information is marked on the robot's job map. The job map is area information formed by the robot according to the work area; it may be obtained through external input after the robot reaches the work area, or obtained automatically by the robot over multiple work sessions. The job map usually includes partitions formed according to the architectural characteristics of the work area, such as the bedroom area, living room area, and kitchen area of a home, or the corridor area, work area, and rest area of an office.
作为示例之一,可以通过图像获取装置获取障碍物信息,行进过程中的实时获取图像信息,当所述图像信息满足预设模型条件时,确认当前位置存在障碍物,并根据预设模型进行类别标识,具体包括:所述机器人在行进过程中,实时获取行进过程中的图像信息,当存在障碍物图像时,将所述障碍物图像与机器人训练过的图像模型进行比较,根据比较结果将所述识别到的障碍物归类,例如,拍摄到“鞋子”的图像时,将该图像与机器人存储的多个种类的模型进行匹配,当匹配到“鞋子”的比例更高时,将其归为鞋类。当然,如果通过障碍物图像识别不明确时,可以从多个角度对障碍物进行图像识别,例如当识别前面的团装物可能为线团的概率为80%,可能为排泄物的概率为70%,可能为球的概率为75%,三种概率比较接近难以归类时,机器人可以选择从另一个角度再次获取图像信息,进行二次识别,直到能够以 较大的概率差将其识别为某一类为止,例如识别为线团的概率为80%,识别为其他障碍物的概率都在50%以下,则将障碍物归类为线团类。
Obstacle image information broadly refers to information, judged from images, about anything that could cause the robot to become stuck or trapped, including but not limited to the image information acquired by the cleaning robot's image acquisition device and the radar information acquired by a radar system. The automatic cleaning device captures images in real time during travel through its own camera. The image information may be complete photograph information, or it may be feature information extracted from the photograph that reflects the image, for example image information after feature point extraction. Specifically, the feature information consists of abstract image features, such as corner points, object boundary lines, color gradients, and optical flow gradients, together with the positional relationships between these features.
After the coordinate point of the obstacle position is marked as an obstacle, a region of a certain area and shape around the obstacle position is selected as the obstacle region, where the obstacle position lies within the obstacle region. In a specific implementation, the region may be a circular region centered on the coordinate point marked as an obstacle with a preset length as its radius, or a quadrilateral centered on the coordinate point marked as an obstacle with a preset length as its side length. The radius and the side length may be fixed, or they may be set to different values according to the type of obstacle.
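As an illustration of the two region shapes mentioned above, the helpers below test whether a detected point falls inside a circular or a square obstacle region centered on the marked coordinate; the default sizes are assumptions (the 0.5 m radius echoes the 50 cm example given later in the text).

```python
import math

def in_circular_region(px: float, py: float,
                       cx: float, cy: float, radius: float = 0.5) -> bool:
    """True if point (px, py) lies within `radius` of the marked coordinate
    (cx, cy). The 0.5 m default is configurable and merely illustrative."""
    return math.hypot(px - cx, py - cy) <= radius

def in_square_region(px: float, py: float,
                     cx: float, cy: float, side: float = 1.0) -> bool:
    """True if (px, py) lies inside an axis-aligned square of the given side
    length centered on the marked coordinate; the side length is an assumption."""
    half = side / 2.0
    return abs(px - cx) <= half and abs(py - cy) <= half
```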
While traveling, the robot can judge from the above image sensors whether an obstacle exists at the current position. When an obstacle exists, its position coordinates are marked in the working map, and the type information of the obstacle is recorded in combination with the image device. The type information includes but is not limited to power strips, charging bases, excrement, weight scales, wire tangles, shoes, and fixed objects (such as sofas, beds, tables, and chairs).
Step S504: Receive an operation instruction for a designated obstacle, where the operation instruction causes that, when an obstacle of the same type as the designated obstacle is detected within the region marked with the designated obstacle, no obstacle avoidance operation is performed for the obstacle of the same type.
As an example, receiving an operation instruction for a designated obstacle specifically includes: receiving an operation instruction, sent by the user through a human-machine interaction interface, to ignore the designated obstacle. The human-machine interaction interface includes but is not limited to touch operation on a mobile phone APP interface, the robot's own human-machine interaction interface, and a voice interaction interface. The designated obstacle may be of one or more types; for example, one or more of power strips, charging bases, excrement, weight scales, wire tangles, shoes, and fixed objects may be designated.
As an example, the operation instruction contains the type information and position information of the designated obstacle. Specifically, it may be an instruction for a certain type of obstacle at a certain location, for example an instruction for excrement in the living room; it may be an instruction for several types of obstacles at a certain location, for example an instruction for excrement and power strips in the living room; or it may be an instruction for a certain type of obstacle at several locations, for example an instruction for excrement in the living room and the bedroom.
As an example, after the robot encounters an obstacle during travel, an identifier is assigned to the obstacle; for example, different identifiers are assigned to different obstacles such as power strips, charging bases, excrement, weight scales, wire tangles, shoes, and fixed objects, and the identifiers can be displayed on the mobile phone operation interface. The operation instruction contains the identifier of the designated obstacle; after the operation instruction for the designated obstacle is received, the corresponding type information and position information are looked up according to the identifier of the designated obstacle.
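A minimal sketch of the identifier mechanism described above: obstacles encountered during travel are assigned identifiers, and an instruction carrying an identifier is resolved back to the obstacle's type and position. The identifier scheme and the storage structure are assumptions.

```python
from dataclasses import dataclass

@dataclass
class ObstacleRecord:
    obstacle_type: str
    x: float
    y: float

class ObstacleRegistry:
    """Assigns identifiers to encountered obstacles and resolves them later.
    A minimal sketch; not the actual on-robot data structure."""

    def __init__(self) -> None:
        self._next_id = 1
        self._records: dict[int, ObstacleRecord] = {}

    def assign(self, obstacle_type: str, x: float, y: float) -> int:
        obstacle_id = self._next_id
        self._next_id += 1
        self._records[obstacle_id] = ObstacleRecord(obstacle_type, x, y)
        return obstacle_id          # shown to the user, e.g. in the phone APP

    def lookup(self, obstacle_id: int) -> ObstacleRecord:
        """Resolve the identifier in an instruction to its type and position."""
        return self._records[obstacle_id]
```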
The instruction content may be: when an obstacle of the same type as the designated obstacle is detected within the region marked with the designated obstacle, no obstacle avoidance operation is performed for the obstacle of the same type. The region may be a range within a certain distance of the mark of the designated obstacle; it may be the partition where the designated obstacle is marked, for example all excrement obstacle marks detected in the bedroom point to the ignore instruction; or it may be all areas the robot can reach, that is, no obstacle avoidance operation is performed for obstacles of the same type anywhere in the space.
As one embodiment, detecting an obstacle of the same type as the designated obstacle within the region marked with the designated obstacle specifically includes: matching the type of the detected obstacle against the type of the designated obstacle; if the match succeeds, confirming that an obstacle of the same type as the designated obstacle has been detected.
The region includes: a region of fixed size and shape with the coordinate position of the designated obstacle as its geometric center; or the partition where the obstacle is located, where the partition is a region divided automatically by the self-walking robot or divided manually by the user.
As an example, the region includes: a circle with the coordinate position marked with the obstacle information as its center and a fixed length as its radius; or the partition containing the coordinate position marked with the obstacle information, where the partition is a region divided according to the working map. In a specific implementation, the ignored range may be a circle centered on that coordinate point with a fixed radius, preferably 50 cm; for a sweeping robot with partitioning capability, the ignored range may be the partition containing that coordinate point, for example ignoring slipper marks in the bedroom; or shoes may be ignored throughout the entire cleaning space.
When the walking robot travels within the region marked with the designated obstacle (for example, shoes), such as a circle with a 50 cm radius, it matches the type of the obstacle detected in real time against the type of the designated obstacle, for example judging whether the currently detected obstacle type is a shoe. If the match succeeds, it confirms that an obstacle of the same type as the designated obstacle has been detected and performs the ignore operation.
As another embodiment, detecting an obstacle of the same type as the designated obstacle within the region marked with the designated obstacle specifically includes: matching the type of the detected obstacle against the type of the designated obstacle; if the match succeeds, judging whether the position of the obstacle lies within the region marked with the designated obstacle; if so, confirming that an obstacle of the same type as the designated obstacle has been detected within the region marked with the designated obstacle.
While walking, the walking robot acquires, in real time, information about obstacles within its recognition range and matches the detected obstacle type against the type of the designated obstacle (for example, shoes). If the match succeeds, the obstacle type is a shoe and belongs to the designated obstacle type, and the robot then judges whether the position of the obstacle lies within the region marked with the designated obstacle. If so, it confirms that an obstacle of the same type as the designated obstacle has been detected within the region marked with the designated obstacle and performs the ignore operation.
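Both matching variants above reduce to the same decision: match the obstacle type first, then check the region where required. The sketch below illustrates that decision for the three kinds of region mentioned in the text; the rule format and the partition_lookup helper are assumptions rather than the actual implementation.

```python
import math
from typing import Callable, Optional

def should_ignore(detected_type: str, x: float, y: float, rule: dict,
                  partition_lookup: Optional[Callable[[float, float], str]] = None) -> bool:
    """Decide whether obstacle avoidance should be skipped for a detection.

    `rule` is a hypothetical ignore record, e.g.
      {"type": "shoe", "scope": "circle", "x": 1.2, "y": 3.4, "radius": 0.5}
      {"type": "excrement", "scope": "partition", "partition": "bedroom"}
      {"type": "shoe", "scope": "everywhere"}
    `partition_lookup(x, y)` is an assumed helper returning a partition name.
    """
    # Both variants described above start by matching the obstacle type.
    if detected_type != rule["type"]:
        return False
    scope = rule["scope"]
    if scope == "everywhere":
        return True
    if scope == "circle":
        return math.hypot(x - rule["x"], y - rule["y"]) <= rule["radius"]
    if scope == "partition" and partition_lookup is not None:
        return partition_lookup(x, y) == rule["partition"]
    return False
```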
During or after cleaning, the corresponding positions on the APP's cleaning map display the obstacle types recognized by the sweeping robot during the current cleaning run, as shown in FIG. 6. The obstacle types include but are not limited to power strips, charging bases, excrement, weight scales, wire tangles, shoes, and fixed objects (such as sofas, beds, tables, and chairs). The user can then operate on these obstacles manually; for example, the user can choose to ignore at least one of them, and the instruction to ignore that obstacle is marked in the working map. After the robot detects the user's instruction to ignore a certain obstacle at a certain position, it records the type of the obstacle and its coordinates. In subsequent cleaning runs, if the sweeping robot again detects an obstacle of that type within a certain range of those coordinates, it ignores the obstacle mark; that is, it does not consider that type of obstacle to be present at that position and performs no obstacle avoidance operation for it. For example, if the user chooses to ignore a slipper mark at a certain position in the bedroom, then whenever the cleaning robot later travels near that mark it continues cleaning without avoiding it; alternatively, whenever the cleaning robot travels in the bedroom partition containing that mark, it does not avoid slipper marks at all.
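A self-contained sketch of this end-to-end flow, from recording the user's ignore instruction to suppressing avoidance on a later run; the function names, the rule format, and the 0.5 m radius are assumptions introduced for illustration only.

```python
import math

ignore_rules: list[dict] = []

def on_user_ignore(obstacle_type: str, x: float, y: float, radius: float = 0.5) -> None:
    """Record the user's ignore instruction (obstacle type + coordinates + radius)."""
    ignore_rules.append({"type": obstacle_type, "x": x, "y": y, "radius": radius})

def handle_detection(detected_type: str, x: float, y: float) -> str:
    """On a later run: ignore marks of the same type near a recorded coordinate."""
    for rule in ignore_rules:
        same_type = detected_type == rule["type"]
        nearby = math.hypot(x - rule["x"], y - rule["y"]) <= rule["radius"]
        if same_type and nearby:
            return "continue_cleaning"   # no avoidance for ignored marks
    return "avoid_obstacle"

on_user_ignore("slipper", 2.0, 1.5)
print(handle_detection("slipper", 2.1, 1.4))      # -> continue_cleaning
print(handle_detection("power_strip", 2.1, 1.4))  # -> avoid_obstacle
```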
With the robot obstacle avoidance method provided by the embodiments of the present disclosure, the robot automatically acquires obstacle information during travel using its own sensors or image acquisition device and marks the obstacle type and position. After manual confirmation, when the same type of obstacle information is encountered again, obstacles that should be avoided are avoided and areas that should be cleaned are cleaned. This avoids missed cleaning caused by blind obstacle avoidance, improves cleaning accuracy, and enhances the user experience.
As one implementation, as shown in FIG. 7, an embodiment of the present disclosure provides a robot obstacle avoidance apparatus, applied on the robot side, including:
An acquisition unit 702, configured to acquire and record information about obstacles encountered during travel, where the obstacle information includes type information and position information of the obstacle.
A receiving unit 704, configured to receive an operation instruction for a designated obstacle, where the operation instruction causes that, when an obstacle of the same type as the designated obstacle is detected within the region marked with the designated obstacle, no obstacle avoidance operation is performed for the obstacle of the same type.
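As a structural illustration only, the two units of the apparatus could be mirrored by two cooperating software components such as the following; the class and method names are assumptions, not part of the disclosure.

```python
class AcquisitionUnit:
    """Acquires and records obstacle information (type + position) during travel."""

    def __init__(self, working_map: list) -> None:
        self.working_map = working_map

    def record(self, obstacle_type: str, x: float, y: float) -> None:
        self.working_map.append({"type": obstacle_type, "x": x, "y": y})

class ReceivingUnit:
    """Receives operation instructions for designated obstacles (e.g. 'ignore')."""

    def __init__(self) -> None:
        self.ignored: list[dict] = []

    def receive_ignore(self, obstacle_type: str, x: float, y: float) -> None:
        # Later detections of the same type within the marked region are
        # exempted from obstacle avoidance by the robot's control logic.
        self.ignored.append({"type": obstacle_type, "x": x, "y": y})
```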
Optionally, detecting an obstacle of the same type as the designated obstacle within the region marked with the designated obstacle specifically includes: matching the type of the detected obstacle against the type of the designated obstacle; if the match succeeds, confirming that an obstacle of the same type as the designated obstacle has been detected.
Optionally, detecting an obstacle of the same type as the designated obstacle within the region marked with the designated obstacle specifically includes: matching the type of the detected obstacle against the type of the designated obstacle; if the match succeeds, judging whether the position of the obstacle lies within the region marked with the designated obstacle; if so, confirming that an obstacle of the same type as the designated obstacle has been detected within the region marked with the designated obstacle.
Optionally, receiving an operation instruction for a designated obstacle specifically includes: receiving an operation instruction, sent by the user through a human-machine interaction interface, to ignore the designated obstacle.
Optionally, the operation instruction contains the type information and position information of the designated obstacle.
Optionally, the apparatus further includes: a unit configured to assign an identifier to an obstacle after it is encountered during travel, where the operation instruction contains the identifier of the designated obstacle; and a unit configured to, after the operation instruction for the designated obstacle is received, look up the corresponding type information and position information according to the identifier of the designated obstacle.
Optionally, the region includes: a region of fixed size and shape with the coordinate position of the designated obstacle as its geometric center; or the partition where the obstacle is located, where the partition is a region divided automatically by the self-walking robot or divided manually by the user; or all areas the self-walking robot can reach.
With the robot obstacle avoidance apparatus provided by the embodiments of the present disclosure, the robot automatically acquires obstacle information during travel using its own sensors or image acquisition device and marks the obstacle type and position. After manual confirmation, when the same type of obstacle information is encountered again, obstacles that should be avoided are avoided and areas that should be cleaned are cleaned. This avoids missed cleaning caused by blind obstacle avoidance, improves cleaning accuracy, and enhances the user experience.
An embodiment of the present disclosure provides a non-transitory computer-readable storage medium storing computer program instructions that, when invoked and executed by a processor, implement the method steps of any of the above.
An embodiment of the present disclosure provides a robot, including a processor and a memory, where the memory stores computer program instructions executable by the processor, and the processor, when executing the computer program instructions, implements the method steps of any of the foregoing embodiments.
As shown in FIG. 8, the robot may include a processing device (for example, a central processing unit or a graphics processor) 801, which can perform various appropriate actions and processing according to a program stored in a read-only memory (ROM) 802 or a program loaded from a storage device 808 into a random access memory (RAM) 803. The RAM 803 also stores various programs and data required for the operation of the electronic robot 800. The processing device 801, the ROM 802, and the RAM 803 are connected to one another through a bus 804. An input/output (I/O) interface 805 is also connected to the bus 804.
Generally, the following devices may be connected to the I/O interface 805: input devices 806 including, for example, a touch screen, a touch pad, a keyboard, a mouse, a camera, a microphone, an accelerometer, and a gyroscope; output devices 807 including, for example, a liquid crystal display (LCD), a speaker, and a vibrator; storage devices 808 including, for example, a hard disk; and a communication device 809. The communication device 809 can allow the electronic robot to communicate wirelessly or by wire with other robots to exchange data. Although FIG. 8 shows an electronic robot with various devices, it should be understood that it is not required to implement or possess all of the devices shown; more or fewer devices may alternatively be implemented or possessed.
In particular, according to the embodiments of the present disclosure, the processes described above with reference to the flowcharts may be implemented as a robot software program. For example, an embodiment of the present disclosure includes a robot software program product, which includes a computer program carried on a readable medium, the computer program containing program code for executing the method shown in the flowchart. In such an embodiment, the computer program may be downloaded and installed from a network through the communication device 809, or installed from the storage device 808, or installed from the ROM 802. When the computer program is executed by the processing device 801, the above functions defined in the methods of the embodiments of the present disclosure are performed.
It should be noted that the computer-readable medium described above in the present disclosure may be a computer-readable signal medium or a computer-readable storage medium, or any combination of the two. The computer-readable storage medium may be, for example but not limited to, an electric, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any combination of the above. More specific examples of the computer-readable storage medium may include, but are not limited to: an electrical connection with one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the above. In the present disclosure, a computer-readable storage medium may be any tangible medium containing or storing a program that can be used by, or in combination with, an instruction execution system, apparatus, or device. In the present disclosure, a computer-readable signal medium may include a data signal propagated in baseband or as part of a carrier wave, carrying computer-readable program code. Such a propagated data signal may take many forms, including but not limited to an electromagnetic signal, an optical signal, or any suitable combination of the above. A computer-readable signal medium may also be any computer-readable medium other than a computer-readable storage medium; the computer-readable signal medium can send, propagate, or transmit a program for use by, or in combination with, an instruction execution system, apparatus, or device. The program code contained on the computer-readable medium may be transmitted by any suitable medium, including but not limited to: electrical wire, optical cable, RF (radio frequency), or any suitable combination of the above.
The above computer-readable medium may be contained in the above robot, or it may exist separately without being assembled into the robot.
The computer program code for performing the operations of the present disclosure may be written in one or more programming languages or combinations thereof, including object-oriented programming languages such as Java, Smalltalk, and C++, as well as conventional procedural programming languages such as the "C" language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer, or entirely on a remote computer or server. In the case involving a remote computer, the remote computer may be connected to the user's computer through any kind of network, including a local area network (LAN) or a wide area network (WAN), or it may be connected to an external computer (for example, through the Internet using an Internet service provider).
The flowcharts and block diagrams in the accompanying drawings illustrate the possible architectures, functions, and operations of systems, methods, and computer program products according to various embodiments of the present disclosure. In this regard, each block in a flowchart or block diagram may represent a module, a program segment, or a portion of code, which contains one or more executable instructions for implementing the specified logical functions. It should also be noted that, in some alternative implementations, the functions noted in the blocks may occur in an order different from that noted in the drawings; for example, two blocks shown in succession may in fact be executed substantially in parallel, and they may sometimes be executed in the reverse order, depending on the functions involved. It should further be noted that each block in the block diagrams and/or flowcharts, and combinations of blocks in the block diagrams and/or flowcharts, can be implemented by a dedicated hardware-based system that performs the specified functions or operations, or by a combination of dedicated hardware and computer instructions.
The apparatus embodiments described above are merely illustrative. The units described as separate components may or may not be physically separate, and the components shown as units may or may not be physical units; that is, they may be located in one place or distributed across multiple network units. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment. A person of ordinary skill in the art can understand and implement this without creative effort.
Finally, it should be noted that the above embodiments are only used to illustrate the technical solutions of the present disclosure, not to limit them. Although the present disclosure has been described in detail with reference to the foregoing embodiments, those of ordinary skill in the art should understand that they can still modify the technical solutions recorded in the foregoing embodiments, or make equivalent substitutions for some of the technical features therein, and that these modifications or substitutions do not cause the essence of the corresponding technical solutions to depart from the spirit and scope of the technical solutions of the embodiments of the present disclosure.

Claims (16)

  1. An obstacle avoidance method for a self-walking robot, applied on a robot side, wherein the method comprises:
    acquiring and recording information of an obstacle encountered during travel, wherein the information of the obstacle comprises type information and position information of the obstacle;
    receiving an operation instruction for a designated obstacle, wherein the operation instruction indicates that, when an obstacle of the same type as the designated obstacle is detected within a region marked with the designated obstacle, no obstacle avoidance operation is to be performed for the obstacle of the same type.
  2. The method according to claim 1, wherein
    the detection, within the region marked with the designated obstacle, of an obstacle of the same type as the designated obstacle comprises:
    matching the type of the detected obstacle with the type of the designated obstacle; and if the matching succeeds, confirming that an obstacle of the same type as the designated obstacle is detected.
  3. The method according to claim 1, wherein
    the detection, within the region marked with the designated obstacle, of an obstacle of the same type as the designated obstacle comprises:
    matching the type of the detected obstacle with the type of the designated obstacle; if the matching succeeds, judging whether the obstacle is located within the region marked with the designated obstacle;
    if so, confirming that an obstacle of the same type as the designated obstacle is detected within the region marked with the designated obstacle.
  4. The method according to claim 1, wherein
    the receiving of an operation instruction for a designated obstacle comprises:
    receiving an operation instruction, sent by a user through a human-machine interaction interface, to ignore the designated obstacle.
  5. The method according to claim 4, wherein:
    the operation instruction contains the type information and the position information of the designated obstacle.
  6. The method according to claim 4, further comprising:
    after an obstacle is encountered during travel, assigning an identifier to the obstacle, wherein the operation instruction contains the identifier of the designated obstacle; and
    after the operation instruction for the designated obstacle is received, looking up the corresponding type information and position information according to the identifier of the designated obstacle.
  7. The method according to any one of claims 1-6, wherein the region comprises:
    a region of fixed size and shape with the coordinate position of the designated obstacle as a geometric center; or
    a partition where the obstacle is located, wherein the partition is a region divided automatically by the self-walking robot or divided manually by the user; or
    all areas that the self-walking robot can reach.
  8. An obstacle avoidance apparatus for a self-walking robot, applied on a robot side, wherein the apparatus comprises:
    an acquisition unit, configured to acquire and record information of an obstacle encountered during travel, wherein the information of the obstacle comprises type information and position information of the obstacle; and
    a receiving unit, configured to receive an operation instruction for a designated obstacle, wherein the operation instruction indicates that, when an obstacle of the same type as the designated obstacle is detected within a region marked with the designated obstacle, no obstacle avoidance operation is to be performed for the obstacle of the same type.
  9. The apparatus according to claim 8, wherein
    the detection, within the region marked with the designated obstacle, of an obstacle of the same type as the designated obstacle comprises:
    matching the type of the detected obstacle with the type of the designated obstacle; and if the matching succeeds, confirming that an obstacle of the same type as the designated obstacle is detected.
  10. The apparatus according to claim 8, wherein
    the detection, within the region marked with the designated obstacle, of an obstacle of the same type as the designated obstacle comprises:
    matching the type of the detected obstacle with the type of the designated obstacle; if the matching succeeds, judging whether the obstacle is located within the region marked with the designated obstacle;
    if so, confirming that an obstacle of the same type as the designated obstacle is detected within the region marked with the designated obstacle.
  11. The apparatus according to claim 8, wherein
    the receiving of an operation instruction for a designated obstacle comprises:
    receiving an operation instruction, sent by a user through a human-machine interaction interface, to ignore the designated obstacle.
  12. The apparatus according to claim 11, wherein:
    the operation instruction contains the type information and the position information of the designated obstacle.
  13. The apparatus according to claim 11, further comprising:
    a component configured to assign an identifier to an obstacle after the obstacle is encountered during travel, wherein the operation instruction contains the identifier of the designated obstacle; and
    a component configured to, after the operation instruction for the designated obstacle is received, look up the corresponding type information and position information according to the identifier of the designated obstacle.
  14. The apparatus according to any one of claims 8-13, wherein the region comprises:
    a region of fixed size and shape with the coordinate position of the designated obstacle as a geometric center; or
    a partition where the obstacle is located, wherein the partition is a region divided automatically by the self-walking robot or divided manually by the user; or
    all areas that the self-walking robot can reach.
  15. A robot, comprising a processor and a memory, wherein the memory stores computer program instructions executable by the processor, and the computer program instructions, when executed by the processor, implement the method steps of any one of claims 1-7.
  16. A non-transitory computer-readable storage medium storing computer program instructions which, when invoked and executed by a processor, implement the method steps of any one of claims 1-7.
PCT/CN2021/070912 2020-04-20 2021-01-08 Obstacle avoidance method and apparatus for self-walking robot, robot, and storage medium WO2021212926A1 (zh)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP21791604.8A EP4140381A1 (en) 2020-04-20 2021-01-08 Obstacle avoidance method and apparatus for self-walking robot, robot, and storage medium
US17/996,699 US20230225576A1 (en) 2020-04-20 2021-01-08 Obstacle avoidance method and apparatus for self-walking robot, robot, and storage medium

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202010311652.2 2020-04-20
CN202010311652.2A CN111481105A (zh) 2020-04-20 2020-04-20 Obstacle avoidance method and apparatus for self-walking robot, robot, and storage medium

Publications (1)

Publication Number Publication Date
WO2021212926A1 true WO2021212926A1 (zh) 2021-10-28

Family

ID=71789524

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2021/070912 WO2021212926A1 (zh) 2020-04-20 2021-01-08 Obstacle avoidance method and apparatus for self-walking robot, robot, and storage medium

Country Status (4)

Country Link
US (1) US20230225576A1 (zh)
EP (1) EP4140381A1 (zh)
CN (1) CN111481105A (zh)
WO (1) WO2021212926A1 (zh)

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111481105A (zh) * 2020-04-20 2020-08-04 北京石头世纪科技股份有限公司 一种自行走机器人避障方法、装置、机器人和存储介质
CN112269379B (zh) * 2020-10-14 2024-02-27 北京石头创新科技有限公司 障碍物识别信息反馈方法
CN112380942A (zh) * 2020-11-06 2021-02-19 北京石头世纪科技股份有限公司 一种识别障碍物的方法、装置、介质和电子设备
US11885638B2 (en) * 2020-12-28 2024-01-30 Bear Robotics, Inc. Method, system, and non-transitory computer-readable recording medium for generating a map for a robot
US11960282B2 (en) * 2021-01-05 2024-04-16 Abb Schweiz Ag Systems and methods for servicing a data center using autonomous vehicle
CN112890690B (zh) * 2021-01-28 2022-08-12 深圳拓邦股份有限公司 机器人清扫控制方法、装置及扫地机器人
CN113534813A (zh) * 2021-07-30 2021-10-22 珠海一微半导体股份有限公司 移动机器人作业方法、系统及芯片
CN113670292B (zh) * 2021-08-10 2023-10-20 追觅创新科技(苏州)有限公司 地图的绘制方法和装置、扫地机、存储介质、电子装置
CN113465592A (zh) * 2021-08-20 2021-10-01 北京石头世纪科技股份有限公司 导航方法及自行走装置
CN113907663B (zh) * 2021-09-22 2023-06-23 追觅创新科技(苏州)有限公司 障碍物地图构建方法、清洁机器人及存储介质
CN113855835B (zh) * 2021-09-27 2023-04-07 丰疆智能(深圳)有限公司 一种消毒方法、装置、存储介质及消毒机器人
CN114601399B (zh) * 2021-12-08 2023-07-04 北京石头创新科技有限公司 清洁设备的控制方法、装置、清洁设备和存储介质
CN114274117A (zh) * 2021-11-16 2022-04-05 深圳市普渡科技有限公司 机器人、基于障碍物的机器人交互方法、装置及介质
CN114287830B (zh) * 2021-12-13 2023-07-07 珠海一微半导体股份有限公司 一种盲人用清洁机器人系统及其控制方法
CN116416518A (zh) * 2021-12-22 2023-07-11 广东栗子科技有限公司 智能避开障碍的方法及装置
CN116636775A (zh) * 2022-02-16 2023-08-25 追觅创新科技(苏州)有限公司 清洁设备的运行控制方法及装置、存储介质及电子装置
CN115040033A (zh) * 2022-05-24 2022-09-13 武汉擎朗智能科技有限公司 一种机器人清洁记录显示方法、装置、设备及介质

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20180075176A (ko) * 2016-12-26 2018-07-04 엘지전자 주식회사 Mobile robot and control method therefor
CN108344414A (zh) * 2017-12-29 2018-07-31 中兴通讯股份有限公司 Map construction and navigation method, apparatus, and system
CN108628319A (zh) * 2018-07-04 2018-10-09 马书翠 Intelligent obstacle avoidance system for a sweeping robot
WO2019199027A1 (ko) * 2018-04-09 2019-10-17 엘지전자 주식회사 Robot cleaner
WO2019212174A1 (ko) * 2018-04-30 2019-11-07 엘지전자 주식회사 Artificial intelligence cleaner and control method therefor
CN110974088A (zh) * 2019-11-29 2020-04-10 深圳市杉川机器人有限公司 Sweeping robot control method, sweeping robot, and storage medium
CN111481105A (zh) * 2020-04-20 2020-08-04 北京石头世纪科技股份有限公司 Obstacle avoidance method and apparatus for self-walking robot, robot, and storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6539845B2 (ja) * 2015-03-31 2019-07-10 株式会社日本総合研究所 Self-propelled traveling device, management device, and walking obstacle location determination system
JP6601001B2 (ja) * 2015-06-05 2019-11-06 日本精工株式会社 Traveling device, guide robot, and control method for traveling device
CN105182979B (zh) * 2015-09-23 2018-02-23 上海物景智能科技有限公司 Mobile robot obstacle detection and avoidance method and system
CN109709945B (zh) * 2017-10-26 2022-04-15 深圳市优必选科技有限公司 Path planning method and apparatus based on obstacle classification, and robot
CN110989616A (zh) * 2019-12-20 2020-04-10 华南智能机器人创新研究院 Method for automatic cleaning navigation of a robot, and robot

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114296447A (zh) * 2021-12-07 2022-04-08 北京石头世纪科技股份有限公司 Control method and apparatus for self-walking device, self-walking device, and storage medium
CN114296447B (zh) * 2021-12-07 2023-06-30 北京石头世纪科技股份有限公司 Control method and apparatus for self-walking device, self-walking device, and storage medium

Also Published As

Publication number Publication date
CN111481105A (zh) 2020-08-04
US20230225576A1 (en) 2023-07-20
EP4140381A1 (en) 2023-03-01

Similar Documents

Publication Publication Date Title
WO2021212926A1 (zh) Obstacle avoidance method and apparatus for self-walking robot, robot, and storage medium
WO2020200282A1 (zh) Robot working area map construction method and apparatus, robot, and medium
WO2021208530A1 (zh) Robot obstacle avoidance method and apparatus, and storage medium
CN111990929B (zh) Obstacle detection method and apparatus, self-walking robot, and storage medium
EP3424395B1 (en) Method and apparatus for performing cleaning operation by cleaning device
JP7204990B2 (ja) Map creation for autonomous mobile robots
WO2022041737A1 (zh) Distance measurement method and apparatus, robot, and storage medium
WO2021043080A1 (zh) Cleaning robot and control method therefor
TW202228587A (zh) Mobile robot system
WO2023103515A1 (zh) Control method for self-walking device, self-walking device, and storage medium
US20230393583A1 (en) Obstacle recognition information feedback method and apparatus, robot, and storage medium
KR102490755B1 (ko) Mobile robot system
CN114595354A (zh) Robot mapping method and apparatus, robot, and storage medium
CN113625700A (zh) Self-walking robot control method and apparatus, self-walking robot, and storage medium
TW202214166A (zh) Mobile robot system
JP7484015B2 (ja) Obstacle detection method and apparatus, self-propelled robot, and storage medium
AU2023201499A1 (en) Method and apparatus for detecting obstacle, self-moving robot, and storage medium
WO2022227876A1 (zh) Distance measurement method and apparatus, robot, and storage medium
TW202215183A (zh) Mobile robot system

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21791604

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021791604

Country of ref document: EP

Effective date: 20221121