CN111813103B - Control method, control system and storage medium for mobile robot


Info

Publication number
CN111813103B
CN111813103B
Authority
CN
China
Prior art keywords
edge
mobile robot
drag
image
area
Prior art date
Legal status
Active
Application number
CN202010514690.8A
Other languages
Chinese (zh)
Other versions
CN111813103A
Inventor
Inventor not disclosed
Current Assignee
Ankobot Shanghai Smart Technologies Co ltd
Shankou Shenzhen Intelligent Technology Co ltd
Original Assignee
Ankobot Shanghai Smart Technologies Co ltd
Shankou Shenzhen Intelligent Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Ankobot Shanghai Smart Technologies Co ltd and Shankou Shenzhen Intelligent Technology Co ltd
Priority to CN202010514690.8A
Publication of CN111813103A
Application granted
Publication of CN111813103B
Legal status: Active

Classifications

    All classifications fall under G PHYSICS > G05 CONTROLLING; REGULATING > G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES > G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots > G05D1/02 Control of position or course in two dimensions > G05D1/021 specially adapted to land vehicles:
    • G05D1/024 using optical position detecting means, with obstacle or wall sensors in combination with a laser
    • G05D1/0221 with means for defining a desired trajectory, involving a learning process
    • G05D1/0223 with means for defining a desired trajectory, involving speed control of the vehicle
    • G05D1/0242 using optical position detecting means, using non-visible light signals, e.g. IR or UV signals
    • G05D1/0251 using optical position detecting means, using a video camera in combination with image processing means, extracting 3D information from a plurality of images taken from different locations, e.g. stereo vision
    • G05D1/0257 using a radar
    • G05D1/0259 using magnetic or electromagnetic means
    • G05D1/0276 using signals provided by a source external to the vehicle
    • G05D1/028 using signals provided by a source external to the vehicle, using a RF signal

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • Automation & Control Theory (AREA)
  • General Physics & Mathematics (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Electromagnetism (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Optics & Photonics (AREA)
  • Multimedia (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Manipulator (AREA)

Abstract

The application discloses a control method, a control system, and a storage medium for a mobile robot. The control method comprises: acquiring an image from the image capture device; identifying, from the image, the edge of a suspected no-mop area (an area, such as a carpet, that must not be mopped) in the physical space where the mobile robot is located, and determining the relative spatial position between that edge and the mobile robot using at least the image; when the relative spatial position indicates that the mobile robot has moved near the edge of the suspected no-mop area, confirming that the suspected no-mop area is a no-mop area by detecting a drive signal in the mobile robot; and adjusting at least one behavior of the mobile robot in mopping mode according to the confirmed no-mop area. In this way, no-mop areas can be detected effectively from images, and the behavior and/or movement of the mobile robot can be controlled accordingly.

Description

Control method, control system and storage medium for mobile robot
Technical Field
The present application relates to the field of mobile robot technology, and in particular, to a control method, a control system, and a storage medium for a mobile robot.
Background
A mobile robot is a machine that performs specific tasks automatically: it can accept human commands, run pre-programmed routines, and act according to strategies formulated with artificial-intelligence techniques. Mobile robots can be used indoors or outdoors, in industry, business, or the home, and can take on tasks such as patrolling, greeting, order delivery, floor cleaning, family companionship, and office assistance.
A mobile robot with a floor-mopping function (e.g., a cleaning robot) usually follows a specific mopping route while cleaning the floor. Indoors, however, people commonly lay objects such as carpets on the floor for warmth or decoration, and because of its material a carpet is not suitable for cleaning by mopping.
Disclosure of Invention
In view of the above-mentioned shortcomings of the related art, an object of the present application is to provide a control method, a control system, and a storage medium for a mobile robot, so as to overcome the technical problem in the related art that a mobile robot cannot accurately and effectively detect a carpet.
To achieve the above and other related objects, a first aspect of the present disclosure provides a control method for a mobile robot comprising an image capture device, the control method comprising: acquiring an image from the image capture device; identifying, from the image, the edge of a suspected no-mop area in the physical space where the mobile robot is located, and determining the relative spatial position between that edge and the mobile robot using at least the image; when the relative spatial position indicates that the mobile robot has moved near the edge of the suspected no-mop area, confirming that the suspected no-mop area is a no-mop area by detecting a drive signal in the mobile robot; and adjusting at least one behavior of the mobile robot in mopping mode according to the confirmed no-mop area.
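To make the claimed flow concrete, the following is a minimal Python sketch of the first-aspect control loop. Every name in it (camera, detector, robot, and their methods) is a hypothetical illustration, not part of the disclosure:

```python
# Hypothetical sketch of the first-aspect control flow; every identifier
# here is illustrative and not taken from the patent.
def mopping_control_step(camera, detector, robot):
    image = camera.capture()                            # step 1: acquire an image
    edge = detector.find_suspected_no_mop_edge(image)   # step 2: identify a suspected edge
    if edge is None:
        return
    rel_pos = detector.relative_position(edge, image)   # step 2: localize edge vs. robot
    if robot.is_near(rel_pos):                          # robot has moved near the edge
        reference = robot.drive_signal_reference()      # preset, or sampled earlier
        detected = robot.drive_signal()                 # e.g. a motor current
        if robot.meets_preset_condition(detected, reference):
            robot.adjust_mopping_behavior(edge)         # step 4: turn away or stop mopping
```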
In certain embodiments of the first aspect of the present application, the image is a color image and/or a depth image.
In certain embodiments of the first aspect of the present application, the image is a color image, and the step of determining the relative spatial position between the edge of the suspected no-mop area and the mobile robot using at least the image comprises: determining the relative spatial position according to the pixel position of the edge of the suspected no-mop area in the color image and preset physical reference information.
In certain embodiments of the first aspect of the present application, the physical reference information comprises: the physical height of the image capture device above the travel plane, the physical parameters of the image capture device, and the angle of the main optical axis of the image capture device relative to the horizontal or vertical plane.
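As a concrete illustration of how such physical reference information can yield a relative position, the sketch below applies standard pinhole-camera ground-plane geometry: a camera at a known height above the travel plane, tilted down by a known angle, images a floor point at a pixel row from which the forward distance follows. This generic geometry is an assumption for illustration, not the patent's specific formula:

```python
import math

def ground_distance_m(pixel_row, cy, fy, cam_height_m, tilt_deg):
    """Forward distance to a travel-plane point imaged at pixel_row.

    cam_height_m: physical height of the camera above the travel plane
    tilt_deg:     angle of the main optical axis below the horizontal
    fy, cy:       focal length and principal-point row, in pixels
    All parameter values are illustrative assumptions.
    """
    # angle of the viewing ray below the horizontal
    ray = math.radians(tilt_deg) + math.atan2(pixel_row - cy, fy)
    if ray <= 0:
        return math.inf  # ray at or above the horizon never meets the floor
    return cam_height_m / math.tan(ray)
```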
In certain embodiments of the first aspect of the present application, the image is a depth image, and the step of determining the relative spatial position between the edge of the suspected no-mop area and the mobile robot using at least the image comprises: determining the relative spatial position according to the depth data of the edge of the suspected no-mop area in the depth image.
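With a depth image the computation is more direct: the depth value at an edge pixel can be back-projected through the pinhole model into a camera-frame position. A minimal sketch, assuming known intrinsics (all names illustrative):

```python
import numpy as np

def edge_point_in_camera_frame(u, v, depth_m, fx, fy, cx, cy):
    """Back-project edge pixel (u, v) with depth depth_m into the camera frame."""
    x = (u - cx) * depth_m / fx   # lateral offset
    y = (v - cy) * depth_m / fy   # vertical offset
    z = depth_m                   # forward range
    return np.array([x, y, z])
```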
In certain embodiments of the first aspect of the present application, the step of confirming that the suspected no-mop area is a no-mop area by detecting a drive signal in the mobile robot comprises: confirming that the suspected no-mop area is a no-mop area when a preset condition is satisfied between a reference value and the detected value of the drive signal.
In certain embodiments of the first aspect of the present application, the drive signal is collected from a working component of the mobile robot that performs a sweeping operation and/or a working component that performs a moving operation.
In certain embodiments of the first aspect of the present application, the reference value is a preset value or is determined before the mobile robot moves to the edge of the suspected no-mop area.
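One way to read these embodiments is as a deviation test between the live drive signal and the reference. The sketch below assumes a relative-deviation test; both the test form and the 20% threshold are illustrative, since the patent only requires that "a preset condition" be met:

```python
def confirms_no_mop_area(detected, reference, max_rel_dev=0.2):
    """Confirm the suspected no-mop area when the drive signal (e.g. a motor
    current) deviates from the reference by more than a preset fraction.
    The 20% threshold is an assumption for illustration."""
    return abs(detected - reference) > max_rel_dev * abs(reference)
```

Consistent with the embodiment above, the reference could equally be a rolling mean of the signal sampled just before the robot reaches the edge.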
In certain embodiments of the first aspect of the present application, the step of adjusting at least one behavior of the mobile robot in mopping mode according to the confirmed no-mop area comprises at least one of: controlling the mobile robot to change its moving direction and continue the mopping and/or sweeping operation; controlling the mobile robot to switch off the mopping operation in mopping mode before entering the no-mop area.
In certain embodiments of the first aspect of the present application, the step of identifying, from the image, the edge of a suspected no-mop area in the physical space where the mobile robot is located comprises: identifying the edge of the suspected no-mop area from the image using a preset height condition set relative to the travel plane of the mobile robot.
In certain embodiments of the first aspect of the present application, information describing the edge of the suspected no-mop area is matched against pre-stored information describing the edges of no-mop areas, so as to confirm that the edge of the suspected no-mop area is the edge of a no-mop area, wherein the information describing the edge of the suspected no-mop area is determined from the acquired image and/or the relative spatial position; and at least one behavior or moving operation of the mobile robot in mopping mode is adjusted according to the confirmed edge of the no-mop area and the relative spatial position.
In certain embodiments of the first aspect of the present application, the acquired image is captured by controlling the rotation of the image capture device.
A second aspect of the present disclosure provides a control method for a mobile robot comprising an image capture device, the control method comprising: acquiring an image from the image capture device; identifying, from the acquired image, the edge of a suspected no-mop area in the physical space where the mobile robot is located, and determining the relative spatial position between that edge and the mobile robot using at least the acquired image; matching information describing the edge of the suspected no-mop area against pre-stored information describing the edges of no-mop areas, so as to confirm that the edge of the suspected no-mop area is the edge of a no-mop area, wherein the information describing the edge of the suspected no-mop area is determined from the acquired image and/or the relative spatial position; and adjusting at least one behavior or moving operation of the mobile robot in mopping mode according to the confirmed edge of the no-mop area and the relative spatial position.
In certain embodiments of the second aspect of the present application, the image is a color image and/or a depth image.
In certain embodiments of the second aspect of the present application, the information describing the edge of the suspected no-mop area comprises an image feature determined from the acquired image, and the matching step comprises: matching the image feature describing the edge of the suspected no-mop area against pre-stored image features describing the edges of no-mop areas.
In certain embodiments of the second aspect of the present application, the information describing the edge of the suspected no-mop area comprises map data determined from the image and the relative spatial position, and the matching step comprises: matching the map data describing the edge of the suspected no-mop area against pre-stored map data describing the edges of no-mop areas.
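The matching step is not pinned to a particular algorithm. As one possible reading, a detected edge polyline in map coordinates can be compared against a stored no-mop edge by mean nearest-point distance; a sketch under that assumption (the metric and the 5 cm tolerance are both illustrative):

```python
import numpy as np

def edges_match(candidate_xy, stored_xy, tol_m=0.05):
    """candidate_xy: (N, 2) detected edge points in map coordinates.
    stored_xy:    (M, 2) pre-stored no-mop edge points.
    Declare a match when a candidate point lies, on average, within tol_m
    of the stored edge; both the metric and the tolerance are assumptions."""
    d = np.linalg.norm(candidate_xy[:, None, :] - stored_xy[None, :, :], axis=2)
    return float(d.min(axis=1).mean()) < tol_m
```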
In certain embodiments of the second aspect of the present application, the image is a color image, and the step of determining the relative spatial position between the edge of the suspected no-mop area and the mobile robot using at least the acquired image comprises: determining the relative spatial position according to the pixel position of the edge of the suspected no-mop area in the color image and preset physical reference information.
In certain embodiments of the second aspect of the present application, the image is a depth image, and the step of determining the relative spatial position between the edge of the suspected no-mop area and the mobile robot using at least the acquired image comprises: determining the relative spatial position according to the pixel values (depth data) of the edge of the suspected no-mop area in the depth image.
In certain embodiments of the second aspect of the present application, the step of adjusting at least one behavior or moving operation of the mobile robot in mopping mode according to the confirmed edge and relative spatial position of the no-mop area comprises at least one of: when the relative spatial position indicates that the mobile robot has moved near the edge of the no-mop area, controlling the mobile robot to change its moving direction and continue the mopping and/or sweeping operation; when the relative spatial position indicates that the mobile robot has moved near the edge of the no-mop area, controlling the mobile robot to switch off the mopping operation in mopping mode before entering the no-mop area.
In certain embodiments of the second aspect of the present application, the acquired image is captured by controlling the rotation of the image capture device.
A third aspect of the present disclosure provides a mobile robot, comprising: an image capture device for capturing images; a moving device for performing moving operations under control; a cleaning device comprising a mopping assembly, wherein the mopping assembly performs mopping operations under control; a storage device for storing at least one program; and a processing device, connected to the moving device, the cleaning device, the storage device, and the image capture device, for invoking and executing the at least one program so as to coordinate those devices in carrying out the control method according to any embodiment of the first aspect of the present application.
In certain embodiments of the third aspect of the present application, the cleaning device further comprises a sweeping assembly for performing a sweeping operation under control.
A fourth aspect of the present disclosure provides a mobile robot, comprising: an image capture device for capturing images; a moving device for performing moving operations under control; a cleaning device comprising a mopping assembly, wherein the mopping assembly performs mopping operations under control; a storage device for storing at least one program and for storing information describing the edges of no-mop areas; and a processing device, connected to the moving device, the cleaning device, the storage device, and the image capture device, for invoking and executing the at least one program so as to coordinate those devices in carrying out the control method according to the second aspect of the present application.
In certain embodiments of the fourth aspect of the present application, the cleaning device further comprises a sweeping assembly for performing a sweeping operation under control.
A fifth aspect of the present disclosure provides a control system for a mobile robot equipped with an image capture device, the control system comprising: an interface device for receiving images captured by the image capture device and for outputting control instructions that control the mobile robot; a storage device for storing at least one program; and a processing device, connected to the interface device and the storage device, for invoking and executing the at least one program so as to coordinate the interface device, the storage device, and the image capture device in carrying out the control method according to any embodiment of the first aspect of the present application.
A sixth aspect of the present disclosure provides a control system for a mobile robot equipped with an image capture device, the control system comprising: an interface device for receiving images captured by the image capture device and for outputting control instructions that control the mobile robot; a storage device for storing at least one program and for storing information describing the edges of no-mop areas; and a processing device, connected to the interface device and the storage device, for invoking and executing the at least one program so as to coordinate the interface device, the storage device, and the image capture device in carrying out the control method according to the second aspect of the present application.
A seventh aspect of the present disclosure provides a computer-readable storage medium storing at least one program which, when invoked, executes and implements the control method according to any embodiment of the first aspect or of the second aspect of the present disclosure.
In summary, the control method, control system, and storage medium disclosed herein acquire an image from the image capture device and accurately identify the no-mop area at least from that image, effectively solving the prior-art problem that a mobile robot cannot reliably identify a carpet placed on the floor. Based on the position of the no-mop area and its accurate identification, at least one behavior of the mobile robot in mopping mode can be adjusted effectively, so that the robot does not perform a mopping operation in the no-mop area.
Other aspects and advantages of the present application will be readily apparent to those skilled in the art from the following detailed description, in which only exemplary embodiments of the present application are shown and described. As those skilled in the art will recognize, the present disclosure enables changes to the specific embodiments disclosed without departing from the spirit and scope of the invention. Accordingly, the drawings and descriptions of the present application are illustrative only and not limiting.
Drawings
The specific features of the invention to which this application relates are set forth in the appended claims. The features and advantages of the invention to which this application relates will be better understood by reference to the exemplary embodiments described in detail below and the accompanying drawings. The brief description of the drawings is as follows:
fig. 1 is a block diagram showing a hardware configuration of a control system of a mobile robot according to an embodiment of the present invention.
Fig. 2 is a flowchart illustrating a control method of a mobile robot according to an embodiment of the present invention.
Fig. 3 is a schematic structural diagram of the ToF collecting component in the present application.
Fig. 4 is a schematic structural diagram of a motor with an integrated actuator and movable member according to an embodiment of the present invention.
Fig. 5 is a schematic top view of the ToF collecting component disposed in the carrier in an embodiment of the present application.
Fig. 6 is a schematic diagram of determining a curve on the upper surface of an object from the object's edge lying on the travel plane, according to the present application.
Fig. 7 is a schematic diagram of the imaging principle used in the present application to determine the relative spatial position between the edge of a suspected no-mop area and the mobile robot.
Fig. 8 is a flowchart illustrating a control method of a mobile robot according to another embodiment of the present invention.
Fig. 9 is a schematic structural diagram of a mobile robot according to an embodiment of the present invention.
Detailed Description
The following description of the embodiments of the present application is provided for illustrative purposes, and other advantages and capabilities of the present application will become apparent to those skilled in the art from the present disclosure.
In the following description, reference is made to the accompanying drawings that describe several embodiments of the application. It is to be understood that other embodiments may be utilized and that changes in the module or unit composition, electrical, and operation may be made without departing from the spirit and scope of the present disclosure. The following detailed description is not to be taken in a limiting sense, and the scope of embodiments of the present application is defined only by the claims of the issued patent. The terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of the application.
As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, unless the context indicates otherwise. It will be further understood that the terms "comprises," "comprising," "includes" and/or "including," when used in this specification, specify the presence of stated features, steps, operations, elements, components, items, species, and/or groups, but do not preclude the presence or addition of one or more other features, steps, operations, elements, components, species, and/or groups thereof. The terms "or" and "and/or" as used herein are to be construed as inclusive, meaning any one or any combination. Thus, "A, B or C" or "A, B and/or C" means any of the following: A; B; C; A and B; A and C; B and C; A, B and C. An exception to this definition occurs only when a combination of elements, functions, steps, or operations is inherently mutually exclusive in some way.
A mobile robot with a cleaning function (e.g., a cleaning robot) can clean the floor of a room under the manual control of a user (e.g., an operator holding a remote controller) or on its own according to set rules. People commonly lay low objects such as carpets and crawling mats on the floor for decoration or warmth, and also lay low objects such as yoga mats on the floor for exercise. To avoid wetting such objects, users generally want the mobile robot not to perform a mopping operation on them while in mopping mode. Three-dimensional objects with a no-mop property, such as carpets, crawling mats, and yoga mats laid on the floor in the physical space, therefore form no-mop areas.
In other words, a no-mop area is the spatial region occupied by the volume of a three-dimensional object with a no-mop property, such as a carpet, crawling mat, or yoga mat laid on the floor. Depending on the object placed on the floor, the three-dimensional shape of the no-mop area varies and includes rectangular solids, cylinders, irregular shapes, and the like. The edge of the no-mop area is the intersection line between a side surface of the object and the floor, or between a side surface of the object and its upper surface; depending on the shape of the no-mop area, the curve shape of the edge may be a straight line, an arc, an irregular curve, etc. The side surfaces of the object are surfaces perpendicular to the floor or at an angle to it, and depending on the shape of the no-mop area, the planar shape of the upper surface may be circular, rectangular, irregular, etc.
In some examples, a mobile robot identifies a no-mop area by detecting the current in its rolling brush, or by measuring the reflectivity of the floor with an infrared sensor. However, the rolling brush is usually located at the center of the robot's underside, so when the no-mop area is identified from the rolling-brush current, the mopping assembly tends to be already in contact with the no-mop area by the time the rolling brush reaches it in mopping mode, wetting the no-mop area.
Therefore, to remedy these shortcomings in no-mop area detection, to detect no-mop areas accurately and effectively, and to control the navigation, movement, behavior, and so on of the mobile robot in mopping mode according to the identified relative spatial position between the edge of a no-mop area and the robot, the present application provides a control method for a mobile robot. In mopping mode, the method accurately determines the edge of a no-mop area in the physical space where the robot is located, at least from images captured by the image capture device, so that the robot does not mop, and therefore does not wet, the no-mop area.
Here, the mobile robot is configured with at least one image capture device, i.e., a device that provides color images and/or depth images at a preset pixel resolution. A depth image represents the objects within the captured field of view by the depth data of each pixel, which comprises the pixel's position in the depth image and its pixel value. A depth image directly reflects the geometry of the visible surfaces of objects in the captured physical scene and can be converted into three-dimensional point-cloud data through a coordinate transformation. A color image represents the objects within the captured field of view by the color data of each pixel, comprising the pixel's position in the color image and its pixel value, where the pixel value is a single-channel or multi-channel color value, such as a grayscale value, an R, G, or B value, or an RGB value. Examples of color images include Bayer images, RGB images, and grayscale images.
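The coordinate transformation from a depth image to three-dimensional point-cloud data mentioned above is the standard inverse pinhole projection. A minimal sketch, assuming depth in metres and known camera intrinsics:

```python
import numpy as np

def depth_to_point_cloud(depth_m, fx, fy, cx, cy):
    """Convert a depth image (H x W, metres) to an N x 3 camera-frame cloud."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth_m / fx
    y = (v - cy) * depth_m / fy
    cloud = np.stack([x, y, depth_m], axis=-1).reshape(-1, 3)
    return cloud[cloud[:, 2] > 0]   # discard pixels with no valid depth reading
```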
Here, image capture devices include, but are not limited to: devices based on a CCD, devices based on a CMOS sensor, devices including a depth measurement unit, and devices integrating a depth measurement unit with an infrared sensor (such as a ToF collecting component).
The depth measurement unit captures depth information for the pixel points that form a two-dimensional surface. It may be an area-array or dot-matrix unit, including but not limited to: a lidar depth measurement unit, a binocular-stereo-vision unit, a time-of-flight unit, or a structured-light unit. For example, a depth measurement unit may comprise a light emitter and a light-receiving array: the emitter projects a specific light signal onto the object surface, the signal is reflected to the receiving array, and the array computes the depth information of the object from the change the object causes in the light signal.
The mobile robot performs the control method by means of a control system disposed therein. Referring to fig. 1, fig. 1 is a block diagram illustrating a hardware structure of a control system of a mobile robot according to an embodiment of the present invention. The control system 10 comprises storage means 11, interface means 12, and processing means 13.
The interface device 12 receives the images captured by the image capture device 20. Depending on the image capture devices actually configured on the mobile robot, the interface device 12 is connected to at least one image capture device 20 and reads the images, captured by each device 20, of the objects within that device's field of view. The interface device 12 also outputs control instructions for controlling the mobile robot; for example, it is connected to the driving motors of the side brush, the rolling brush, the mopping assembly, or the traveling mechanism, and outputs instructions that control their rotation. The control instructions are generated by the processing device 13 from the recognition result combined with the control strategy in the storage device 11, where a control strategy is control logic described by a program for execution by the processing device. For example, when the edge of a suspected no-mop area is confirmed, at least from the acquired image, to be the edge of a no-mop area, the processing device generates, according to the behavior control strategy, an instruction to stop the mopping operation and outputs it through the interface device 12 to the driving motor of the mopping assembly. The interface device 12 includes, but is not limited to, a serial interface such as HDMI or USB, or a parallel interface.
The storage device 11 stores at least one program for the processing device 13 to execute the control method of the mobile robot. The storage device 11 also stores the control strategies associated with no-mop areas, which are used to generate, for output through the interface device 12, control instructions based on the robot's recognition of a no-mop area. In practice the control strategies may include a movement control strategy, a behavior control strategy, and the like: the movement control strategy controls the robot's movement according to its real-time position relative to the confirmed edge of the no-mop area, while the behavior control strategy controls the robot's mopping behavior according to the no-mop area.
Here, the storage device 11 includes, but is not limited to: read-only memory (ROM), random-access memory (RAM), and non-volatile RAM (NVRAM). For example, the storage device 11 may include a flash-memory device or another non-volatile solid-state storage device. In certain embodiments, the storage device 11 may also include memory remote from the one or more processing devices 13, such as network-attached storage accessed via RF circuitry or external ports and a communications network, which may be the Internet, one or more intranets, local area networks, wide area networks, storage area networks, or a suitable combination thereof. A memory controller may control access to the memory by other components of the device, such as the CPU and peripheral interfaces.
The processing device 13 is connected to the interface device 12 and the storage device 11 and comprises one or more processors. It performs data read and write operations with the storage device 11 and carries out actions such as capturing images, temporarily storing image features, and matching image features. The processing device 13 may include one or more general-purpose microprocessors, one or more application-specific integrated circuits (ASICs), one or more digital signal processors (DSPs), one or more field-programmable gate arrays (FPGAs), or any combination thereof. It is also operatively coupled with I/O ports that enable the mobile robot to interact with various other electronic devices, and with input structures that let a user interact with the mobile robot, for example to perform a configuration operation such as entering a preset height value; the input structures may include buttons, keyboards, mice, touch pads, and the like. The other electronic devices include, but are not limited to, a motor in the robot's moving device, or a slave processor, such as a microcontroller unit (MCU), dedicated to controlling the moving device and the cleaning device.
Referring to fig. 2, fig. 2 is a flowchart of a control method of a mobile robot according to an embodiment of the present disclosure. The control method may be performed by the control system of the mobile robot shown in fig. 1; the processing device coordinates hardware such as the storage device and the interface device to execute the following steps.
In step S110, an image is acquired from the image capture device. Here, in mopping mode, the processing device acquires images captured by the robot's image capture device in real time or at a preset time interval.
So that the processing device can identify the edge of a no-mop area in the physical space from the image captured by the image capture device 20, the image should include the travel plane of the physical space where the mobile robot is located.
Accordingly, the mounting tilt of the image capture device may be any angle from 0° to 90°, where the angle is measured between the horizontal line along the robot's traveling direction and the main optical axis of the image capture device.
In one embodiment, the image capture device is mounted on the front face of the mobile robot in its traveling direction, with its optical axis parallel to the travel plane, so the angle is 0°.
In another embodiment, the image capture device is mounted on the robot's upper surface (i.e., the surface perpendicular to the traveling direction), with its optical axis perpendicular to the travel plane, so the angle is 90°; such a device is, for example, a top-view camera.
In a further embodiment, the image capture device is mounted on the robot's upper surface (i.e., the surface perpendicular to the traveling direction) but placed obliquely in a recessed structure, with the angle between its optical axis and the travel plane in the range of 10° to 80°; in a more preferred embodiment, that angle is in the range of 30° to 60°.
In yet another embodiment, the acquired image is captured by the image capture device (e.g., a ToF collecting component) while it rotates; rotating the device lets it cover a wider field of view. For example, the acquired image may be a single image taken during rotation, or it may be stitched from at least two images taken during rotation; with more of the obstacles in the physical space stitched into the acquired image, the processing device can identify the edge of a suspected no-mop area more easily. An obstacle is any object that the robot's image capture device can capture, such as a table, sofa, or bed, or any object at least part of whose edge is detected from the image by an edge-detection method. The relative spatial position between an obstacle and the mobile robot is determined in the same or a similar way as in step S120, described in detail below.
In an embodiment, when the processing device determines that an obstacle exists in the physical space where the mobile robot is located, it further controls the robot to rotate according to the relative spatial position between the obstacle and the robot, so as to obtain a complete image of the obstacle for step S120. Ways of determining that an obstacle exists include, but are not limited to, sensor detection and image recognition.
For example, while the mobile robot mops along a preset (zigzag) route, the image capture device remains stationary; but when an image region containing an incomplete obstacle is identified in a captured image, the processing device turns the image capture device toward the obstacle, either according to the position of the incomplete obstacle region in the image or according to the determined relative spatial position between the partially seen obstacle and the robot, so as to acquire a complete image of the obstacle.
For another example, when a detection sensor such as a lidar on the mobile robot detects an obstacle outside the current viewing angle of the image capture device, the processing device turns the image capture device toward the obstacle according to the detected azimuth, so as to acquire a complete image of it. With a complete image containing the object's edges, the processing device can identify the edge of a suspected no-mop area more accurately in step S120. Note that a complete image need not contain the entire obstacle; rather, as much information about the identified obstacle as possible is gathered within the preset rotation limit.
Note that the image capture device may be held still before rotating, or kept rotating continuously, so as to acquire at least one wider-field image and thereby locate a suspected no-mop area, or the boundary of a no-mop area, in advance.
Referring to fig. 3, a schematic structural diagram of the ToF collecting component in the present application: the processing device 13 is connected to a driving component 202 and controls it to drive the ToF collecting component 201 to rotate so as to obtain the image. Referring also to fig. 4, the driving component 202 comprises a movable member 2021 and a driver 2022.
Specifically, the movable member 2021 is connected to the ToF collecting component 201 and drives it when moved. The two can be joined by a fixed (positioning) connection or through a transmission structure. A fixed connection includes any one or more of a snap fit, riveting, bonding, and welding. In one fixed-connection example, shown in fig. 4, the movable member 2021 is a laterally rotatable driving rod, and the ToF collecting component 201 has a recessed hole (not shown) that form-fits the rod; as long as the cross-sections of the rod and the hole are non-circular, the ToF collecting component 201 rotates laterally with the rod. In one transmission example, the movable member is a lead screw: a connecting seat on the screw translates as the screw rotates, and because the seat is fixed to the ToF collecting component 201, the component moves with it. In other transmission examples, the ToF collecting component 201 and the movable member are linked through one or more of teeth, gears, racks, and toothed chains, so that the movable member drives the component.
Illustratively, the driver 2022 and the movable member 2021 may be integrated. For example, as shown in fig. 4, the driving component 202 may itself be a motor whose output shaft serves as the movable member 2021; the shaft rotates laterally and drives the ToF collecting component 201 sleeved on it to rotate laterally as well.
Referring to fig. 5, a schematic top view of the ToF collecting component disposed in a carrier according to an embodiment of the present disclosure: when the ToF collecting component 201 is mounted on the main body of the mobile robot through the carrier 102, the processing device controls the driving component 202 connected to it to drive the ToF collecting component 201 to rotate, so as to obtain the image required in step S110.
The processing device of the mobile robot may perform step S120 after obtaining the image including the travel plane.
In step S120, the edge of a suspected no-mop area in the physical space where the mobile robot is located is identified from the image, and the relative spatial position between that edge and the mobile robot is determined using at least the image.
In real life, a carpet, crawling mat, yoga mat, or other three-dimensional object with a no-mop property that a user lays on the travel plane in the physical space forms a no-mop area. For example, because of its material, a user does not want a carpet placed on the travel plane to be wetted, so the carpet constitutes a no-mop area; likewise a yoga mat, which the user wants to remain dry and usable for exercise. On this understanding, a three-dimensional object with a no-mop property placed on the travel plane constitutes a no-mop area; in other words, the no-mop area is the spatial region occupied by the object's volume. Its three-dimensional shape varies with the object placed on the travel plane and includes rectangular solids, cylinders, irregular shapes, and the like. The edge of the no-mop area is the intersection line between a side surface of the object and the travel plane, or between a side surface of the object and its upper surface; the curve shape of the edge may be a straight line, an arc, an irregular curve, etc. The side surfaces are surfaces perpendicular to, or at an angle to, the travel plane, and depending on the shape of the no-mop area, the planar shape of the upper surface may be circular, rectangular, irregular, etc.
The travel plane includes, but is not limited to: cement floors, painted floors, composite flooring, solid-wood flooring, etc.
The edge of a suspected no-mop area is the edge of a spatial region in the physical space that the processing device identifies from image features matching the properties of a no-mop area. Examples include the edge of a genuine no-mop area (e.g., the boundary formed by a carpet and the floor) and the edge of a false one (e.g., a floor seam, a step edge, or a threshold edge). From the acquired image, the edge of a suspected no-mop area in the physical space can be identified: in the image it has properties similar to those of the edge of a genuine no-mop area, for example a similar change in pixel values.
Specifically, during edge detection on the acquired image, the processing device can identify object edges from abrupt changes in depth, color, gray level, texture, and so on in the image; it then takes object edges corresponding to continuous and regular image features as edges of suspected no-mop areas. Continuous and regular image features, characterized by depth data or color data, include but are not limited to: continuous straight-line features, continuous arc features, and continuous regular-curve features. The edge-detection method applied to the acquired image depends on the type of image.
Taking a grayscale image as an example, the processing device performs edge detection on it to locate regions where the gray value changes abruptly, and thereby determines object edges in the physical space. For an RGB image, the image may be converted to grayscale and then edge-detected, or decomposed into R, G, and B channel images, with each channel edge-detected separately and the per-channel results combined to determine object edges in the physical space. Edge-detection methods for color images include, but are not limited to: differential-operator methods, adaptive-smoothing-filter methods, relaxation-iteration methods, neural-network methods, wavelet methods, and gray-correlation methods.
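As one concrete instance of the operators listed above, a differential-operator pipeline on a color frame might look like the following OpenCV sketch; the blur kernel and the Canny thresholds are illustrative choices, not values from the disclosure:

```python
import cv2

def detect_object_edges(bgr_image):
    """Locate abrupt gray-level changes in a color frame."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)   # RGB image -> grayscale
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)          # suppress floor-texture noise
    return cv2.Canny(blurred, 50, 150)                   # binary edge map
```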
Taking a depth image as an example, edge detection can be applied to it directly, locating regions where the pixel (depth) value changes abruptly and thereby determining object edges in the physical space. Edge-detection methods for depth images include, but are not limited to: scan-line iteration methods, bidirectional-curvature methods, and differential-operator methods.
On this understanding, object edges in the physical space can be identified by performing edge detection on the acquired image. In one embodiment, an object edge whose detected image features are continuous and regular can be taken directly as the edge of a suspected no-mop area, and the mobile robot is controlled to move toward it.
In another embodiment, the edge of a suspected no-mop area can be identified from the image using a preset height condition set relative to the robot's travel plane. Specifically, from an object edge whose image features are continuous and regular (determined as in the previous embodiment), a curve at a preset relative position from that edge is found; the height of the curve above the travel plane is then computed, and when the height satisfies the preset height condition, the object edge is taken as the edge of a suspected no-mop area and the robot is controlled to move toward it. The preset height condition serves to find objects of a certain height, filtering out edges such as floor seams. It is related to the height of the no-mop area: since no-mop areas are typically formed by objects such as carpets and yoga mats placed on the floor, the condition's range may be, without limitation, 0.5 cm to 2.5 cm, and it may also be set in practice, e.g., by the user according to the height of a carpet laid on the floor. The preset relative position serves to find a curve on the upper surface of the object corresponding to the continuous, regular object edge, e.g., a curve on the surface of the floor, on the upper surface of a carpet, or on the upper surface of a step. It comprises a preset distance and a preset azimuth: the preset azimuth is the azimuth, in actual physical space, between corresponding pairs of points on the edge and on the curve, and the preset distance is the distance between those corresponding points.
In a specific embodiment, the processing device determines the edge of a suspected no-mop area using a depth image. The processing device converts the depth data of the depth image into three-dimensional point cloud data and fits a traveling plane to it. From the fitted traveling plane, the edges of objects resting on the traveling plane can be determined; from such an edge, the curve at the preset relative position from the edge can be located; and from the three-dimensional point cloud data of the curve, the height of the curve above the traveling plane can be computed. The edge of the object whose curve satisfies the preset height condition is then taken as the edge of a suspected no-mop area. The height of the curve above the traveling plane may be obtained from the height of at least one point on the curve above that plane.
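A minimal numerical sketch of this step is given below, assuming the depth data has already been converted into an N x 3 point-cloud array. The 0.5 cm to 2.5 cm band is the example range given above; the function names and the choice of a least-squares fit are assumptions introduced for the example:

```python
import numpy as np

def fit_travel_plane(points: np.ndarray):
    """Least-squares plane fit to an (N, 3) point cloud: returns a unit
    normal n and offset d such that n . x + d = 0 on the plane."""
    centroid = points.mean(axis=0)
    # The right singular vector with the smallest singular value is the normal.
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]
    return normal, -normal.dot(centroid)

def curve_height(curve_points: np.ndarray, normal: np.ndarray, d: float) -> float:
    """Height of a curve above the fitted travel plane, taken here as the
    mean of the per-point plane distances (the text also allows a single
    point or the maximum)."""
    return float(np.abs(curve_points @ normal + d).mean())

def satisfies_height_condition(curve_points, normal, d,
                               low=0.005, high=0.025) -> bool:
    """Preset height condition from the description: 0.5 cm to 2.5 cm."""
    return low <= curve_height(curve_points, normal, d) <= high
```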
For example, referring to fig. 6, the processing device determines, from the pixel positions in the image of an edge a on the traveling plane, the pixel positions of a curve b that lies at a preset depth value (the preset distance) from edge a, and determines the distance of the curve from the traveling plane from the three-dimensional point cloud data of at least one point on curve b.
In one example, the processing device may determine the image region representing the traveling plane from the angle of the main optical axis of the image capturing device relative to the horizontal or vertical plane, and fit the traveling plane from the partial three-dimensional point cloud data of that region. For example, when the mobile robot captures an image looking down at the floor, the angle of the main optical axis relative to the vertical plane (the plane perpendicular to the traveling plane) is θ, and the storage device stores in advance the predicted image region corresponding to that angle (for example, the lower half of the image, or one tenth of the lower half). The processing device can then fit the traveling plane directly from the three-dimensional point cloud data of the predicted image region. Methods for fitting the three-dimensional point cloud data include: least squares, eigenvalue methods, and Random Sample Consensus (RANSAC), among others. Based on the fitted traveling plane, the distance from the remaining point cloud data to the plane can be computed to determine the height of the curve above the traveling plane. The processing device may determine, in this manner, the height of any single point on the curve above the traveling plane and take it as the height of the curve; it may also traverse multiple three-dimensional points on the curve to obtain the heights of multiple positions, and then take, for example, the maximum height or the mean height as the height of the curve above the traveling plane.
In another example, the processing device may obtain the partial three-dimensional point cloud data representing the traveling plane by clustering all of the point cloud data of the depth image, and fit the traveling plane from that partial data; the fitting methods are the same as or similar to those described above and are not repeated here. For example, suppose the fitted traveling plane is given by the equation Ax + By + Cz + D = 0, where A, B, C, and D are constants, A, B, and C are not simultaneously zero, and (x, y, z) are three-dimensional spatial coordinates. Based on the fitted traveling plane, the distance from the remaining point cloud data to the plane can be computed to determine the height of the curve above the traveling plane. As above, the processing device may take the height of any single point on the curve as the height of the curve, or traverse multiple three-dimensional points on the curve to obtain the heights of multiple positions and take the maximum height or the mean height as the height of the object's upper surface above the traveling plane.
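For a point (x0, y0, z0), the distance to the plane Ax + By + Cz + D = 0 is |Ax0 + By0 + Cz0 + D| / sqrt(A^2 + B^2 + C^2). The sketch below illustrates the RANSAC variant named above; the iteration count and the 1 cm inlier tolerance are assumptions:

```python
import numpy as np

def ransac_plane(points: np.ndarray, iters: int = 200,
                 inlier_tol: float = 0.01, rng=np.random.default_rng(0)):
    """RANSAC plane fit (one of the fitting methods named above).
    Returns (n, d) with unit normal n and plane n . x + d = 0;
    inlier_tol is an assumed 1 cm inlier threshold."""
    best_count, best_model = 0, None
    for _ in range(iters):
        sample = points[rng.choice(len(points), 3, replace=False)]
        n = np.cross(sample[1] - sample[0], sample[2] - sample[0])
        norm = np.linalg.norm(n)
        if norm < 1e-9:          # degenerate (collinear) sample
            continue
        n = n / norm
        d = -n.dot(sample[0])
        count = int((np.abs(points @ n + d) < inlier_tol).sum())
        if count > best_count:
            best_count, best_model = count, (n, d)
    return best_model
```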
In another specific embodiment, the processing device determines the edge of a suspected no-mop area using a color image together with a depth image, the pixel positions of which are in one-to-one correspondence. Following the embodiment of step S120 for determining relative spatial positions, the processing device may obtain the relative spatial position, with respect to the mobile robot, of an object edge whose image features are continuous and regular; determine, from the preset relative position, the relative spatial position between the corresponding curve and the mobile robot; locate the pixel positions of the curve in the color image from that relative spatial position; map them to pixel positions in the depth image via the one-to-one correspondence; and compute the height of the curve above the traveling plane from the curve's three-dimensional point cloud data.
When the height determined according to any of the foregoing embodiments falls within the range of the preset height condition, the processing device takes the object edge corresponding to the curve as the edge of a suspected no-mop area in the physical space in which the mobile robot is located.
Having determined the edge of the suspected no-mop area according to any of the above embodiments, the processing device determines, using at least the image, the relative spatial position between that edge and the mobile robot.
In an embodiment, the image is a color image, and the relative spatial position between the edge of the suspected no-mop area and the mobile robot is determined from the pixel positions of that edge in the color image together with the preset physical reference information described in the foregoing embodiments.
The physical reference information includes: the physical height of the image capturing device above the traveling plane, the physical parameters of the image capturing device, and the angle of the main optical axis of the image capturing device relative to the horizontal or vertical plane. Here, a technician measures in advance the distance between the imaging center of the image capturing device and the traveling plane and stores it in the storage device as the physical height, or as an initial value of the physical height; the physical height may also be computed in advance from the design parameters of the mobile robot. Likewise, the angle of the main optical axis relative to the horizontal or vertical plane, or its initial value, may be obtained from the design parameters of the mobile robot. For a mobile robot whose image capturing device is adjustable, the stored angle may be obtained by adding or subtracting the adjusted deflection angle to or from the initial angle, and the stored physical height by adding or subtracting the adjusted height to or from the initial physical height. The physical parameters of the image capturing device include the field of view and/or the focal length of the lens group, and the like.
After determining, from the acquired image, the edge of a suspected no-mop area in the physical space in which the mobile robot is located, the processing device determines, based on the physical reference information and the imaging principle, the relative spatial position between that edge and the mobile robot, the relative spatial position including the distance and the azimuth angle between the edge of the suspected no-mop area and the mobile robot. It should be noted that the edge of the suspected no-mop area may actually lie above the traveling plane (for example, the intersection of the side face and the upper surface of a carpet placed on the traveling plane), but when the relative spatial position is determined from a single color image, the edge is treated as lying in the traveling plane.
Referring to fig. 7, which is a schematic diagram of determining, based on the imaging principle, the relative spatial position between the edge of the suspected no-mop area and the mobile robot, the diagram involves three coordinate systems: the image coordinate system UO1V, the world coordinate system XO3Y, and the camera coordinate system with O2 as its origin. Suppose the edge of the suspected no-mop area in the physical space includes a point P, and the following are known: the physical height H of the image capturing device above the traveling plane; the distance O3M between the world point M corresponding to the image center and the origin O3 of the world coordinate system; the image coordinates O1 of the lens center; the measured image coordinates P1 of the pixel corresponding to P; the physical length and width of a pixel; and the focal length of the image capturing device. The length O3P can then be obtained by derivation, and from it the physical distance between the position of the mobile robot and the point P on the edge.
To determine the azimuth angle between the point P of the edge and the position of the mobile robot, the processing device computes it from a correspondence, stored in advance in the storage device, between each pixel of the image and an actual physical azimuth angle. Each pixel corresponds to one azimuth angle, which can be computed from parameters such as the pixel count, the focal length of the image capturing device, and the field of view.
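As a simplified sketch of this geometry (not the exact derivation behind fig. 7), a pinhole model with the camera at height H and its optical axis pitched down by an angle theta recovers a ground point's distance and azimuth from its pixel coordinates. All symbols below, including the intrinsics (cx, cy, fx, fy), are assumptions introduced for the example:

```python
import math

def ground_point_from_pixel(u, v, cx, cy, fx, fy, H, theta):
    """Recover the distance and azimuth of a travel-plane point P from
    its pixel (u, v), assuming a camera at height H (meters) whose main
    optical axis is tilted down by theta (radians) from the horizontal."""
    # Angle of the pixel's ray below the horizontal for image row v.
    pitch = theta + math.atan2(v - cy, fy)
    if pitch <= 0:
        raise ValueError("ray does not intersect the travel plane")
    forward = H / math.tan(pitch)            # distance along the ground
    azimuth = math.atan2(u - cx, fx)         # bearing from the camera axis
    distance = forward / math.cos(azimuth)   # range to the edge point P
    return distance, azimuth
```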
In the above manner, the processing device determines, from the preset physical reference information and the pixel positions of the edge of the suspected no-mop area in the image, the relative spatial position between at least one point on that edge and the mobile robot. The processing device may determine the relative spatial position of any single point on the edge; it may also traverse each pixel or feature point of the edge in the image to determine the relative spatial positions of multiple points on the edge, and from these derive, for example, the closest spatial position or the mean spatial position between the edge and the mobile robot. For example, the processing device determines, from the image positions of multiple points of the edge, the spatial position at which the edge of the suspected no-mop area is closest to the mobile robot, which facilitates timely adjustment of the robot's work and/or movement behavior.
In another embodiment, the image is a depth image, and the processing device may determine the relative spatial position between the edge of the suspected no-mop area and the mobile robot from the pixel values of that edge in the depth image.
For example, from the pixel values of the edge of the suspected no-mop area in the depth image and the pixel positions of the corresponding pixels, the physical distance and azimuth angle between the edge and the mobile robot, and hence the relative spatial position, can be obtained.
As another example, the azimuth angle and physical distance obtained from the depth image may be combined, according to respective weights, with measurement data from other measuring sensors of the mobile robot to obtain the relative spatial position. The other measuring sensors include angle sensors and distance sensors, such as a laser sensor integrating angle and distance measurement.
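The following sketch illustrates both examples; the intrinsics and the fusion weights are assumptions, and the depth value is taken here as the sensor-reported range to the pixel's point:

```python
import math

def position_from_depth_pixel(u, depth_m, cx, fx):
    """Distance and azimuth of an edge point from its column u in the
    depth image and its depth value; cx and fx are assumed calibration
    intrinsics (principal point and focal length in pixels)."""
    azimuth = math.atan2(u - cx, fx)   # bearing of the pixel's ray
    return depth_m, azimuth

def fuse(depth_estimate: float, laser_estimate: float,
         w_depth: float = 0.4, w_laser: float = 0.6) -> float:
    """Weighted combination of a depth-image estimate with a laser
    sensor reading, per the example above; the weights are assumed."""
    return w_depth * depth_estimate + w_laser * laser_estimate
```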
Since the edge of a suspected no-mop area identified from the image may in fact be the edge of a falsely suspected area (for example, a floor seam and a carpet on the floor both present edge features in the image), after obtaining the relative spatial position between the mobile robot and the edge of the suspected no-mop area, the processing device executes step S130 to confirm that the suspected no-mop area is indeed a no-mop area.
In step S130, when it is determined from the relative spatial position that the mobile robot has moved to the vicinity of the edge of the suspected no-mop area, the suspected no-mop area is confirmed to be a no-mop area by detecting a driving signal within the mobile robot.
After the relative spatial position is determined, the processing device controls the mobile robot to move, according to that position, to the vicinity of the edge of the suspected no-mop area, and there detects a driving signal within the mobile robot. The driving signal reflects the operating condition of a motor in the mobile robot and may be expressed as a voltage signal or a current signal.
Because the surface material of a no-mop area is rougher than that of the traveling plane, it affects the working assembly that performs the sweeping operation and/or the working assembly that performs the moving operation. Taking a cleaning robot as an example, when it mops on a carpet, the carpet's upper surface impedes the movement of the traveling mechanism and the rotation of the side brush or rolling brush; to keep the side brush, rolling brush, and traveling mechanism operating normally, the corresponding motor increases the amplitude of its driving signal to maintain its output power. Accordingly, the driving signal is collected from the working assembly performing the sweeping operation and/or the working assembly performing the moving operation, and the processing device confirms that the suspected no-mop area is a no-mop area based on changes in the collected driving signal.
The working assembly that performs the sweeping operation comprises a side brush and a side-brush motor located at the bottom of the mobile robot's housing, the side-brush motor controlling the side brush, and a rolling brush and a rolling-brush motor, the rolling-brush motor controlling the rolling brush; the sweeping assembly is not limited thereto. In an embodiment, when the mobile robot has moved, according to the relative spatial position, to the edge of the suspected no-mop area, the processing device detects the driving signal in the side-brush motor or the rolling-brush motor there, and determines from the detected value whether the suspected no-mop area is a no-mop area. In another embodiment, the processing device determines, from the length of the side brush and the relative spatial position, the position at which the side brush can touch the edge of the suspected no-mop area, and begins detecting the driving signal in the side-brush motor when the robot reaches that position. In other embodiments, to avoid the mobile robot entering the suspected no-mop area and mopping there because of errors in the computed relative spatial position, the driving signal of the sweeping operation may be detected continuously throughout the robot's movement. It should be noted that, while mopping on the traveling plane, the sweeping operation may be performed only when the robot moves to the vicinity of the edge of the suspected no-mop area, or the mopping and sweeping operations may be performed simultaneously on the traveling plane.
Alternatively, the processing device detects, at the edge of the suspected no-mop area, the driving signal in the drive motor that drives the traveling mechanism.
After acquiring the detected value of the driving signal, the processing device confirms that the suspected no-mop area is a no-mop area when a preset condition is satisfied between a reference value and the detected value of the driving signal.
The reference value may be a reference value of a voltage signal or of a current signal, and is either a preset value or a value determined before the mobile robot moves to the edge of the suspected no-mop area. For example, a technician presets different values according to the type of traveling plane (for example, different preset values for a wooden floor and for a glass surface), and a different reference value is selected according to the material of the traveling plane when the user performs the initial configuration. As another example, the reference value is obtained by statistically processing multiple detected values of the driving signal; the statistical processing may include taking the mean of the detected values as the reference value, taking their median as the reference value, taking the most frequent detected value as the reference value, and the like. As yet another example, a driving-signal value detected before the mobile robot moves to the edge of the suspected no-mop area is used as the reference value. Further, to avoid both starting the reference measurement only upon arriving near the edge of the no-mop area and misjudging the suspected no-mop area, the mobile robot may measure the reference value at any position in the first half of its route toward the edge of the suspected no-mop area.
The preset condition is set in advance according to the material of the no-mop area and the material of the traveling plane. For example, the preset condition may be that the difference between the two values is not less than a preset threshold.
Accordingly, when the difference between the reference value and the detected value of the driving signal satisfies the preset condition, the edge of the suspected no-mop area is confirmed to be the edge of a no-mop area, that is, the suspected no-mop area is a no-mop area. For example, if the absolute value of the difference is at or above the preset threshold, the edge of the suspected no-mop area may be confirmed to be the edge of a no-mop area.
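A minimal sketch of this decision, assuming the driving signal is sampled as a motor current; the median choice is one of the statistical options named above, and the 0.15 A threshold is an assumed example value:

```python
from statistics import median

def reference_from_history(samples_a: list[float]) -> float:
    """Reference value taken as the median of driving-signal samples
    collected before reaching the suspected edge (one of the statistical
    choices described above)."""
    return median(samples_a)

def is_no_mop_area(reference_a: float, detected_a: float,
                   threshold_a: float = 0.15) -> bool:
    """Preset condition from the description: the absolute difference
    between reference and detected values is not less than a threshold."""
    return abs(detected_a - reference_a) >= threshold_a
```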
To prevent the mopping assembly from contacting the suspected no-mop area while the driving signal is being detected, the processing device reduces the moving speed of the mobile robot when it determines that the robot is approaching the edge of the suspected no-mop area, so as to avoid running onto the suspected no-mop area.
In some embodiments, after the edge of the no-mop area is determined, information describing that edge is also stored in the storage device. Such information includes, but is not limited to: image features of the no-mop area edge determined from the acquired image, and map data determined from the acquired image and the relative spatial position. The map data includes, but is not limited to: the position in the map of at least part of the no-mop area edge captured in the image, the positions in the map of landmark points such as the endpoints of that partial edge, and the curvature of the edge line formed in the map by those landmark points.
After confirming that the suspected no-mop area is a no-mop area, the processing device executes step S140.
In step S140, at least one behavior operation of the mobile robot in the mopping mode is adjusted according to the determined no-mop area.
That is, after the suspected no-mop area is confirmed to be a no-mop area, at least one behavior operation of the mobile robot in the mopping mode is adjusted to prevent the robot from performing the mopping operation within the no-mop area.
In one embodiment, the processing device controls the mobile robot to change its moving direction and continue the mopping operation and/or the sweeping operation. For example, the processing device re-plans the moving route based on the identified edge of the no-mop area so as to perform the mopping operation, the sweeping operation, or a combination thereof, thereby ensuring that the mobile robot does not enter the no-mop area.
In another embodiment, when the mobile robot is near the edge of the no-mop area, it is controlled to turn off the mopping operation of the mopping mode before entering the no-mop area. For example, the robot is controlled to keep its planned navigation route, and while traversing the no-mop area the position of its mopping assembly is changed so that no mopping is performed there. Changing the position of the mopping assembly includes lifting the mopping assembly to a preset height.
In some embodiments, the mobile robot further comprises an alarm device connected to the processing device for issuing an alarm message when the processing device confirms that the suspected no-mop area is a no-mop area. With the alarm device, notice that a no-mop area has been found can be issued immediately, so that the user knows in which area the mopping operation is not being performed.
In summary, by acquiring images from the image capturing device during the mopping operation and accurately identifying the no-mop area at least from those images, the mobile robot effectively overcomes the prior-art problem of failing to recognize carpets and the like; and, based on the position of the no-mop area determined from the image, at least one behavior operation of the robot in the mopping mode can be adjusted so that the robot does not perform the mopping operation within the no-mop area.
In practical applications, the storage device is further configured to store information describing the edge of the no-mop area. Such information includes: the image corresponding to the edge of the no-mop area, the image features of that edge, and the map data corresponding to that edge. The map data includes, but is not limited to: the position of the no-mop area edge in the map, the positions of its endpoints in the map, the curvature of each point on the edge in the map, and the like.
For example, when the mobile robot performs the mopping operation along an I-shaped route, the processing device identifies the edge of a suspected no-mop area from the image acquired in the mopping mode; after confirming, from the driving signal detected in the vicinity of that edge, that it is the edge of a no-mop area, the processing device further stores in the storage device information describing the edge of the no-mop area, such as its image features or its map data. During subsequent movement, if the edge of a suspected no-mop area is identified in an acquired image, the suspected no-mop area can be confirmed to be a no-mop area merely by matching the information describing its edge against the pre-stored information describing the no-mop area edge, without controlling the mobile robot to move to the edge to detect the driving signal.
As another example, when configuring the mobile robot, the user controls its image capturing device to capture an image containing the edge of a no-mop area; for example, during initial configuration of the no-mop-area recognition mode, the user controls the robot to capture an image while it travels directly in front of a carpet, and the robot takes the captured image as the image corresponding to the edge of the no-mop area.
As yet another example, when the mobile robot is controlled to capture an image containing the edge of a no-mop area, it further determines, from the image and the physical reference information, the relative spatial position between itself and that edge, and from this determines the map data describing the edge of the no-mop area.
Referring to fig. 8, which is a flowchart of a control method of a mobile robot according to another embodiment of the present application, the control method includes steps S210, S220, S230, and S240, and is applied to the control system 10 in which information describing the edge of a no-mop area is stored.
In step S210, an image is acquired from the image capturing device. Step S210 is the same as or similar to step S110 and is not described in detail here.
In step S220, the edge of a suspected no-mop area in the physical space in which the mobile robot is located is identified from the image, and the relative spatial position between that edge and the mobile robot is determined using at least the image. Step S220 is the same as or similar to step S120 and is not described in detail here.
In step S230, the information describing the edge of the suspected no-mop area is matched against pre-stored information describing the edge of the no-mop area to determine that the edge of the suspected no-mop area is the edge of a no-mop area; the information describing the edge of the suspected no-mop area is determined from the acquired image and/or the relative spatial position.
Specifically, after identifying the edge of the suspected no-mop area and computing the relative spatial position between that edge and the mobile robot, the processing device determines, from the information describing the edge, that it is the edge of a no-mop area, and then controls the mobile robot based on the relative spatial position and the determined edge; this reduces the number of times the robot must confirm each suspected no-mop area edge using the driving signal.
In an embodiment, the information describing the edge of the suspected no-mop area includes image features determined from the acquired image. The processing device matches the image features describing the edge of the suspected no-mop area against pre-stored image features describing the edge of the no-mop area, and when the two sets of features match, determines that the edge of the suspected no-mop area is the edge of a no-mop area. The image features are characterized by depth data or color data, feature lines, feature points, and combinations thereof that match the shape or contour of the object; the image features of the edge of the suspected no-mop area and of the no-mop area edge in the image are continuous and regular, for example continuous long straight features, continuous arc features, or continuous regular curve features.
For example, after the image features of the no-mop area edge determined according to the foregoing embodiments are stored in the storage device, the image features of the currently acquired suspected no-mop area edge are matched against them; when the two are found to contain mutually matching features, the edge of the suspected no-mop area may be determined to be the edge of a no-mop area. When only parts of the two feature sets match, the edge of the suspected no-mop area and the previously determined no-mop area edge have an overlapping portion in the actual physical space.
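As an illustrative sketch of such feature matching (the description does not mandate a specific descriptor), the example below uses ORB keypoint descriptors; the acceptance threshold min_matches is an assumption:

```python
import cv2

def edge_features_match(img_now, img_stored, min_matches: int = 25) -> bool:
    """Match image features of the currently observed edge against the
    stored edge features using ORB descriptors and brute-force Hamming
    matching; min_matches is an assumed acceptance threshold."""
    orb = cv2.ORB_create()
    _, des_now = orb.detectAndCompute(img_now, None)
    _, des_old = orb.detectAndCompute(img_stored, None)
    if des_now is None or des_old is None:
        return False                       # no features found in one image
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(des_now, des_old)
    return len(matches) >= min_matches
```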
In another embodiment, the information describing the edge of the suspected no-mop area includes map data determined from the image and the relative spatial position. The processing device matches the map data describing the edge of the suspected no-mop area against pre-stored map data describing the edge of the no-mop area to determine that the edge of the suspected no-mop area is the edge of a no-mop area. The map data includes: the position of the no-mop area edge in the map, the curvature of the edge in the map, and the like. The positions in the map of multiple points on the edge form a curve, and the curvature comprises the curvature at each point of that curve.
For example, when the position in the map of an endpoint of the suspected no-mop area edge lies within a preset distance of the position in the map of an endpoint of the stored no-mop area edge, the mobile robot extends the map data of the no-mop area edge according to its curvature until the extension intersects the map data of the suspected no-mop area edge, so that the two edges together describe the same no-mop area in the map. The preset distance may be 0 cm to 5 cm, although it is not limited thereto and may be set differently in practice; for example, the user may set it according to how objects are placed in the indoor environment.
As another example, when an endpoint of the suspected no-mop area edge lies, in the map, within the preset distance of a stored endpoint of the no-mop area edge, and the two edges have the same or similar curvature at those endpoints, the suspected no-mop area edge may be determined to be a continuation of the no-mop area edge.
As another example, when the position in the map of an endpoint of the suspected no-mop area edge at least partially overlaps the stored position of the no-mop area edge in the map, the two edges may be determined to have an overlapping portion.
As yet another example, the processing device may compute the curvatures of multiple points on the suspected no-mop area edge; when the number of those curvatures that are the same as or similar to the curvatures of points on the pre-stored no-mop area edge meets a threshold, the two edges may be determined to have an overlapping portion.
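The map-data checks above can be sketched as follows; the 0 cm to 5 cm endpoint gap comes from the description, while the curvature tolerance and the required fraction of similar curvatures are assumptions:

```python
import numpy as np

def edges_describe_same_area(endpoints_new, endpoints_stored,
                             curv_new, curv_stored,
                             max_gap=0.05, curv_tol=0.1, min_frac=0.6) -> bool:
    """Decide whether a suspected edge and a stored no-mop area edge
    describe the same area: endpoints within the preset distance (0-5 cm
    per the text) and a sufficient fraction of similar curvatures.
    curv_tol and min_frac are assumed example values."""
    gap = min(np.linalg.norm(a - b)
              for a in endpoints_new for b in endpoints_stored)
    if gap > max_gap:
        return False
    similar = sum(1 for c in curv_new
                  if any(abs(c - s) <= curv_tol for s in curv_stored))
    return similar / max(len(curv_new), 1) >= min_frac
```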
After the edge of the no-mop area in the physical space is determined according to the foregoing steps, the processing device may execute step S240.
In step S240, at least one behavior operation or moving operation of the mobile robot in the mopping mode is adjusted according to the determined edge of the no-mop area and the relative spatial position.
In one embodiment, when it is determined from the relative spatial position that the mobile robot has moved to the vicinity of the edge of the no-mop area, the robot is controlled to change its moving direction and continue the mopping operation and/or the sweeping operation.
For example, when the mobile robot's current mopping route would take it into the determined no-mop area, then, as the robot moves along that route to the vicinity of the no-mop area edge according to the relative spatial position, the processing device controls it to change its moving direction and continue the mopping operation, the sweeping operation, or a combination thereof, thereby ensuring that the robot does not perform the mopping operation in the no-mop area.
As another example, the mobile robot may determine the position of the no-mop area edge in the map from its own position in the map and the relative spatial position, so that during subsequent mopping, when it identifies a suspected no-mop area edge that corresponds to that map position, it need not move to the edge again for confirmation and can be controlled directly as described above.
As yet another example, the processing device may re-plan a route based on the relative spatial position, the mobile robot moving along the re-planned route, which instructs the robot to change its moving direction when it reaches the vicinity of the no-mop area edge, so as to ensure that the robot does not perform the mopping operation in the no-mop area.
In another embodiment, where the mopping assembly of the mobile robot can be lifted, when it is determined from the relative spatial position that the robot has moved to the vicinity of the no-mop area edge, the robot is controlled to turn off the mopping operation of the mopping mode before entering the no-mop area. For example, the robot keeps its planned mopping route, and while traversing the no-mop area the position of its mopping assembly is changed so that no mopping is performed there; changing the position of the mopping assembly includes lifting the mopping assembly to a preset height and turning off the spraying device or the sprinkling device.
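The two behavior adjustments can be summarized in the sketch below; the robot object and all of its methods are hypothetical names introduced for illustration, and the 2 cm lift height is an assumed example value, not a value from this description:

```python
def on_no_mop_edge(robot, can_lift_mop: bool) -> None:
    """Sketch of the behavior adjustments described above when the robot
    reaches the vicinity of a no-mop area edge (all methods assumed)."""
    if can_lift_mop:
        # Keep the planned route; cross the area without mopping it.
        robot.lift_mop_assembly(height_m=0.02)
        robot.turn_off_spraying()
    else:
        # Change direction and continue mopping/sweeping outside the area.
        robot.change_direction_and_continue()
```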
In some embodiments, the mobile robot further comprises an alarm device connected to the processing device for issuing an alarm message when the processing device confirms that the suspected no-mop area is a no-mop area. With the alarm device, notice that a no-mop area has been found can be issued immediately, so that the user may, if desired, forcibly change the robot's moving route.
In summary, once the storage device holds predetermined information describing the edge of a no-mop area, a mobile robot performing the mopping operation in the physical space can confirm that the edge of a suspected no-mop area is the edge of a no-mop area simply by matching the information describing the suspected edge against the pre-stored information, without moving to the vicinity of the suspected edge to detect a driving signal; this avoids the robot having to travel to every suspected no-mop area edge to determine whether it is a no-mop area edge.
Based on the control method of the mobile robot shown in fig. 2, the present application also provides a mobile robot. Referring to fig. 9, a schematic structural diagram of the mobile robot in an embodiment of the present application, the mobile robot includes a storage device 11, an image capturing device 20, a processing device 13, a cleaning device 40, and a moving device 50.
The storage device 11 and the processing device 13 may correspond to the storage device and the processing device of the control system 10 mentioned in connection with fig. 1 and are not described in detail here. The processing device 13 is connected to the image capturing device 20, the moving device 50, and the cleaning device 40 through the interface device 12 of the control system 10.
The image capturing device 20 is configured to capture images, each being a color image and/or a depth image. The image capturing device 20 for obtaining depth images and color images, and its assembly, are the same as or similar to those described above and are not detailed here.
The moving device 50 is connected to the processing device 13 and performs moving operations under its control. In practical embodiments, the moving device 50 may include a traveling mechanism and a driving mechanism, the traveling mechanism being disposed at the bottom of the mobile robot and the driving mechanism within its housing. The traveling mechanism may employ traveling wheels: in one implementation, it may include at least two universal wheels, which realize forward, backward, turning, and rotating movements; in other implementations, it may comprise two straight-traveling wheels combined with at least one auxiliary steering wheel, the two straight-traveling wheels being used mainly for forward and backward travel when the auxiliary steering wheel is not engaged, while turning, rotating, and similar movements are realized when the auxiliary steering wheel is engaged together with the two straight-traveling wheels. The driving mechanism may be, for example, a drive motor used to drive the traveling wheels of the traveling mechanism; in a specific implementation, the drive motor may be a reversible motor, and a transmission mechanism may further be provided between the drive motor and the axle of the traveling wheel.
The working process of the mobile robot is as follows: the processing device 13 performs edge detection on the acquired image to identify the edge of a suspected no-mop area in the physical space in which the robot is located and determines, using at least the image, the relative spatial position between that edge and the robot; according to the identified edge and its relative spatial position, the processing device 13 sends the moving device 50 a movement control instruction containing a direction and a movement distance, or a direction and a movement speed, so that the moving device 50 drives the robot as a whole according to the instruction, executing a moving operation toward the edge of the suspected no-mop area and detecting the driving-signal value in the driving mechanism.
The cleaning device 40 includes a mopping assembly (not shown) for performing mopping operations under control. The mopping assembly comprises a mop pad, a mop pad carrier, a spraying device, a sprinkling device, and the like, and is used to perform the mopping operation under control in the mopping mode.
For example, while the mobile robot has not reached the edge of a no-mop area, it performs the mopping operation on the traveling plane.
As another example, the working process of the mobile robot is as follows: the processing device 13 performs edge detection on the acquired image to identify the edge of a suspected no-mop area in the physical space in which the robot is located and determines, from at least the acquired image, the relative spatial position between that edge and the robot; according to the identified edge and its relative spatial position, the processing device 13 sends the moving device 50 a movement control instruction containing a direction and a movement distance, or a direction and a movement speed, so that the moving device 50 drives the robot as a whole according to the instruction; the mopping assembly is controlled to continue the mopping operation while the robot moves toward the edge of the suspected no-mop area; the robot is controlled to execute a moving operation at that edge while the driving-signal value in the driving mechanism is detected; and when it is determined from the driving-signal value that the edge belongs to a no-mop area, the processing device 13 either controls the robot to turn off the mopping operation of the mopping mode (for example, turning off the sprinkling device and lifting the mop pad and mop pad carrier) before entering the no-mop area, or controls the robot to change its moving direction and continue the mopping operation.
In one embodiment, the cleaning device 40 further comprises a sweeping assembly (not shown) for performing sweeping operations under control. The sweeping assembly may include side brushes and side-brush motors located at the bottom of the housing, the side-brush motors controlling the side brushes, and a rolling brush and a rolling-brush motor, the rolling-brush motor controlling the rolling brush. There may be at least two side brushes, arranged symmetrically on opposite sides of the front end of the robot's housing; the side brushes may be rotary side brushes rotating under the control of the side-brush motors. The rolling brush is located in the middle of the bottom of the mobile robot and rotates under the control of the rolling-brush motor to clean, sweeping debris toward the collection inlet, through which it is conveyed into the dust collection assembly. The dust collection assembly may comprise a dust chamber arranged in the housing and a fan providing suction to draw the debris into the dust chamber. The cleaning device 40 is not limited thereto.
When the cleaning device 40 includes a sweeping assembly, the moving device 50 drives the robot as a whole, according to the movement control instruction, to the vicinity of the edge of the suspected no-mop area, and the sweeping assembly in the cleaning device 40 may further be controlled to perform the sweeping operation so that the driving-signal value in the side-brush motor or the rolling-brush motor can be detected; when the processing device 13 determines from that value that the edge of the suspected no-mop area is the edge of a no-mop area, it controls the moving device 50 to change the robot's moving direction and continue the mopping and sweeping operations.
When the storage device 11 stores in advance information describing the edge of a no-mop area, the working process of the mobile robot is as follows: the processing device 13 performs edge detection on the acquired image to identify the edge of a suspected no-mop area in the physical space in which the robot is located, determines, using at least the image, the relative spatial position between that edge and the robot, and matches the information describing the edge of the suspected no-mop area against the pre-stored information describing the no-mop area edge to determine that the suspected edge is the edge of a no-mop area; the processing device 13 then controls the traveling mechanism so that, when the robot moves to the vicinity of the no-mop area edge along its preset mopping route, the processing device 13 controls the moving device 50 to change the moving direction and continue controlling the mopping assembly to perform the mopping operation, the sweeping assembly to perform the sweeping operation, or a combination thereof, thereby ensuring that the robot does not perform the mopping operation in the no-mop area.
The present application also provides a computer-readable and writable storage medium storing at least one program that, when invoked, executes and implements at least one embodiment described above with respect to the control method shown in fig. 2 or at least one embodiment described with respect to the control method shown in fig. 8.
The functions, if implemented in the form of software functional units and sold or used as a stand-alone product, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application or portions thereof that substantially contribute to the prior art may be embodied in the form of a software product, which is stored in a storage medium and includes instructions for enabling a mobile robot equipped with the storage medium to perform all or part of the steps of the method according to the embodiments of the present application.
In the embodiments provided herein, the computer-readable and writable storage medium may include read-only memory, random-access memory, EEPROM, CD-ROM or other optical disk storage, magnetic disk storage or other magnetic storage devices, flash memory, a USB flash drive, a removable hard disk, or any other medium that can be used to store desired program code in the form of instructions or data structures and that can be accessed by a computer. Also, any connection is properly termed a computer-readable medium. For example, if the instructions are transmitted from a website, server, or other remote source using a coaxial cable, fiber optic cable, twisted pair, Digital Subscriber Line (DSL), or wireless technologies such as infrared, radio, and microwave, then the coaxial cable, fiber optic cable, twisted pair, DSL, or wireless technologies such as infrared, radio, and microwave are included in the definition of medium. It should be understood, however, that computer-readable and writable storage media and data storage media do not include connections, carrier waves, signals, or other transitory media, but are intended to be non-transitory, tangible storage media. Disk and disc, as used in this application, include compact disc (CD), laser disc, optical disc, digital versatile disc (DVD), floppy disk, and Blu-ray disc, where disks usually reproduce data magnetically, while discs reproduce data optically with lasers.
In one or more exemplary aspects, the functions described in the computer program of the methods described herein may be implemented in hardware, software, firmware, or any combination thereof. When implemented in software, the functions may be stored on or transmitted over as one or more instructions or code on a computer-readable medium. The steps of a method or algorithm disclosed herein may be embodied in a processor-executable software module, which may be located on a tangible, non-transitory computer-readable and/or writable storage medium. Tangible, non-transitory computer readable and writable storage media may be any available media that can be accessed by a computer.
The flowcharts and block diagrams in the figures described above of the present application illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present application. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems which perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The above embodiments are merely illustrative of the principles and utilities of the present application and are not intended to limit the application. Any person skilled in the art can modify or change the above-described embodiments without departing from the spirit and scope of the present application. Accordingly, it is intended that all equivalent modifications or changes which can be made by those skilled in the art without departing from the spirit and technical concepts disclosed in the present application shall be covered by the claims of the present application.

Claims (26)

1. A control method of a mobile robot comprising an image capturing device, the control method comprising:
acquiring an image from the image capturing device;
identifying, from the image, the edge of a suspected no-mop area in the physical space in which the mobile robot is located, and determining, using at least the image, the relative spatial position between the edge of the suspected no-mop area and the mobile robot;
when it is determined from the relative spatial position that the mobile robot has moved to the vicinity of the edge of the suspected no-mop area, determining that the suspected no-mop area is a no-mop area by detecting a driving signal in the mobile robot;
adjusting at least one behavior operation of the mobile robot in a mopping mode according to the determined no-mop area;
the control method further comprising: acquiring a new image during subsequent movement of the mobile robot after the no-mop area is determined; and identifying the edge of a new suspected no-mop area from the new image;
and matching the identified edge of the new suspected no-mop area against the edge of the determined no-mop area, and, when the two match, determining that the new suspected no-mop area is a no-mop area and adjusting at least one behavior operation of the mobile robot in the mopping mode.
2. The method of controlling a mobile robot according to claim 1, wherein the image is a color image and/or a depth image.
3. The control method according to claim 1, wherein the image is a color image, and the step of determining, using at least the image, the relative spatial position between the edge of the suspected no-mop area and the mobile robot comprises:
determining the relative spatial position between the edge of the suspected no-mop area and the mobile robot according to the pixel position of that edge in the color image and preset physical reference information.
4. The control method according to claim 3, wherein the physical reference information includes: the physical height of the image capturing device above the traveling plane, the physical parameters of the image capturing device, and the angle of the main optical axis of the image capturing device relative to the horizontal or vertical plane.
5. The control method according to claim 1, wherein the image is a depth image, and the step of determining, using at least the image, the relative spatial position between the edge of the suspected no-mop area and the mobile robot comprises:
determining the relative spatial position between the edge of the suspected no-mop area and the mobile robot according to the pixel values of that edge in the depth image.
6. The control method according to claim 1, wherein the step of determining that the suspected no-mop area is a no-mop area by detecting a driving signal in the mobile robot comprises: determining that the suspected no-mop area is a no-mop area when a preset condition is satisfied between a reference value and the detected value of the driving signal.
7. The control method according to claim 1 or 6, wherein the driving signal is collected from a working assembly of the mobile robot performing a sweeping operation and/or a working assembly of the mobile robot performing a moving operation.
8. The control method according to claim 6, wherein the reference value is a preset value or is determined before the mobile robot moves to the edge of the suspected no-mop area.
9. The control method according to claim 1, wherein the step of adjusting at least one behavior operation of the mobile robot in the mopping mode according to the determined no-mop area comprises at least one of:
controlling the mobile robot to change its moving direction so as to continue the mopping operation and/or a sweeping operation;
controlling the mobile robot to turn off the mopping operation in the mopping mode so as to enter the no-mop area.
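
The two reactions of claim 9 can be read as a small decision rule gated on proximity to the confirmed edge. The 10 cm "vicinity" radius and the configuration flag below are assumptions:

```python
from enum import Enum, auto

class MopAction(Enum):
    CONTINUE = auto()
    TURN_AWAY = auto()             # change direction, keep mopping outside
    LIFT_MOP_AND_ENTER = auto()    # disable mopping, then enter the area

def action_near_edge(distance_m, near_m=0.10, enter_allowed=False):
    """React only within an assumed 10 cm vicinity of the no-mop edge;
    which of the two claim-9 reactions applies is a configuration flag."""
    if distance_m > near_m:
        return MopAction.CONTINUE
    return MopAction.LIFT_MOP_AND_ENTER if enter_allowed else MopAction.TURN_AWAY

print(action_near_edge(0.05))                       # MopAction.TURN_AWAY
print(action_near_edge(0.05, enter_allowed=True))   # MopAction.LIFT_MOP_AND_ENTER
```
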
10. The control method according to claim 1, wherein the step of identifying, from the image, the edge of the suspected no-mop area in the physical space in which the mobile robot is located comprises:
identifying the edge of the suspected no-mop area from the image by using a preset height condition set based on the traveling plane of the mobile robot.
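
One plausible reading of the height condition in claim 10, assuming a per-pixel height map (for example, derived from a calibrated depth camera) and an assumed 5-30 mm band typical of a carpet-like step:

```python
import numpy as np

def edge_by_height_condition(height_map, min_h=0.005, max_h=0.03):
    """Flag pixels whose height above the travel plane lies in a preset
    band, then keep only the boundary of that region as the suspected
    edge. height_map is an HxW array of heights in metres."""
    in_band = (height_map >= min_h) & (height_map <= max_h)
    # Boundary pixels: in band, with at least one out-of-band 4-neighbour.
    has_outside_neighbour = np.zeros_like(in_band)
    has_outside_neighbour[1:, :] |= ~in_band[:-1, :]
    has_outside_neighbour[:-1, :] |= ~in_band[1:, :]
    has_outside_neighbour[:, 1:] |= ~in_band[:, :-1]
    has_outside_neighbour[:, :-1] |= ~in_band[:, 1:]
    return in_band & has_outside_neighbour

heights = np.zeros((5, 8))
heights[:, 4:] = 0.01          # a 1 cm step starting at column 4
print(edge_by_height_condition(heights).astype(int))   # edge at column 4
```
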
11. The control method according to claim 1, further comprising:
matching information describing the edge of the suspected no-mop area against pre-stored information describing the edge of a no-mop area, to determine that the edge of the suspected no-mop area is the edge of a no-mop area; wherein the information describing the edge of the suspected no-mop area is determined from the acquired image and/or the relative spatial position;
and adjusting at least one behavior operation or moving operation of the mobile robot in the mopping mode according to the determined edge of the no-mop area and the relative spatial position.
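
Claim 11 leaves the matching criterion open; one simple geometric realization compares the two edges as point sets in map coordinates. The point-set representation and the 5 cm tolerance are assumptions:

```python
import numpy as np

def edges_match(candidate_pts, stored_pts, tol_m=0.05):
    """Match a newly identified edge against a stored no-mop edge via a
    symmetric mean nearest-neighbour distance between their points."""
    a = np.asarray(candidate_pts, dtype=float)
    b = np.asarray(stored_pts, dtype=float)
    d = np.linalg.norm(a[:, None, :] - b[None, :, :], axis=2)
    forward = d.min(axis=1).mean()       # candidate -> stored
    backward = d.min(axis=0).mean()      # stored -> candidate
    return max(forward, backward) < tol_m

stored = [(0.00, 1.00), (0.10, 1.00), (0.20, 1.00)]
seen = [(0.02, 1.01), (0.12, 0.99), (0.21, 1.02)]
print(edges_match(seen, stored))         # True
```
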
12. The control method according to claim 1, wherein the acquired image is captured by controlling the image pickup device to rotate.
13. A control method of a mobile robot comprising an image pickup device, the control method comprising:
acquiring an image from the image pickup device;
identifying, from the acquired image, the edge of a suspected no-mop area in the physical space in which the mobile robot is located; and
determining the relative spatial position between the edge of the suspected no-mop area and the mobile robot by using at least the acquired image;
matching information describing the edge of the suspected no-mop area against pre-stored information describing the edge of a no-mop area, to determine that the edge of the suspected no-mop area is the edge of a no-mop area; wherein the information describing the edge of the suspected no-mop area is determined from the acquired image and/or the relative spatial position, and the pre-stored information describing the edge of the no-mop area was stored after the edge of a suspected no-mop area had been determined to be the edge of a no-mop area according to a driving signal detected in the vicinity of that edge;
and adjusting at least one behavior operation or moving operation of the mobile robot in a mopping mode according to the determined edge of the no-mop area and the relative spatial position.
14. The control method according to claim 13, wherein the image is a color image and/or a depth image.
15. The control method according to claim 13, wherein the information describing the edge of the suspected no-mop area includes an image feature determined from the acquired image, and the step of matching the information describing the edge of the suspected no-mop area against the pre-stored information describing the edge of the no-mop area comprises:
matching the image features describing the edge of the suspected no-mop area against pre-stored image features describing the edge of the no-mop area.
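
Where the stored description is an image feature, as in claim 15, a standard descriptor-matching scheme applies. A sketch using OpenCV's ORB features; ORB, brute-force Hamming matching and both thresholds are assumed choices, and `edge_patch` must be an 8-bit grayscale array:

```python
import cv2

def image_features_match(edge_patch, stored_descriptors, min_matches=10):
    """Compute ORB descriptors on the image region around the suspected
    edge and match them against descriptors stored for a known no-mop
    edge; report a match when enough close correspondences survive."""
    orb = cv2.ORB_create()
    _, descriptors = orb.detectAndCompute(edge_patch, None)
    if descriptors is None or stored_descriptors is None:
        return False
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(descriptors, stored_descriptors)
    good = [m for m in matches if m.distance < 40]   # assumed cut-off
    return len(good) >= min_matches
```
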
16. The control method according to claim 13, wherein the image is a color image, and the step of determining the relative spatial position between the edge of the suspected no-mop area and the mobile robot by using at least the acquired image comprises:
determining the relative spatial position between the edge of the suspected no-mop area and the mobile robot according to the pixel position of the edge of the suspected no-mop area in the color image and preset physical reference information.
17. The control method according to claim 13, wherein the image is a depth image, and the step of determining the relative spatial position between the edge of the suspected no-mop area and the mobile robot by using at least the acquired image comprises:
determining the relative spatial position between the edge of the suspected no-mop area and the mobile robot according to the pixel values of the edge of the suspected no-mop area in the depth image.
18. The control method according to claim 13, wherein the step of adjusting at least one behavior operation or moving operation of the mobile robot in the mopping mode according to the determined edge of the no-mop area and the relative spatial position comprises at least one of:
when it is determined, according to the relative spatial position, that the mobile robot has moved to the vicinity of the edge of the no-mop area, controlling the mobile robot to change its moving direction so as to continue the mopping operation and/or a sweeping operation;
when it is determined, according to the relative spatial position, that the mobile robot has moved to the vicinity of the edge of the no-mop area, controlling the mobile robot to turn off the mopping operation in the mopping mode so as to enter the no-mop area.
19. The control method according to claim 13, wherein the acquired image is captured by controlling the image pickup device to rotate.
20. A mobile robot, comprising:
an image pickup device for capturing an image;
a moving device for performing a moving operation under control;
a cleaning device comprising a mopping assembly, wherein the mopping assembly is configured to perform a mopping operation under control;
a storage device for storing at least one program;
and a processing device connected to the moving device, the cleaning device, the storage device, and the image pickup device, and configured to invoke and execute the at least one program so as to coordinate the moving device, the cleaning device, the storage device, and the image pickup device to execute and implement the control method according to any one of claims 1 to 12.
21. The mobile robot of claim 20, wherein the cleaning device further comprises a sweeping assembly configured to perform a sweeping operation under control.
22. A mobile robot, comprising:
an image pickup device for capturing an image;
a moving device for performing a moving operation under control;
a cleaning device comprising a mopping assembly, wherein the mopping assembly is configured to perform a mopping operation under control;
a storage device for storing at least one program and for storing information describing the edge of a no-mop area;
and a processing device connected to the moving device, the cleaning device, the storage device, and the image pickup device, and configured to invoke and execute the at least one program so as to coordinate the moving device, the cleaning device, the storage device, and the image pickup device to execute and implement the control method according to any one of claims 13 to 19.
23. The mobile robot of claim 22, wherein the cleaning device further comprises a sweeping assembly configured to perform a sweeping operation under control.
24. A control system of a mobile robot equipped with an image pickup device, comprising:
an interface device for receiving an image captured by the image pickup device and for outputting a control instruction for controlling the mobile robot;
a storage device for storing at least one program;
and a processing device connected to the interface device and the storage device, and configured to invoke and execute the at least one program so as to coordinate the interface device, the storage device, and the image pickup device to execute and implement the control method according to any one of claims 1 to 12.
25. A control system of a mobile robot equipped with an image pickup device, comprising:
an interface device for receiving an image captured by the image pickup device and for outputting a control instruction for controlling the mobile robot;
a storage device for storing at least one program and for storing information describing the edge of a no-mop area;
and a processing device connected to the interface device and the storage device, and configured to invoke and execute the at least one program so as to coordinate the interface device, the storage device, and the image pickup device to execute and implement the control method according to any one of claims 13 to 19.
26. A computer-readable storage medium, characterized in that it stores at least one program which, when invoked, executes and implements the control method according to any one of claims 1 to 12 or the control method according to any one of claims 13 to 19.
CN202010514690.8A 2020-06-08 2020-06-08 Control method, control system and storage medium for mobile robot Active CN111813103B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010514690.8A CN111813103B (en) 2020-06-08 2020-06-08 Control method, control system and storage medium for mobile robot

Publications (2)

Publication Number Publication Date
CN111813103A CN111813103A (en) 2020-10-23
CN111813103B (en) 2021-07-16

Family

ID=72844707

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010514690.8A Active CN111813103B (en) 2020-06-08 2020-06-08 Control method, control system and storage medium for mobile robot

Country Status (1)

Country Link
CN (1) CN111813103B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115429161B (en) * 2022-07-29 2023-09-29 云鲸智能(深圳)有限公司 Control method, device and system of cleaning robot and storage medium

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004252631A (en) * 2003-02-19 2004-09-09 Mitsubishi Heavy Ind Ltd Agv operation control method, its apparatus, and physical distribution system using agv
CN108125622A (en) * 2017-12-15 2018-06-08 珊口(上海)智能科技有限公司 Control method, system and the clean robot being applicable in
CN109846427A (en) * 2019-01-16 2019-06-07 深圳乐动机器人有限公司 A kind of control method and clean robot of clean robot
CN110622085A (en) * 2019-08-14 2019-12-27 珊口(深圳)智能科技有限公司 Mobile robot and control method and control system thereof
CN110852312A (en) * 2020-01-14 2020-02-28 深圳飞科机器人有限公司 Cliff detection method, mobile robot control method, and mobile robot
CN110989631A (en) * 2019-12-30 2020-04-10 科沃斯机器人股份有限公司 Self-moving robot control method, device, self-moving robot and storage medium
CN111166238A (en) * 2018-11-09 2020-05-19 北京奇虎科技有限公司 Processing method, device and equipment for cleaning forbidden zone and storage medium
CN111220148A (en) * 2020-01-21 2020-06-02 珊口(深圳)智能科技有限公司 Mobile robot positioning method, system and device and mobile robot

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR100847136B1 (en) * 2006-08-14 2008-07-18 한국전자통신연구원 Method and Apparatus for Shoulder-line detection and Gesture spotting detection
JP2009042147A (en) * 2007-08-10 2009-02-26 Nsk Ltd Apparatus and method for recognizing object
KR101949277B1 (en) * 2012-06-18 2019-04-25 엘지전자 주식회사 Autonomous mobile robot
KR102586010B1 (en) * 2014-02-28 2023-10-11 삼성전자주식회사 Cleaning robot and remote controller therein
CN105467985B (en) * 2014-09-05 2018-07-06 科沃斯机器人股份有限公司 From mobile surface walking robot and its image processing method
CN104407346A (en) * 2014-12-01 2015-03-11 中国航空工业集团公司上海航空测控技术研究所 Mobile runway foreign object debris (FOD) monitoring method based on information integration
US10496262B1 (en) * 2015-09-30 2019-12-03 AI Incorporated Robotic floor-cleaning system manager
CN107752927B (en) * 2017-11-17 2020-07-14 北京奇虎科技有限公司 Block adjusting method, device and equipment of cleaning robot and storage medium
WO2019232806A1 (en) * 2018-06-08 2019-12-12 珊口(深圳)智能科技有限公司 Navigation method, navigation system, mobile control system, and mobile robot
CN109074083B (en) * 2018-06-08 2022-02-18 珊口(深圳)智能科技有限公司 Movement control method, mobile robot, and computer storage medium
CN111766877B (en) * 2018-06-27 2021-08-31 北京航空航天大学 Robot

Also Published As

Publication number Publication date
CN111813103A (en) 2020-10-23

Similar Documents

Publication Publication Date Title
US11042760B2 (en) Mobile robot, control method and control system thereof
JP6633568B2 (en) Autonomous coverage robot
CN106489104B (en) System and method for use of optical odometry sensors in a mobile robot
US11547255B2 (en) Cleaning robot
EP3104194B1 (en) Robot positioning system
US8340901B2 (en) Mobile robot and path planning method thereof for manipulating target objects
CN111035327A (en) Cleaning robot, carpet detection method, and computer-readable storage medium
CN112506181A (en) Mobile robot and control method and control system thereof
CN112034837A (en) Method for determining working environment of mobile robot, control system and storage medium
CN112000093B (en) Control method, control system and storage medium for mobile robot
CN111813103B (en) Control method, control system and storage medium for mobile robot
CN211933898U (en) Cleaning robot
AU2015224421B2 (en) Autonomous coverage robot
JP7325058B2 (en) self-propelled vacuum cleaner
CN219609490U (en) Self-moving equipment
EP2325713B1 (en) Methods and systems for movement of robotic device using video signal
AU2013338354B9 (en) Autonomous coverage robot
JP2022161975A (en) self-propelled vacuum cleaner

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant