CN110604518A - Sweeping robot and control method thereof - Google Patents

Sweeping robot and control method thereof

Info

Publication number
CN110604518A
Authority
CN
China
Prior art keywords
garbage, controller, sweeping robot, information, chassis
Prior art date
2019-10-30
Legal status
Pending
Application number
CN201911043683.8A
Other languages
Chinese (zh)
Inventor
韩勇
王科
吴志强
张弛
陈燕飞
Current Assignee
Zhejiang Zhi Ling Robot Technology Co Ltd
Original Assignee
Zhejiang Zhi Ling Robot Technology Co Ltd
Priority date
2019-10-30
Filing date
2019-10-30
Publication date
2019-12-24
Application filed by Zhejiang Zhi Ling Robot Technology Co Ltd
Priority to CN201911043683.8A
Publication of CN110604518A

Classifications

    • A: HUMAN NECESSITIES
    • A47: FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47L: DOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L11/00: Machines for cleaning floors, carpets, furniture, walls, or wall coverings
    • A47L11/24: Floor-sweeping machines, motor-driven
    • A47L11/40: Parts or details of machines not provided for in groups A47L11/02 - A47L11/38, or not restricted to one of these groups, e.g. handles, arrangements of switches, skirts, buffers, levers
    • A47L11/4011: Regulation of the cleaning machine by electric means; Control systems and remote control systems therefor

Abstract

The invention provides a sweeping robot and a control method thereof. The sweeping robot comprises a mobile chassis, a depth camera, a manipulator and a controller. The depth camera and the manipulator are respectively arranged on the mobile chassis; the depth camera is used for collecting image information, and the manipulator is used for picking up garbage. The controller is electrically connected with the depth camera and the mobile chassis respectively and stores a deep learning model. The controller imports the image information into the deep learning model to identify garbage and obtain its three-dimensional information, and, according to that three-dimensional information, controls the mobile chassis to move to the garbage position and controls the manipulator to pick up the garbage. By applying the technical scheme of the invention, garbage of larger volume can be cleared automatically, which is convenient for the user.

Description

Sweeping robot and control method thereof
Technical Field
The invention relates to the technical field of robots, in particular to a sweeping robot and a control method of the sweeping robot.
Background
Technology today advances rapidly, and its progress brings great convenience to human production and daily life. In urban life, technology products of every kind are increasingly common in ordinary households; the sweeping robot, as one representative product, can be found in almost any accessible area of a home, helping to sweep up dust and garbage.
However, such a robot has a major drawback: it cannot autonomously identify objects, so it cannot determine an object's type and cannot remove objects of larger volume.
Disclosure of Invention
The main purpose of the present invention is to provide a sweeping robot and a control method thereof, so as to solve the technical problem that sweeping robots in the prior art cannot clear large-size garbage.
In order to achieve the above object, according to one aspect of the present invention, there is provided a sweeping robot comprising: a mobile chassis; a depth camera arranged on the mobile chassis and used for collecting image information; a manipulator arranged on the mobile chassis and used for picking up garbage; and a controller electrically connected with the depth camera and the mobile chassis respectively. The controller stores a deep learning model, imports the image information into the deep learning model to identify garbage and obtain its three-dimensional information, and, according to that three-dimensional information, controls the mobile chassis to move to the garbage position and controls the manipulator to pick up the garbage.
In one embodiment, the sweeping robot further comprises a lidar for detecting environment information. The controller is electrically connected with the lidar, constructs a two-dimensional grid map from the environment information, and controls the mobile chassis to move according to the two-dimensional grid map.
In one embodiment, the controller further calculates position information according to the three-dimensional information of the garbage, and controls the moving chassis and the manipulator according to the position information.
In one embodiment, the manipulator comprises a robot arm mounted on the mobile chassis and a gripping jaw mounted at the end of the robot arm.
In one embodiment, the robotic arm is a five-axis robotic arm.
In one embodiment, the jaws are flexible jaws.
In one embodiment, the controller is electrically connected with the depth camera and/or the lidar and/or the mobile chassis via serial communication, respectively.
In one embodiment, the controller is electrically connected to the manipulator via a CAN bus.
In order to achieve the above object, according to another aspect of the present invention, there is provided a control method for controlling the above sweeping robot, the control method comprising: collecting image information; importing the image information into a deep learning model; identifying garbage and obtaining three-dimensional information of the garbage; calculating position information from the three-dimensional information; and controlling the mobile chassis to move and controlling the manipulator to pick up the garbage according to the position information.
By applying the technical scheme of the invention, a deep learning model capable of identifying garbage is stored in the controller in advance. After the depth camera collects image information, the controller imports it into the deep learning model to identify garbage and obtain its three-dimensional information, so that the mobile chassis can be controlled to move to the garbage position and the manipulator can be controlled to pick up the garbage according to that information. Garbage of larger volume can thus be cleared automatically, which is convenient for the user.
Other objects, features and advantages of the present invention will become apparent from the following detailed description, given with reference to the accompanying drawings.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, provide a further understanding of the invention; they illustrate exemplary embodiments of the invention and, together with the description, serve to explain the invention without limiting it. In the drawings:
fig. 1 shows a schematic overall structure diagram of an embodiment of a sweeping robot according to the present invention;
fig. 2 shows a schematic connection diagram of the electronic control components of the sweeping robot of fig. 1;
fig. 3 shows a flow chart of a control method of the sweeping robot according to the invention.
Detailed Description
It should be noted that the embodiments and the features of the embodiments may be combined with each other without conflict. The present invention will be described in detail below with reference to the embodiments and the attached drawings.
In order to make the technical solutions of the present invention better understood, the technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the embodiments of the present invention, and not all of the embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be noted that the terms "first," "second," and the like in the description and claims of the present invention and in the drawings described above are used for distinguishing between similar elements and not necessarily for describing a particular sequential or chronological order. It is to be understood that the terms so used are interchangeable under appropriate circumstances for describing embodiments of the invention herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed, but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
In order to solve the technical problem that sweeping robots in the prior art cannot remove garbage of larger volume, the invention provides a sweeping robot. As shown in fig. 1, the sweeping robot of this embodiment comprises a mobile chassis 10, a depth camera 30, a manipulator and a controller. The depth camera 30 and the manipulator are respectively arranged on the mobile chassis 10; the depth camera 30 is used for collecting image information, and the manipulator is used for picking up garbage. The controller is electrically connected with the depth camera 30 and the mobile chassis 10 respectively and stores a deep learning model. The controller imports the image information into the deep learning model to identify garbage and obtain its three-dimensional information, and, according to that three-dimensional information, controls the mobile chassis 10 to move to the garbage position and controls the manipulator to pick up the garbage.
By applying this technical scheme, a deep learning model capable of identifying garbage is stored in the controller in advance. After the depth camera 30 collects image information, the controller imports it into the deep learning model to identify garbage and obtain its three-dimensional information, so that the mobile chassis 10 can be controlled to move to the garbage position and the manipulator can be controlled to pick up the garbage. Garbage of larger volume can thus be cleared automatically, which is convenient for the user.
It should be noted that the deep learning model capable of recognizing garbage needs to be built in advance and trained on labeled data samples, so that it can recognize garbage once image information is fed in. After the garbage is identified, the controller can derive its three-dimensional information and thereby control the mobile chassis 10 and the manipulator to pick it up.
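As an illustrative, non-limiting sketch of such an inference step (the patent does not name a particular network, so a torchvision Faster R-CNN detector, assumed to have been fine-tuned on the labeled garbage samples, stands in for the deep learning model):

    # Minimal sketch (assumption): a COCO-pretrained torchvision detector stands in
    # for the patent's unspecified deep learning model; in practice it would be
    # fine-tuned on the labeled garbage samples mentioned above.
    import torch
    import torchvision
    from torchvision.transforms.functional import to_tensor

    model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
    model.eval()

    def detect_garbage(rgb_image, score_threshold=0.7):
        """Run the detector on one RGB frame and return the confident boxes and labels."""
        with torch.no_grad():
            prediction = model([to_tensor(rgb_image)])[0]
        keep = prediction["scores"] > score_threshold
        return prediction["boxes"][keep], prediction["labels"][keep]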
Further, in the technical solution of the present invention, the controller calculates position information from the three-dimensional information of the garbage and then controls the mobile chassis 10 and the manipulator according to that position information. In other words, between obtaining the three-dimensional information of the garbage and actually commanding the mobile chassis 10 and the manipulator, the three-dimensional information must be converted into concrete position information; once that position information is available, the mobile chassis 10 and the manipulator can execute it to pick up the garbage.
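A minimal sketch of how such a position could be computed, assuming a pinhole camera model with illustrative intrinsics and a fixed, rotation-free camera-to-base offset (none of these values come from the patent):

    # Minimal sketch (assumption): pinhole back-projection of the detection centre.
    # Intrinsics and the camera-to-base offset are illustrative, and the rotation
    # between camera and base frames is ignored for brevity.
    import numpy as np

    FX, FY, CX, CY = 615.0, 615.0, 320.0, 240.0       # assumed camera intrinsics (pixels)
    T_BASE_CAMERA = np.array([0.10, 0.0, 0.35])       # assumed camera mounting offset (m)

    def garbage_position(box, depth_image):
        """Convert one detection box (x1, y1, x2, y2) plus aligned depth (m) to base-frame XYZ."""
        u = int((box[0] + box[2]) / 2)                # pixel column of the box centre
        v = int((box[1] + box[3]) / 2)                # pixel row of the box centre
        z = float(depth_image[v, u])                  # depth reading at the centre
        x = (u - CX) * z / FX
        y = (v - CY) * z / FY
        return np.array([x, y, z]) + T_BASE_CAMERA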
More preferably, as shown in fig. 1, in the technical solution of this embodiment the sweeping robot further comprises a laser radar 20 for detecting environment information. The controller is further electrically connected with the laser radar 20, constructs a two-dimensional grid map from the environment information, and controls the mobile chassis 10 to move according to that map. Because the environment detected by the laser radar 20 is mapped into a two-dimensional grid, the map can be used for indoor autonomous navigation, allowing the controller to guide the mobile chassis 10 around an indoor environment without hitting obstacles.
It should be noted that the technical scheme of the sweeping robot of the present invention is particularly suitable for indoor use. The laser radar 20 can scan the contour of the surrounding environment to construct an indoor two-dimensional grid map, and feature points can also be matched against the laser radar 20 data to calculate the position of the sweeping robot for better localization.
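A minimal sketch of the grid-map update, assuming a fixed map size and resolution and leaving out the scan-matching localization mentioned above; none of the constants come from the patent:

    # Minimal sketch (assumption): each lidar return marks one grid cell as occupied.
    # Map size, resolution and the example scan values are illustrative only.
    import numpy as np

    RESOLUTION = 0.05                                 # metres per cell (assumed)
    GRID_SIZE = 400                                   # 20 m x 20 m map (assumed)

    def update_grid(grid, robot_xy, robot_yaw, ranges, angle_min, angle_increment):
        """Fold one laser scan into the occupancy grid (1 = occupied)."""
        for i, r in enumerate(ranges):
            if not np.isfinite(r):
                continue                              # skip invalid returns
            angle = robot_yaw + angle_min + i * angle_increment
            col = int((robot_xy[0] + r * np.cos(angle)) / RESOLUTION) + GRID_SIZE // 2
            row = int((robot_xy[1] + r * np.sin(angle)) / RESOLUTION) + GRID_SIZE // 2
            if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
                grid[row, col] = 1
        return grid

    grid_map = np.zeros((GRID_SIZE, GRID_SIZE), dtype=np.uint8)
    grid_map = update_grid(grid_map, (0.0, 0.0), 0.0, [1.2, 1.25, 1.3], -1.57, 0.01)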
As shown in fig. 1, in the present embodiment the manipulator comprises a robot arm 50 and a gripping jaw 60; the robot arm 50 is mounted on the mobile chassis 10, and the gripping jaw 60 is mounted at the end of the robot arm 50. When picking up garbage, the robot arm 50 moves the gripping jaw 60 to the position of the garbage, and the gripping jaw 60 performs the final picking action. More preferably, the robot arm 50 is a five-axis arm, which is flexible enough to let the gripping jaw 60 reach the garbage smoothly. More preferably, the gripping jaw 60 is a flexible jaw, which avoids damaging the surface of an article while gripping it. Specifically, a pneumatic motor controlling the flexible jaw sucks in air, driving the jaw to bend and wrap around the surface of the object so that the garbage is grasped.
It should be noted that the detailed operation process of the manipulator may be as follows:
the method comprises the steps of researching and developing a mechanical arm, establishing a three-dimensional model, establishing simplification according to a kinematic relationship of the mechanical arm, converting the simplified three-dimensional model into a URDF file available for Moveit! and providing a model file for subsequent motion planning. In the URDF file, it can be defined: connecting rod, joint name, kinematic parameter, kinetic parameter, visual model, collision detection and the like. It should be noted that the URDF file is in an XML language.
Using the MoveIt! Setup Assistant, the kinematics solver and the motion-planning library configuration are then created from the URDF file.
A motion-planning joint group is defined for the manipulator, which in the present invention consists of the robot arm 50 and the gripping jaw 60.
When the target position derived from the camera is received, MoveIt! is called to plan the path and generate a series of path points; cubic spline interpolation then makes the path points denser, and the planning information is published.
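A minimal sketch of the cubic-spline densification step, with illustrative five-axis joint waypoints (the actual planned path points would come from MoveIt!):

    # Minimal sketch (assumption): densify sparse joint-space waypoints with a cubic
    # spline so the arm motion is smoother; the waypoint values are illustrative.
    import numpy as np
    from scipy.interpolate import CubicSpline

    waypoints = np.array([[0.0, 0.2, 0.1, 0.0, 0.0],   # five joint angles per row (rad)
                          [0.2, 0.5, 0.2, 0.1, 0.0],
                          [0.4, 0.7, 0.3, 0.2, 0.1],
                          [0.6, 0.8, 0.4, 0.2, 0.1]])
    t_sparse = np.linspace(0.0, 1.0, len(waypoints))
    spline = CubicSpline(t_sparse, waypoints, axis=0)

    t_dense = np.linspace(0.0, 1.0, 50)                # 50 interpolated points
    dense_path = spline(t_dense)                       # shape (50, 5), sent to the arm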
When the bus receives data transmitted by the upper computer, the data is parsed according to the protocol, the stepping motors are driven to the target positions, and the grasping action is executed.
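The patent does not disclose the actual bus protocol, so the following sketch only illustrates the kind of packing and parsing involved, using an assumed frame layout of one command byte plus five joint targets:

    # Minimal sketch (assumption): the real bus protocol is not disclosed, so this
    # illustrative frame is one command byte followed by five joint targets (rad).
    import struct

    CMD_MOVE = 0x01

    def pack_frame(joint_targets):
        """Upper-computer side: serialise a move command for the lower controller."""
        return struct.pack("<B5f", CMD_MOVE, *joint_targets)

    def parse_frame(frame):
        """Lower-controller side (shown in Python purely for illustration)."""
        cmd, *targets = struct.unpack("<B5f", frame)
        return cmd, targets

    cmd, targets = parse_frame(pack_frame([0.1, 0.4, -0.2, 0.0, 0.3]))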
When the low-level microcontroller receives the grabbing command, it controls the gripping jaw 60 to pick up the garbage.
Control of the manipulator is developed on the MoveIt! module of ROS. After the target point position is obtained from image processing, the MoveIt! module plans a corresponding path in Cartesian space and then performs cubic spline interpolation on the planned set of path points, so that the points become denser and the motion of the robot arm 50 is smoother, without large jumps; finally the robot arm 50 is controlled to follow the path points to the final target position. Because the end of the robot arm 50 carries a flexible gripping jaw, soft objects are not damaged while being grasped.
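A minimal sketch of commanding the arm through the MoveIt! Python interface; the node name, the planning group name "arm" and the target pose are illustrative assumptions, and in practice the target would be the grasp point computed from the depth camera:

    # Minimal sketch (assumption): drive the arm through the MoveIt! Python interface.
    import sys
    import rospy
    import moveit_commander

    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node("garbage_pickup_arm")

    arm = moveit_commander.MoveGroupCommander("arm")
    arm.set_pose_target([0.35, 0.0, 0.05, 0.0, 1.57, 0.0])   # x, y, z, roll, pitch, yaw
    arm.go(wait=True)                                        # plan and execute
    arm.stop()
    arm.clear_pose_targets()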
As shown in fig. 2, in the technical solution of this embodiment, the controller is electrically connected with the depth camera 30 and/or the laser radar 20 and/or the mobile chassis 10 via serial communication, respectively. In other alternative embodiments, the controller may interact with the depth camera 30 and/or the laser radar 20 and/or the mobile chassis 10 in other ways. More preferably, in the present embodiment the controller is electrically connected to the manipulator through a CAN bus: because operating the manipulator is relatively complex, transmitting data over the CAN bus allows the complex motion control that the manipulator requires.
Preferably, in the technical solution of this embodiment, the controller may be an NVIDIA Jetson TX2. The controller communicates with the STM32 of the mobile chassis 10 over a serial port, and likewise exchanges information with the depth camera 30 and/or the laser radar 20 through the TX2 serial ports.
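A minimal sketch of the two communication paths, assuming pyserial for the chassis serial link and python-can for the manipulator bus; the port name, baud rate, command string, arbitration ID and payload bytes are all illustrative, not taken from the patent:

    # Minimal sketch (assumption): serial link to the chassis, CAN bus to the arm.
    import can
    import serial

    chassis = serial.Serial("/dev/ttyUSB0", 115200, timeout=0.1)
    chassis.write(b"VEL 0.20 0.00\n")                 # illustrative velocity command
    odometry_line = chassis.readline()                # illustrative odometry reply

    bus = can.interface.Bus(channel="can0", bustype="socketcan")
    bus.send(can.Message(arbitration_id=0x141,        # illustrative joint-controller ID
                         data=[0x01, 0x00, 0x10, 0x00, 0x00, 0x00, 0x00, 0x00],
                         is_extended_id=False))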
As shown in fig. 3, the present invention further provides a control method for the sweeping robot, including:
collecting image information;
importing image information into a deep learning model;
identifying the garbage and obtaining three-dimensional information of the garbage;
calculating position information according to the three-dimensional information;
and controlling the mobile chassis 10 to move and controlling the manipulator to pick up the garbage according to the position information.
By adopting the technical scheme of the invention, garbage can be identified automatically and its three-dimensional information obtained, so that position information can be calculated to control the movement of the mobile chassis 10 and the manipulator precisely enough to pick up the garbage. Garbage of larger volume is thus cleared automatically, which is convenient for the user.
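Tying the control-method steps together, a minimal sketch of one pickup cycle; the camera, chassis and arm objects are hypothetical interfaces, and detect_garbage and garbage_position refer to the illustrative helpers sketched earlier in this description rather than to anything disclosed in the patent:

    # Minimal sketch (assumption): one pickup cycle combining the steps above.
    def pickup_cycle(camera, chassis, arm):
        rgb, depth = camera.read()                     # collect image information
        boxes, labels = detect_garbage(rgb)            # identify garbage via the model
        if len(boxes) == 0:
            return False                               # nothing to pick up
        target_xyz = garbage_position(boxes[0], depth) # position from 3D information
        chassis.move_to(target_xyz)                    # move the chassis toward the garbage
        arm.pick(target_xyz)                           # manipulator picks up the garbage
        return True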
The relative arrangement of the components and steps, the numerical expressions and numerical values set forth in these embodiments do not limit the scope of the present invention unless specifically stated otherwise. Meanwhile, it should be understood that the sizes of the respective portions shown in the drawings are not drawn in an actual proportional relationship for the convenience of description. Techniques, methods, and apparatus known to those of ordinary skill in the relevant art may not be discussed in detail but are intended to be part of the specification where appropriate. In all examples shown and discussed herein, any particular value should be construed as merely illustrative, and not limiting. Thus, other examples of the exemplary embodiments may have different values. It should be noted that: like reference numbers and letters refer to like items in the following figures, and thus, once an item is defined in one figure, further discussion thereof is not required in subsequent figures.
Spatially relative terms, such as "above", "over", "on top of", and the like, may be used herein for ease of description to describe one device's or feature's spatial relationship to another device or feature as illustrated in the figures. It will be understood that such terms are intended to encompass orientations of the device in use or operation other than the one depicted in the figures. For example, if a device in the figures is turned over, devices described as "above" or "on" other devices or configurations would then be oriented "below" or "under" the other devices or configurations. Thus, the exemplary term "above" can encompass both an "above" and a "below" orientation. The device may also be oriented otherwise (rotated 90 degrees or in other orientations), and the spatially relative descriptors used herein are interpreted accordingly.
In the description of the present invention, it is to be understood that orientation terms such as "front, rear, upper, lower, left, right", "lateral, vertical, horizontal" and "top, bottom" usually indicate orientations or positional relationships based on those shown in the drawings. They are used only for convenience and simplicity of description and, unless stated otherwise, do not indicate or imply that the device or element referred to must have a specific orientation or be constructed and operated in a specific orientation; they therefore should not be construed as limiting the scope of the present invention. The terms "inner" and "outer" refer to the inside and outside relative to the contour of the respective component itself.
The above description is only a preferred embodiment of the present invention and is not intended to limit the present invention, and various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.

Claims (9)

1. A sweeping robot, characterized by comprising:
a mobile chassis (10);
a depth camera (30) arranged on the mobile chassis (10) and used for collecting image information;
a manipulator arranged on the mobile chassis (10) and used for picking up garbage; and
a controller electrically connected with the depth camera (30) and the mobile chassis (10) respectively, wherein a deep learning model is stored in the controller, the controller imports the image information into the deep learning model to identify garbage and obtain three-dimensional information of the garbage, and the controller controls the mobile chassis (10) to move to the garbage position and controls the manipulator to pick up the garbage according to the three-dimensional information of the garbage.
2. The sweeping robot of claim 1, characterized by further comprising a laser radar (20) for detecting environment information, wherein the controller is further electrically connected with the laser radar (20), constructs a two-dimensional grid map according to the environment information, and controls the mobile chassis (10) to move according to the two-dimensional grid map.
3. The sweeping robot according to claim 1, characterized in that the controller further calculates position information according to the three-dimensional information of the garbage, and controls the mobile chassis (10) and the manipulator according to the position information.
4. The sweeping robot according to claim 1, characterized in that the manipulator comprises a robot arm (50) and a gripping jaw (60), the robot arm (50) being mounted on the mobile chassis (10) and the gripping jaw (60) being mounted at the end of the robot arm (50).
5. The sweeping robot according to claim 4, characterized in that said robot arm (50) is a five-axis robot arm.
6. The sweeping robot according to claim 4, characterized in that the gripping jaw (60) is a flexible gripping jaw.
7. The sweeping robot according to claim 2, characterized in that the controller is electrically connected with the depth camera (30) and/or the lidar (20) and/or the mobile chassis (10) via serial communication, respectively.
8. The sweeping robot of claim 1, wherein the controller is electrically connected to the manipulator via a CAN bus.
9. A control method of a sweeping robot, which is used for controlling the sweeping robot of any one of claims 1 to 6, and comprises the following steps:
collecting image information;
importing image information into a deep learning model;
identifying the garbage and obtaining three-dimensional information of the garbage;
calculating position information according to the three-dimensional information;
and controlling the mobile chassis (10) to move and controlling the manipulator to pick up the garbage according to the position information.
CN201911043683.8A 2019-10-30 2019-10-30 Sweeping robot and control method thereof Pending CN110604518A (en)

Priority Applications (1)

Application Number: CN201911043683.8A
Priority Date: 2019-10-30
Filing Date: 2019-10-30
Title: Sweeping robot and control method thereof

Applications Claiming Priority (1)

Application Number: CN201911043683.8A
Priority Date: 2019-10-30
Filing Date: 2019-10-30
Title: Sweeping robot and control method thereof

Publications (1)

Publication Number: CN110604518A
Publication Date: 2019-12-24

Family

ID=68895539

Family Applications (1)

Application Number: CN201911043683.8A
Status: Pending
Priority Date: 2019-10-30
Filing Date: 2019-10-30
Title: Sweeping robot and control method thereof

Country Status (1)

Country Link
CN (1) CN110604518A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112149573A (en) * 2020-09-24 2020-12-29 湖南大学 Garbage classification and picking robot based on deep learning
CN112192541A (en) * 2020-10-09 2021-01-08 常熟云开智能科技有限公司 Control method for robot to automatically identify scattered workpiece positions and automatically grab
CN113530345A (en) * 2021-07-16 2021-10-22 中国电建集团江西省电力设计院有限公司 Transmission line iron tower
CN114343488A (en) * 2021-12-16 2022-04-15 深圳市安杰信息科技有限公司 Intelligent cleaning robot and intelligent cleaning system
CN114343488B (en) * 2021-12-16 2023-09-08 深圳市安杰信息科技有限公司 Intelligent cleaning robot and intelligent cleaning system

Similar Documents

Publication Publication Date Title
CN110604518A (en) Sweeping robot and control method thereof
JP6873941B2 (en) Robot work system and control method of robot work system
CN114728417B (en) Method and apparatus for autonomous object learning by remote operator triggered robots
CN111055281B (en) ROS-based autonomous mobile grabbing system and method
CN110640730B (en) Method and system for generating three-dimensional model for robot scene
CN111906784A (en) Pharyngeal swab double-arm sampling robot based on machine vision guidance and sampling method
CN100352623C (en) Control device and method for intelligent mobile robot capable of picking up article automatically
KR20190077481A (en) Robot mapping system and method
US20160136807A1 (en) Determination of Object-Related Gripping Regions Using a Robot
JP2022542241A (en) Systems and methods for augmenting visual output from robotic devices
CN114102585B (en) Article grabbing planning method and system
CN112571415A (en) Robot autonomous door opening method and system based on visual guidance
CN111823212A (en) Garbage bottle cleaning and picking robot and control method
CN112207839A (en) Mobile household service robot and method
CN110605711A (en) Method, device and system for controlling cooperative robot to grab object
CN116494201A (en) Monitoring integrated power machine room inspection robot and unmanned inspection method
US10933526B2 (en) Method and robotic system for manipulating instruments
CN114224226A (en) Obstacle avoidance cleaning robot, robot mechanical arm obstacle avoidance planning system and method
CN210990015U (en) Floor sweeping robot
Wang et al. Research on a mobile manipulator for biochemical sampling tasks
US20220241980A1 (en) Object-Based Robot Control
KR20160116445A (en) Intelligent tools errands robot
CN115033002A (en) Mobile robot control method and device, electronic device and storage medium
CN115157245A (en) Mechanical arm control system and method based on deep learning
CN114888768A (en) Mobile duplex robot cooperative grabbing system and method based on multi-sensor fusion

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination