CN113156929A - Self-moving equipment - Google Patents

Self-moving equipment

Info

Publication number
CN113156929A
CN113156929A
Authority
CN
China
Prior art keywords
self
moving
boundary
working area
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010642119.4A
Other languages
Chinese (zh)
Other versions
CN113156929B (en)
Inventor
朱松 (Zhu Song)
何明明 (He Mingming)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Positec Power Tools Suzhou Co Ltd
Original Assignee
Positec Power Tools Suzhou Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Positec Power Tools Suzhou Co Ltd filed Critical Positec Power Tools Suzhou Co Ltd
Priority to PCT/CN2020/129821 priority Critical patent/WO2021139414A1/en
Priority to PCT/CN2021/070477 priority patent/WO2021139683A1/en
Priority to CN202180005855.1A priority patent/CN114868095A/en
Publication of CN113156929A publication Critical patent/CN113156929A/en
Application granted granted Critical
Publication of CN113156929B publication Critical patent/CN113156929B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0214 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory in accordance with safety or protection criteria, e.g. avoiding hazardous areas
    • G05D1/0223 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving speed control of the vehicle
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0234 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
    • G05D1/0236 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons in combination with a laser
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • G05D1/0253 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means extracting relative motion information from a plurality of images taken successively, e.g. visual odometry, optical flow
    • G05D1/0276 Control of position or course in two dimensions specially adapted to land vehicles using signals provided by a source external to the vehicle

Abstract

A self-moving device comprising an edgewise mode in which the control module is configured to: analyze, from the environment image acquired by the image acquisition device, whether a working area boundary is present in the image; when no working area boundary is present in the environment image, control the moving module to move according to preset edge-finding logic so as to search for the working area boundary; and when a working area boundary is present in the environment image, divide the environment image into N sub-images, analyze whether a working area boundary is present in each sub-image, fit the working area boundary in each sub-image that contains one into a straight line, generate the parameters of that line, and control the self-moving device to move and work along the working area boundary according to the parameters, where N ≥ 2.

Description

Self-moving equipment
Technical Field
The invention relates to a self-moving device.
Background
With the continuous progress of computer and artificial intelligence technology, automatic mowers, a kind of intelligent robot, have slowly begun to enter people's lives. For example, an automatic mower can mow a user's lawn and recharge itself automatically, without any user intervention. Once the automatic working system has been set up, the user is freed from tedious, time-consuming and labor-intensive housework such as cleaning and lawn maintenance, with no further effort needed to manage it. At present, automatic mowers move randomly within a working area delimited by a boundary wire, but laying the boundary wire is very cumbersome, so a new automatic mower needs to be designed to solve this problem.
Disclosure of Invention
To overcome the above drawbacks, the invention adopts the following technical solution:
a self-moving device for automatically moving and working within a work area, comprising:
a housing;
the moving module is positioned below the shell and used for driving the shell to move;
the working module is arranged on the shell to execute a preset working task;
the image acquisition device is used for acquiring an environment image where the self-moving equipment is located;
the control module is used for autonomously controlling the mobile module to drive the shell to move and autonomously controlling the working module to execute a preset working task;
the self-moving device comprises an edge mode in which the control module is configured to: analyzing whether the boundary of the working area exists in the environment image or not according to the environment image acquired by the image acquisition device; when the working area boundary does not exist in the environment image, controlling the moving module to move according to a preset edge finding logic so as to find the working area boundary; when the working area boundary exists in the environment image, dividing the environment image into N sub-images, analyzing whether the working area boundary exists in each sub-image, respectively fitting the working area boundary in each sub-image with the working area boundary into a straight line, generating parameters of the straight line, and controlling the self-moving equipment to move and work along the working area boundary according to the parameters, wherein N is more than or equal to 2.
Further, 2 ≤ N ≤ 8.
Further, in the edgewise mode, the control module is further configured to: when a working area boundary is present in the environment image, divide the environment image into N sub-images along the near-to-far direction relative to the self-moving device, control the self-moving device to move and work according to the sub-image closest to the self-moving device, and predict the subsequent movement and work of the self-moving device according to the remaining sub-images.
Further, in the edgewise mode, the control module is further configured to: when a working area boundary is present in the environment image, divide the environment image into two sub-images along the near-to-far direction relative to the self-moving device, control the self-moving device to move and work according to the sub-image near the self-moving device, and predict the subsequent movement and work of the self-moving device according to the sub-image far from the self-moving device.
Further, in the edgewise mode, the control module is further configured to: judge in real time whether the current working area boundary has been lost; when it has been lost, control the moving module to move automatically to search for the working area boundary; and when it has not been lost, control the self-moving device to continue moving and working along the current working area boundary.
Further, judging in real time whether the current working area boundary has been lost includes: counting the proportion of target objects to non-target objects on the two sides of the current working area boundary; when that proportion is within a preset range, judging that the current working area boundary has not been lost; and when it is not within the preset range, judging that the current working area boundary has been lost.
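For illustration only, the proportion check above can be sketched as follows, assuming the environment image has already been segmented into a binary grass/non-grass mask and the boundary is tracked as a column index; the function name, the mask representation, and the purity threshold standing in for the patent's "preset range" are all assumptions:

```python
import numpy as np

def boundary_lost(mask: np.ndarray, boundary_col: int,
                  min_purity: float = 0.8) -> bool:
    # Count grass (1) vs non-grass (0) pixels on each side of the
    # currently-tracked boundary column. If either side is no longer
    # dominated by its expected class, judge the boundary lost.
    # min_purity is an assumed stand-in for the "preset range".
    left = mask[:, :boundary_col]        # expected grass side
    right = mask[:, boundary_col:]       # expected non-grass side
    grass_left = left.mean()             # fraction of grass pixels on the left
    non_grass_right = 1.0 - right.mean() # fraction of non-grass on the right
    return not (grass_left >= min_purity and non_grass_right >= min_purity)

# Toy 8x8 segmentation mask: grass fills columns 0-3.
img = np.zeros((8, 8), dtype=int)
img[:, :4] = 1
```

With the boundary tracked at column 4, both sides stay pure and the boundary is judged not lost; tracking it at column 7 mixes the classes on the left side and the boundary is judged lost.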
Further, controlling the moving module to move automatically to search for the working area boundary when the current working area boundary has been lost includes: controlling the moving module to rotate through a certain angle to search for the working area boundary; and, if the working area boundary is not found by rotating, controlling the moving module to move according to the preset edge-finding logic so as to search for the working area boundary.
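The rotate-then-fall-back reacquisition logic can be sketched, under stated assumptions, as a small state routine; the step size, the 360-degree limit, and all method names on the mower object are illustrative, not taken from the patent:

```python
def reacquire_boundary(mower, step_deg=30, max_deg=360):
    # Rotate in place in small steps to look for the boundary again;
    # if a full turn finds nothing, fall back to the preset
    # edge-finding logic. Angles are illustrative assumptions.
    turned = 0
    while turned < max_deg:
        mower.rotate(step_deg)
        turned += step_deg
        if mower.sees_boundary():
            return True
    mower.run_edge_finding_logic()
    return False

class DemoMower:
    # Toy stand-in for the real platform, for illustration only.
    def __init__(self, visible_after):
        self.rotations = 0
        self.visible_after = visible_after
        self.fell_back = False
    def rotate(self, deg):
        self.rotations += 1
    def sees_boundary(self):
        return self.rotations >= self.visible_after
    def run_edge_finding_logic(self):
        self.fell_back = True
```

A mower that regains sight of the boundary after three rotation steps returns success without the fallback; one that never sees it completes the full turn and hands over to the edge-finding logic.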
Further, the self-moving device further comprises an edge-seeking mode, in which the control module is configured to: control the image acquisition device to acquire the environment image, and judge from it whether a working area boundary is present; when no working area boundary is present in the environment image, control the moving module to move and work according to preset in-boundary movement logic; and when a working area boundary is present, fit the working area boundary in the environment image into a straight line, generate the parameters of that line, and control the self-moving device to move and work within the working area boundary according to the parameters.
Further, in the edge-seeking mode, controlling the self-moving device to move and work within the working area boundary according to the parameters when a working area boundary is present in the environment image includes: when the self-moving device is judged, from the environment image, to be close to the working area boundary, controlling it to continue moving a certain distance in its original direction and then adjusting its direction of movement.
Further, the self-moving device is an automatic mower that automatically moves on and mows a lawn, the working module is a mowing module for executing a mowing task, and the working area boundary is the lawn boundary.
Further, the image acquisition device comprises an edgewise image acquisition device and an edge-seeking image acquisition device; in the edgewise mode, the control module controls the self-moving device to move and mow according to the environment image acquired by the edgewise image acquisition device, and in the edge-seeking mode, according to the environment image acquired by the edge-seeking image acquisition device.
Further, in the left-right direction of the self-moving device, the edgewise image acquisition device is arranged on the side of the self-moving device close to the working area boundary, and the edge-seeking image acquisition device is arranged close to the center of the self-moving device.
Further, in the front-rear direction, both the edgewise image acquisition device and the edge-seeking image acquisition device are arranged toward the front of the self-moving device's direction of travel; in the height direction, the mounting height of each device is no more than 20 cm above the ground; in the left-right direction, the distance S1 from the edgewise image acquisition device to the side of the self-moving device close to the working area boundary is in the range of 0-5 cm, and the distance S2 from the edge-seeking image acquisition device to the central axis of the self-moving device is in the range of 0-4 cm.
Furthermore, the self-moving device further comprises a reference object detection module for detecting a reference object, and the control module controls the self-moving device to move and mow automatically in at least two sub-working areas according to the information detected by the reference object detection module.
Further, after the self-moving device finishes working in one sub-working area, the control module controls it to search for the reference object along the boundary according to the environment image acquired by the image acquisition device; when the reference object detection module detects the reference object, the control module controls the self-moving device to enter the next sub-working area according to the detected information.
Further, the control module may control the self-moving device to perform an action corresponding to the current type of obstacle, according to the different types of obstacles in the environment image.
The beneficial effect of this scheme is: in the edgewise mode, the environment image is divided into N sub-images, the working area boundary in each sub-image is fitted into a straight line to generate parameters representing that line, and the self-moving device is controlled to move and work automatically along the working area boundary according to those parameters. This improves the precision of the working area boundary obtained from the image, allows distorted boundary data to be removed, shortens the computation time, reduces the computational demands on the control module, and lowers cost.
Drawings
Fig. 1 is a schematic diagram of a self-moving device according to an embodiment of the invention.
Fig. 2 is a block diagram of a self-moving device in an embodiment of the invention.
Fig. 3 is a block diagram of a self-moving device in an embodiment of the invention.
Fig. 4 is a schematic diagram of an operating state of the self-moving device according to an embodiment of the present invention.
Fig. 5 is a schematic diagram of the edgewise mode of a self-moving device according to an embodiment of the present invention.
Fig. 6 is a schematic diagram of the edge-seeking mode of a self-moving device according to an embodiment of the invention.
Fig. 7 is a schematic diagram of an operating state of the self-moving device in an embodiment of the invention.
Fig. 8 is a schematic diagram of the environment image photographed from the mobile device in fig. 7.
Fig. 9 is a schematic diagram of the environment image of fig. 8 divided into two sub-images.
FIG. 10 is a schematic diagram of an automated working system according to an embodiment of the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
Referring to fig. 1 to 10, an embodiment of the present invention provides an automatic working system comprising a self-moving device 100 that automatically moves and works in a working area, and a charging station for charging the self-moving device. In this embodiment, the self-moving device 100 is an automatic mower and the charging station charges it. In other embodiments, the self-moving device 100 may also be an automatic leaf sweeper, an automatic sprinkler, a multi-function machine, a sweeping robot, or the like.
As shown in fig. 1 to 3, the self-moving device 100 includes a working module 102 disposed on the housing and configured to execute a preset working task, a moving module 130 located below the housing 110 and configured to drive the housing 110 to move, an image acquisition device configured to acquire an image of the environment where the self-moving device is located, a control module 101 configured to control the moving module 130 to move automatically and the working module 102 to work automatically, and an energy module 103 that powers the moving module, the working module 102, and the control module 101. The control module 101 is connected to and controls the moving module 130, the working module 102, the energy module 103, and the image acquisition device 140.
In this embodiment, taking as an example an automatic mower in which the self-moving device automatically moves and mows on a lawn, the working area boundary is the lawn boundary; the working module 102 is a mowing module, specifically a cutting member such as a cutting blade, for performing mowing tasks. The moving module includes an auxiliary wheel at the front and a driving wheel at the rear. The working module 102 is driven by a cutting motor (not shown). The center of the working module 102 lies on the central axis of the self-moving device 100, below the housing 110, between the auxiliary wheel and the driving wheel.
As shown in fig. 4, an image acquisition device 140 is mounted on the housing 110 to acquire an image of the environment where the self-moving device is located. In the present embodiment, the environment image of the self-moving device 100 acquired by the image acquisition device 140 refers to the image information of the target area M captured by the device. The field of view of the image acquisition device 140 varies with the type of device, for example from 90 to 120 degrees. In a specific implementation, a sub-range of the viewing angle may be selected as the actual viewing range; for example, the middle 90 degrees of a 120-degree viewing angle may be used. The image acquisition device 140 is angled obliquely downward to capture visual information of the ground, and the control module 101 distinguishes grass from non-grass areas in the acquired images and thereby recognizes the lawn boundary. Non-grass areas include fences, sidewalks, short shrubs, curbs, wood chips, and the like.
The energy module 103 provides energy for the operation of the self-moving device 100. Its energy source may be gasoline, a battery pack, or the like; in this embodiment the energy module 103 includes a rechargeable battery pack disposed within the housing 110. During operation, the battery pack releases power to keep the self-moving device 100 working and moving. When not working, the battery may be connected to an external power source to recharge. In particular, for a more user-friendly design, the self-moving device 100 automatically seeks out the charging station to recharge when it detects that the battery is low.
The self-moving device further comprises a storage unit for storing a data model, which may contain picture information or other characteristic data of a large number of different lawns, objects surrounding lawns, different houses, and so on. The image acquisition device 140 acquires images around the self-moving device; the control module 101 compares the acquired images with the stored data model, analyzes the position and environment of the self-moving device, and controls it to move and work on the corresponding lawn.
Referring to fig. 2, the self-moving device 100 includes an edgewise mode 108 and an edge-seeking mode 109. In the edgewise mode 108, the control module 101 controls the self-moving device 100 to move and mow along the boundary; in the edge-seeking mode, it controls the self-moving device 100 to move and mow within the boundary. The control module 101 is configured to switch the self-moving device 100 automatically between the edgewise mode and the edge-seeking mode. Specifically, the self-moving device 100 may select one of the two modes according to a preset schedule, which may be set at the factory, set by the user according to usage habits, or set by the self-moving device 100 itself according to the user's habits; a mode-switching button may of course also be provided so that the user can switch the working mode.
In the edgewise mode 108, image information of the target area is first acquired by the image acquisition device 140; the control module 101 searches for the boundary in that image information and controls the self-moving device 100 to move along the boundary and mow. Specifically, the control module 101 first obtains the image information from the image acquisition device, then performs visual recognition and image processing on it to find boundary information, and after finding the boundary controls the self-moving device 100 to mow along the edge.
In particular, in the edgewise mode, the control module 101 is configured to: analyze, from the environment image acquired by the image acquisition device, whether a working area boundary is present in the image; when no working area boundary is present in the environment image, control the moving module to move according to the preset edge-finding logic so as to search for the working area boundary; and when a working area boundary is present in the environment image, divide the environment image into N sub-images, analyze whether a working area boundary is present in each sub-image, fit the working area boundary in each sub-image that contains one into a straight line, generate the parameters of that line, and control the self-moving device to move and work along the working area boundary according to the parameters, where N ≥ 2.
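The split-and-fit decision described above can be sketched, for illustration only, on a binary grass/non-grass mask; the function names, the per-row transition detection, and the line parameterization x = a*y + b are assumptions, not details taken from the patent:

```python
import numpy as np

def boundary_present(mask: np.ndarray) -> bool:
    # A (sub-)image contains a boundary if it shows both grass (1)
    # and non-grass (0) pixels.
    return mask.min() == 0 and mask.max() == 1

def fit_boundary_line(mask: np.ndarray):
    # Fit the grass/non-grass transition in each row to a straight
    # line x = a*y + b and return its parameters (a, b).
    ys, xs = [], []
    for y, row in enumerate(mask):
        flips = np.flatnonzero(np.diff(row))  # column where the class flips
        if flips.size:
            ys.append(y)
            xs.append(flips[0] + 0.5)
    a, b = np.polyfit(ys, xs, 1)
    return a, b

def edgewise_step(mask: np.ndarray, n: int = 2):
    # One decision of the edgewise mode: None means "no boundary in
    # view, run the preset edge-finding logic instead".
    if not boundary_present(mask):
        return None
    strips = np.array_split(mask, n, axis=0)  # near-to-far strips
    return [fit_boundary_line(s) for s in strips if boundary_present(s)]

# Toy 8x8 mask: grass (1) left of a vertical boundary between x=3 and x=4.
img = np.zeros((8, 8), dtype=int)
img[:, :4] = 1
params = edgewise_step(img, n=2)
```

On this toy mask each strip fits a vertical line at x ≈ 3.5 with slope ≈ 0, i.e., a boundary parallel to the direction of travel.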
Generally, when an environment image is processed, all the boundary data points are either fitted end to end into a curve, or fitted directly into a single straight line. Fitting all boundary data points into a curve requires too much computation, places high demands on the computing power of the control module 101, and is time-consuming; moreover, since the environment image may contain jumping and distorted boundary data points, fitting all of them into a curve distorts the moving path of the self-moving device 100, sometimes severely. In addition, if a curve is fitted, the trajectory of the self-moving device is a curve, so its direction of movement must be adjusted continuously and its moving posture is poor. If, instead, all the boundary data points are fitted into a single straight line, then because one environment image spans a large area, the fitted data may be severely distorted, again distorting the path the self-moving device finally follows. In this embodiment, the environment image is divided into N sub-images, the working area boundary in each sub-image is fitted into its own straight line, and the number of sub-images is kept in check (specifically, N ≥ 2). This improves the fidelity of the fitted boundary, reduces the amount of computation, reduces how often the self-moving device must adjust its direction so that its moving posture is better, lowers the demands on the control module 101 and hence the cost, and allows distorted boundary data points to be filtered out when each sub-image is fitted to a straight line, making the fitted boundary more accurate.
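One simple way to filter out the "jumping and distorted" boundary data points mentioned above is a two-pass least-squares fit; this is a sketch under stated assumptions (the residual threshold and the refit strategy are illustrative, not the patent's method):

```python
import numpy as np

def fit_line_robust(ys, xs, max_dev=3.0):
    # Two-pass fit: do an initial least-squares fit, discard boundary
    # samples whose residual exceeds max_dev pixels (the jumping and
    # distorted points), then refit on the clean samples.
    # max_dev is an assumed threshold, not taken from the patent.
    ys = np.asarray(ys, dtype=float)
    xs = np.asarray(xs, dtype=float)
    a, b = np.polyfit(ys, xs, 1)          # initial fit, outliers included
    resid = np.abs(xs - (a * ys + b))
    keep = resid <= max_dev               # drop distorted points
    a, b = np.polyfit(ys[keep], xs[keep], 1)
    return a, b

# Boundary samples along a straight edge at x = 5, one distorted sample.
ys = np.arange(10)
xs = np.full(10, 5.0)
xs[4] = 25.0                              # jumping/distorted data point
a, b = fit_line_robust(ys, xs)
```

The distorted sample pulls the initial fit off the true edge, but its large residual gets it discarded, and the refit recovers the straight boundary at x = 5 almost exactly.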
In this embodiment, the number of sub-images is further limited to 2 ≤ N ≤ 8. Keeping the number of sub-images between 2 and 8 (inclusive) gives the best balance between the accuracy of the fitted boundary and the computational difficulty of the fit, while still effectively removing distorted boundary data points.
Specifically, the rule for dividing the environment image into N sub-images may be set according to the actual situation. In terms of size, the environment image may be divided into N sub-images of equal or unequal size; in terms of direction, it may be divided left-to-right, top-to-bottom, near-to-far, or by some other rule.
In this embodiment, the environment image is segmented along the near-to-far direction relative to the self-moving device 100. The state of the area closest to the self-moving device is obtained from the nearest sub-image, which is used to control its current movement and work, while the state of areas farther away is obtained from the farther sub-images, which are used to predict its subsequent movement and work.
For example, as shown in fig. 4 and 7, the image acquisition device 140 acquires an image of the target area M; since the front portion of the image is farther from the self-moving device 100 and the rear portion is closer, the image of the target area M is divided along its front-rear direction. In particular, the control module 101 is further configured to: when a working area boundary is present in the environment image, divide the environment image into N sub-images along the near-to-far direction, control the self-moving device to move and work according to the sub-image closest to it, and predict its subsequent movement and work according to the remaining sub-images. Specifically, when controlling the self-moving device according to each sub-image, the control module judges whether a working area boundary is present in each sub-image, fits the boundary in each sub-image that contains one into a straight line, and generates the parameters of that line; it controls the self-moving device to move and work along the current working area boundary according to the parameters generated from the nearest sub-image, and predicts its subsequent movement and work according to the parameters generated from the remaining sub-images.
In the present embodiment, as shown in figs. 7 to 9, to further reduce computational difficulty, the environment image is divided into two sub-images along the near-to-far direction relative to the self-moving device 100, i.e., along the front-rear direction of the environment image: the front sub-image is far from the self-moving device and the rear sub-image is near it. The control module 101 controls the current movement and work of the self-moving device 100 according to the nearer sub-image, and predicts its subsequent movement and work according to the farther sub-image.
In particular, in the edgewise mode, the control module 101 is further configured to: when a working area boundary is present in the environment image, divide the environment image into two sub-images along the near-to-far direction, control the self-moving device to move and work according to the near sub-image, and predict its subsequent movement and work according to the far sub-image. Specifically, when controlling the self-moving device according to the two sub-images, the control module judges whether a working area boundary is present in each; if both contain one, it fits the boundary in each sub-image into a straight line and generates the parameters of that line, controls the self-moving device to move and work along the current working area boundary according to the parameters generated from the near sub-image, and predicts subsequent movement and work according to the parameters generated from the far sub-image. Of course, if no working area boundary is present in the near sub-image, the spatial relationship between the boundary and the self-moving device can be predicted directly from the parameters generated from the far sub-image, and the self-moving device can be controlled to move in the corresponding direction to find the boundary.
If the working area boundary is absent from the far sub-image but present in the near sub-image, the self-moving device can be controlled to move and work along the working area boundary according to the parameters generated from the near sub-image, and its subsequent movement is predicted from the boundary-free information in the far sub-image, for example by planning a turn in advance.
As shown in fig. 7 to 9, the environment image includes a grass boundary, and both sub-images include a grass boundary. Fig. 8 shows the environment image 300 captured by the self-moving device 100 at a certain time, that is, the image of the target area M; the environment image 300 includes grass 31 and non-grass 32. As shown in fig. 9, the environment image 300 is divided into two sub-images along the far-to-near direction relative to the self-moving device 100: a near sub-image 301 and a far sub-image 302. The near sub-image 301 is the image of the near target area M1 closer to the self-moving device in fig. 7, and the far sub-image 302 is the image of the far target area M2 farther from it. The control module 101 fits the grass/non-grass boundary in each of the two sub-images (301, 302) into a straight line: the boundary between grass 31 and non-grass 32 in the near sub-image 301 is fitted into a near-boundary straight line 312, and a parameter representing the near-boundary straight line 312 is generated, which may be the angle of the near-boundary straight line 312, the offset of the near-boundary line with respect to the center, or another parameter representing the line. Likewise, the boundary between grass 31 and non-grass 32 in the far sub-image 302 is fitted into a far-boundary straight line 311, and a parameter representing the far-boundary straight line 311 is generated. The control module 101 controls the self-moving device to move and work along the nearby boundary according to the parameters of the near-boundary straight line 312, and predicts its subsequent movement and work according to the parameters of the far-boundary straight line 311.
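The per-sub-image line fitting described above can be sketched as follows. This is an illustrative sketch only, not the patent's implementation: the least-squares fit, the angle/offset definitions, and the point format are assumptions made for demonstration.

```python
import numpy as np

def fit_boundary_lines(boundary_points, image_height):
    """Split boundary points into a far (top) and near (bottom) half of the
    image and fit a straight line x = a*y + b to each half.

    boundary_points: (x, y) pixel coordinates on the grass/non-grass
    boundary; y grows downward, so small y = far from the device.
    Returns {"far": (angle_deg, offset), "near": (...)}, with None for a
    half in which no boundary points were found.
    """
    pts = np.asarray(boundary_points, dtype=float)
    mid = image_height / 2.0
    results = {}
    for name, mask in (("far", pts[:, 1] < mid), ("near", pts[:, 1] >= mid)):
        half = pts[mask]
        if len(half) < 2:
            results[name] = None          # boundary absent in this sub-image
            continue
        # Least-squares fit x = a*y + b (robust for near-vertical boundaries).
        a, b = np.polyfit(half[:, 1], half[:, 0], 1)
        angle = np.degrees(np.arctan(a))  # deviation from the image's vertical axis
        offset = a * image_height + b     # x position at the bottom image edge
        results[name] = (angle, offset)
    return results

# A perfectly vertical boundary at x = 100: zero angle, offset 100 in both halves.
pts = [(100, y) for y in range(0, 200, 10)]
lines = fit_boundary_lines(pts, image_height=200)
```

The near line's parameters would steer the edgewise motion, while the far line's parameters serve the prediction step.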
In the edgewise mode, the control module 101 controls the self-moving device to move and work along the working area boundary according to the parameters acquired from each sub-image, and judges in real time whether the working area boundary is lost. Specifically, in the edgewise mode, the control module 101 is further configured to: judge in real time whether the current working area boundary is lost; when it is lost, control the moving module to move automatically to search for the working area boundary; and when it is not lost, control the self-moving device to continue moving and working along the current working area boundary.
Wherein, judging in real time whether the current working area boundary is lost includes: counting the proportion of the target object to the non-target object on the two sides of the current working area boundary; when the proportion is within a preset range, judging that the current working area boundary is not lost; and when the proportion is outside the preset range, judging that the current working area boundary is lost. In this embodiment, the working area boundary is a grass boundary, grass may be selected as the target object, and whether the current grass boundary is lost can be judged by counting the ratio of grass to non-grass on the two sides of the boundary. Specifically, in the edgewise mode, the control module 101 is further configured to: count the ratio of grass to non-grass on the two sides of the current grass boundary in real time; when the ratio is within a preset range, judge that the current grass boundary is not lost, and control the self-moving device to continue moving along the current grass boundary and mowing; when the ratio is outside the preset range, judge that the current grass boundary is lost, and control the moving module to move automatically to search for the grass boundary. Of course, the above is only one method for judging whether the working area boundary is lost; in other embodiments, the judgment may be made by other methods.
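The ratio test above can be sketched in a few lines. The strip sampling, the 0/1 label encoding, and the (0.5, 2.0) thresholds are illustrative assumptions; the patent only fixes the idea of a preset range.

```python
def boundary_lost(strip_labels, ratio_range=(0.5, 2.0)):
    """Judge boundary loss from a strip of per-pixel classes straddling the
    fitted boundary line (1 = grass, 0 = non-grass).  With a genuine
    boundary, grass and non-grass each cover roughly half the strip, so the
    grass/non-grass ratio stays near 1; a ratio outside the preset range
    means the boundary is lost."""
    grass = sum(strip_labels)
    non_grass = len(strip_labels) - grass
    if non_grass == 0:
        return True  # all grass: the strip no longer straddles a boundary
    ratio = grass / non_grass
    return not (ratio_range[0] <= ratio <= ratio_range[1])

# Half grass, half non-grass: the boundary is still being tracked.
tracking = not boundary_lost([1] * 50 + [0] * 50)
```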
When the working area boundary is lost, controlling the moving module to move automatically to find the working area boundary may specifically include: when the current working area boundary is lost, controlling the moving module to rotate by a certain angle to search for the working area boundary; and if the working area boundary is not found by rotating, controlling the moving module to move according to a preset edge-finding logic to search for it. Specifically, when the moving module moves according to the preset edge-finding logic, it may first search within a range of two machine-body lengths inside the working area, and if the boundary is still not found, search within a larger range. Of course, the above is only one searching method after the working area boundary is lost; in other embodiments, the boundary may be searched for by other methods.
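The rotate-then-widen search can be expressed as a maneuver plan. The rotation angles and the doubling factor are illustrative assumptions; the text only fixes the two-machine-body first search range.

```python
def boundary_search_plan(body_length_m, rotations_deg=(60, -120, 180)):
    """Maneuver sequence used when the working-area boundary is lost: rotate
    in place through a few candidate angles, then sweep within a radius of
    two machine bodies, then within a doubled radius if still not found."""
    plan = [("rotate", a) for a in rotations_deg]
    first_range = 2 * body_length_m           # search within two machine bodies
    plan.append(("sweep", first_range))
    plan.append(("sweep", 2 * first_range))   # widen the search if needed
    return plan

plan = boundary_search_plan(0.5)
```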
In the edge-finding mode, the control module 101 controls the self-moving device to move and mow within the working area boundary according to the information in the environment image. Specifically, the control module 101 is configured to: control the image acquisition device to acquire an environment image, and judge whether a working area boundary exists in it; when no working area boundary exists in the environment image, control the moving module to move and work according to a preset moving logic; when a working area boundary exists in the environment image, fit the working area boundary in the environment image into a straight line, generate the parameters of the line, and control the self-moving device to move and work within the working area boundary according to the parameters. In the edge-finding mode, the self-moving device 100 can cut randomly within the working area boundary or along a planned path, for example, a bow-shaped (boustrophedon) path planned using inertial navigation and odometry.
As shown in fig. 4, the environment image captured by the image capturing device has a blind zone: within a distance A in front of the self-moving device, the image capturing device cannot see the ground. Therefore, when a grass boundary appears in the environment image, it does not mean that the self-moving device has already reached the grass boundary; at that moment, the distance from the self-moving device to the grass boundary is at least A. For example, as shown in fig. 4, when the distance from the self-moving device to the grass boundary is calculated from the environment image as B, the actual distance is B + A. Therefore, in order to cut closer to the edge, when the control module 101 calculates from the environment image that the self-moving device has approached the working area boundary, it controls the self-moving device to keep moving a certain distance in the original moving direction. This distance may be preset, for example at the factory; a preset distance smaller than or equal to the blind-zone distance A allows the self-moving device to cut close to the edge while staying within the working area. Of course, the distance may also be generated during the current operation or obtained in other ways.
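The blind-zone arithmetic above amounts to two one-line formulas; the helper names below are hypothetical, introduced only to make the fig. 4 relationship explicit.

```python
def actual_boundary_distance(image_distance_b, blind_zone_a):
    """True distance to the boundary: the camera cannot see the ground
    within the blind-zone distance A in front of the device, so a distance
    of B read from the image corresponds to B + A on the ground (fig. 4)."""
    return image_distance_b + blind_zone_a

def edge_overrun(blind_zone_a, preset=None):
    """Distance to keep moving after the image reports the boundary reached;
    capping it at A lets the device cut to the edge without crossing it."""
    return blind_zone_a if preset is None else min(preset, blind_zone_a)
```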
Specifically, in the edge-finding mode, when a working area boundary exists in the environment image, controlling the self-moving device to move and work within the working area boundary according to the parameter includes: and when the self-moving equipment is judged to be close to the boundary of the working area according to the environment image, controlling the self-moving equipment to continuously move for a preset distance along the original direction, and then adjusting the moving direction of the self-moving equipment.
As shown in fig. 10, the working area includes at least two sub-working areas. The control module 101 further judges whether the current sub-working area has been completely cut; if so, the self-moving device enters the next sub-working area, and if not, it continues cutting the current sub-working area. Because the self-moving device has both the edgewise mode and the edge-finding mode, the control module 101 may control it to first complete the work tasks on the boundaries of all sub-working areas in the edgewise mode, and then complete the work tasks within the sub-working areas in the edge-finding mode. Alternatively, it may switch between the edgewise mode and the edge-finding mode to complete the work tasks on and within the boundary of one sub-working area, then enter the next sub-working area and switch between the two modes again to complete the work tasks on and within its boundary.
Taking as an example the case where the edgewise work task and the in-boundary work task of each sub-working area are completed in turn, with the boundary task performed first in the edgewise mode and the in-boundary task then performed in the edge-finding mode, the control module 101 judges in real time or at intervals whether the cutting of the working area boundary is completed; if not, the cutting continues, and if the current boundary is completely cut, the self-moving device 100 is controlled to enter the edge-finding mode to cut the area within the boundary of the sub-working area. After the area within the boundary of the sub-working area is completely cut, the self-moving device moves on to the next sub-working area.
Whether the work task of a working area is completed may be judged by setting a reference object, for example by laying a magnetic stripe on the working area boundary, or by providing a positioning device on the self-moving device.
As shown in fig. 10, the working area includes at least two sub-working areas, and the self-moving device further includes a reference object detection module for detecting a reference object; the control module 101 controls the self-moving device to automatically move and cut within the at least two sub-working areas according to the information detected by the reference object detection module. Specifically, after the self-moving device finishes working in one sub-working area, the control module 101 controls it to search for the reference object along the edge according to the environment image collected by the image acquisition device, and when the reference object detection module detects the reference object, the control module 101 controls the self-moving device to enter the next sub-working area according to the detected information. The work of a sub-working area may be completed in the edgewise mode alone, in the edge-finding mode alone, or in both the edgewise mode and the edge-finding mode.
As shown in fig. 10, the automatic working system includes at least two sub-working areas; in this embodiment, the sub-working areas are referred to as a first sub-area 11, a second sub-area 12, a third sub-area 13, and a fourth sub-area 14. The automatic working system further comprises a plurality of magnetic stripes 15 connecting the sub-areas: a first magnetic stripe 151 connecting the first sub-area 11 with the second sub-area 12, a second magnetic stripe 152 connecting the second sub-area 12 with the third sub-area 13, and a third magnetic stripe 153 connecting the third sub-area 13 with the fourth sub-area 14. The control module 101 judges from the magnetic stripes 15 whether the self-moving device 100 has completely cut the corresponding boundary; for example, the control module 101 controls the self-moving device 100 to start cutting from the first magnetic stripe 151, and when the first magnetic stripe 151 is detected again, the boundary cutting of the first sub-area 11 is complete. The other sub-areas may be checked for completion in the same way.
After the cutting of the boundary of the first sub-area is completed, the self-moving device 100 can directly move from the boundary of the first sub-area 11 to the boundary of the second sub-area 12 through the first magnetic stripe 151 to complete the work of the boundary of the second sub-area 12, and similarly, move from the boundary of the second sub-area 12 to the boundary of the third sub-area 13 through the second magnetic stripe 152, and move from the boundary of the third sub-area 13 to the boundary of the fourth sub-area 14 through the third magnetic stripe 153. Of course, in other embodiments, the self-moving device 100 may also implement the detection of whether the boundary is completely cut, and the switching between the two sub-areas, for example, by means of identification of a mark such as a two-dimensional code or by means of a positioning device provided on the self-moving device 100.
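The stripe-based schedule above can be written out explicitly. This is a sketch only: the stripe identifiers and action names are placeholders, not the patent's numbering or interface.

```python
def boundary_schedule(stripe_ids):
    """Boundary-cutting schedule over chained sub-areas (fig. 10): start
    cutting a sub-area's boundary at its connecting magnetic stripe, treat
    re-detecting that same stripe as 'boundary complete', then cross the
    stripe into the next sub-area."""
    schedule = []
    for stripe in stripe_ids:
        schedule.append(("cut_boundary_from", stripe))
        schedule.append(("redetect_and_cross", stripe))
    return schedule

plan = boundary_schedule(["stripe_1", "stripe_2", "stripe_3"])
```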
Of course, after the boundary of the first sub-area 11 is cut, the self-moving device 100 may first cut within the boundary of the first sub-area 11, and only after the cutting within that boundary is completed, search for the magnetic stripe along the edge and enter the boundary of the second sub-working area 12 through the first magnetic stripe 151, thereby completing the work on and within the boundary of the second sub-working area 12 in turn; the other sub-working areas are completed similarly. In each sub-working area, the order of executing the work on and within the boundary may be set according to the actual situation; for example, the work within the working area boundary may be executed first, followed by the work on the working area boundary.
Specifically, in the edgewise mode and the edge-finding mode, a specific means for processing the environment image to analyze whether the work area boundary exists in the environment image may be selected according to actual conditions. For example, in the embodiment, in the edgewise mode, the self-moving device 100 may sequentially perform distortion correction, image segmentation, perspective transformation, and other processing on the environment image to generate a plurality of data points representing the boundaries of the working area, then segment the environment image into N sub-images, then fit the boundary data points in each sub-image into a straight line, calculate parameters representing the straight line, for example, calculate the angle and offset of the straight line, and then control the movement and operation of the self-moving device according to the parameters. Of course, in the edgewise mode, the environment image may be directly divided into N sub-images, and then each sub-image may be processed by distortion correction, image division, perspective transformation, line fitting, and other means in sequence to generate parameters of each fitting line. Of course, the above processing method for the environment image is only an example, and other processing means in the art may be adopted.
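The N-way division step in the pipeline above can be sketched as a simple horizontal split. Only the split is shown; undistortion, segmentation, and perspective transform are assumed to have been applied already, and the helper name is hypothetical.

```python
import numpy as np

def split_into_subimages(image, n):
    """Divide the environment image into N horizontal bands along the
    far-to-near direction (top rows = far from the device), the step that
    precedes per-band line fitting in the edgewise mode."""
    if not 2 <= n <= 8:                    # range given in the claims
        raise ValueError("N must satisfy 2 <= N <= 8")
    h = image.shape[0]
    edges = np.linspace(0, h, n + 1, dtype=int)
    return [image[edges[i]:edges[i + 1]] for i in range(n)]

bands = split_into_subimages(np.zeros((200, 100)), 4)
```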
As shown in fig. 4, the image capturing device is disposed on the self-moving device 100; its mounting angle is 70-150 degrees, its mounting height (ground clearance) H is 10-40 cm (for example 14-15 cm, 20 cm, or 30 cm), and the angle α is 20-90 degrees, where the distance D refers to the distance the image capturing device can see, that is, the ground distance covered by the captured image. One, two, or more image capturing devices may be provided.
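A flat-ground estimate relates the mounting height to the blind-zone distance A and the visible distance D in fig. 4. This is a pinhole-camera sketch under stated assumptions: the text gives H and the angles only as ranges, and mapping the view-edge depression angles to its α is an assumption made for illustration.

```python
import math

def ground_coverage(height_m, near_edge_deg, far_edge_deg):
    """Estimate the blind-zone distance A and the farthest visible ground
    distance D from the camera height and the depression angles of the near
    and far edges of its field of view (flat ground assumed)."""
    blind_a = height_m / math.tan(math.radians(near_edge_deg))
    far_d = height_m / math.tan(math.radians(far_edge_deg))
    return blind_a, far_d

# e.g. camera 20 cm up, near view edge 45 deg below horizontal, far edge 10 deg
A, D = ground_coverage(0.2, 45.0, 10.0)
```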
As shown in fig. 5 and fig. 6, in an embodiment, the self-moving device includes at least two image capturing devices 140: an edgewise image capturing device 141 for capturing image information in the edgewise mode, and an edge-finding image capturing device 142 for capturing image information in the edge-finding mode. In the edgewise mode, the control module 101 controls the movement and cutting of the self-moving device according to the environment image acquired by the edgewise image capturing device; in the edge-finding mode, it controls the movement and cutting according to the environment image acquired by the edge-finding image capturing device.
As shown in fig. 5, in the edgewise mode the moving direction of the self-moving device 100 is parallel to the extending direction of the boundary, that is, the device moves along the edge. Because the angle of view of the edgewise image capturing device 141 is limited, and in order to enable it to capture the boundary image in real time without losing it, the edgewise image capturing device 141 is disposed on the side of the self-moving device 100 close to the boundary. To avoid its forward view being blocked, it is generally disposed toward the front of the moving direction of the self-moving device 100 and close to the boundary side. Specifically, the edgewise image capturing device 141 may be disposed in front of the moving direction and within a distance S1 from the side of the self-moving device close to the boundary, where S1 may be determined by the field of view of the edgewise image capturing device; for example, S1 is 0-5 cm in this embodiment, though in other embodiments S1 may be selected in other ranges according to the actual situation. In the height direction, the mounting height of the edgewise image capturing device does not exceed 20 cm above the ground.
As shown in fig. 6, in the edge-finding mode, the edge-finding image capturing device 142 captures image information of the environment in which the self-moving device 100 moves, and the control module 101 controls the self-moving device 100 to move and cut within the boundary according to the captured information. As shown in fig. 6, in the edge-finding mode the self-moving device 100 moves perpendicular to the straight line on which the boundary lies; when the edge-finding image capturing device 142 captures the boundary image, the self-moving device continues forward by the preset distance L and then turns, to prevent the self-moving device 100 from driving out of the boundary. The edge-finding image capturing device 142 is disposed within a distance S2 from the central axis of the self-moving device, where S2 may be determined by its field of view, to ensure that it can capture the boundary image in real time while the self-moving device 100 moves toward the boundary. For example, in this embodiment S2 is 0-4 cm, though in other embodiments S2 may be selected in other ranges according to the actual situation. In the height direction, the mounting height of the edge-finding image capturing device does not exceed 20 cm above the ground.
In summary, as shown in fig. 5 and fig. 6: in the left-right direction of the self-moving device, the edgewise image capturing device is disposed on the side of the self-moving device close to the working area boundary, and the edge-finding image capturing device is disposed near the center of the self-moving device; in the front-back direction, both are disposed toward the front of the advancing direction; in the height direction, the mounting heights of both do not exceed 20 cm above the ground; and in the left-right direction, the distance S1 from the edgewise image capturing device to the side of the self-moving device close to the working area boundary is in the range of 0-5 cm, while the distance S2 from the edge-finding image capturing device to the central axis of the self-moving device is in the range of 0-4 cm.
Of course, in another embodiment there may be only one image capturing device; for example, a movable image capturing device may be provided which is in a first state when the self-moving device 100 is in the edgewise mode, and in a second state, different from the first, when the self-moving device 100 is in the edge-finding mode, so as to suit the image capturing angle required by the current mode.
Alternatively, when the mounting range of the edge-finding image capturing device 142 and that of the edgewise image capturing device 141 overlap, only one image capturing device may be provided, disposed in the overlapping area so that its capturing angle meets the requirements of both the edge-finding mode and the edgewise mode. For example, in one embodiment, the image capturing device is disposed more than 30 cm above the ground, at a distance S0 from the side of the self-moving device close to the boundary (e.g., S0 is 1/4-1/3 of the width of the self-moving device); it can then serve as both the edgewise image capturing device and the edge-finding image capturing device. Of course, the above is only an example, and in other embodiments other numbers of image capturing devices 140 may be provided.
In this embodiment, the image acquisition device may also be used to acquire an environmental image, and the control module may determine whether an obstacle exists in front of the mobile device according to the environmental image acquired by the image acquisition device, and control the mobile device to automatically avoid the obstacle.
Furthermore, the control module acquires the type of the obstacle in the environment image according to the environment image acquired by the image capturing device, and controls the self-moving device to execute the action corresponding to the current obstacle type. Specifically, the control module may automatically identify the type of the obstacle ahead in the environment image and, according to the identified type, control the self-moving device to execute the corresponding action. Alternatively, after the environment image is collected it may be sent to the cloud, where the obstacle type is identified; the identification result is sent back to the self-moving device, the control module obtains it, and controls the self-moving device to execute the action corresponding to the current obstacle type. In one embodiment, the obstacle types may include humans, animals, contactable obstacles, non-contactable obstacles, and the like, where animals include cats, dogs, hedgehogs, etc.; contactable obstacles include houses, trees, flower beds, fences, curbs, etc.; non-contactable obstacles include ponds, roads, etc.
When the control module obtains the identification result that the obstacle ahead is a person, it controls the self-moving device to turn, turn around, or stop moving so as to avoid hitting the person. Of course, when a person ahead is identified, the self-moving device may also decelerate first, so that a sudden turn or U-turn at speed does not cause a collision. In one embodiment, when the obstacle ahead is identified as a person, the self-moving device may also interact with the person, specifically by voice, for example by playing a warning such as "danger, please keep clear"; of course, the interaction may also take other forms, such as flashing warning lights. After interacting with the person, if the person is identified as having left, the self-moving device is controlled to continue forward; if not, it is controlled to turn, turn around, or stop moving so as to avoid hitting the person.
When the control module obtains the identification result that the obstacle ahead is an animal, for example a cat, dog, or hedgehog, it may control the self-moving device to decelerate and/or emit a warning voice or light to drive the animal away; if this fails, it may control the self-moving device to turn or turn around to avoid the animal. Of course, if decelerating and the non-contact driving by voice or light are both ineffective, the self-moving device may further be controlled to bump the obstacle ahead at low speed to further drive the animal away. After the animal has been driven away in this manner, if it is identified as having left, the self-moving device is controlled to continue forward; if not, it is controlled to turn or turn around to avoid the animal. In some embodiments, to further ensure the safety of animals, the self-moving device further includes a collision sensor: when the self-moving device bumps the obstacle ahead, the collision sensor detects the collision, and the control module controls the self-moving device to turn or turn around to avoid the animal.
When the control module obtains the identification result that the obstacle ahead is a contactable obstacle, for example a house, tree, or flower bed, it controls the self-moving device to approach the obstacle so as to cut the cuttable area as completely as possible, and to turn or turn around when close to the contactable obstacle. To further ensure safety, when a contactable obstacle such as a house, tree, or flower bed is identified ahead, the self-moving device first decelerates to avoid hitting the obstacle at excessive speed, and drives at reduced speed until it approaches the obstacle, then turns or turns around to avoid it. To enable the self-moving device to cut as close as possible to a contactable obstacle and thus cut the whole cuttable area, in some embodiments the self-moving device further includes a collision sensor: when a contactable obstacle such as a house, tree, or flower bed is identified ahead, the self-moving device is controlled to decelerate and advance at low speed until it bumps the obstacle; the collision sensor detects the collision, and the control module controls the self-moving device to turn or turn around. On the one hand, the area near the obstacle is cut as completely as possible; on the other hand, the low-speed collision ensures that the contactable obstacle is not damaged.
When the control module obtains the identification result that the obstacle ahead is a non-contactable obstacle, for example a pond or a road, it controls the self-moving device to drive to a position near the obstacle and to turn or turn around before reaching it, so as to avoid contacting the non-contactable obstacle, for example rushing into a pond or onto a road. To further ensure safety, when a non-contactable obstacle such as a pond or road is identified ahead, the self-moving device decelerates as it approaches the obstacle and then turns or turns around, avoiding contact with the obstacle and avoiding rushing into or hitting it at excessive speed.
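The per-type motion strategies above reduce to a dispatch table. The class names and action strings below are illustrative placeholders, not the patent's literal interface.

```python
def obstacle_action(kind, target_left=False, driving_failed=False):
    """Dispatch for the four obstacle classes described above: human,
    animal, contactable (house, tree, flower bed, fence, curb) and
    non-contactable (pond, road)."""
    if kind == "human":
        return "continue" if target_left else "decelerate_warn_then_avoid"
    if kind == "animal":
        if target_left:
            return "continue"
        return "turn_or_u_turn" if driving_failed else "decelerate_and_drive_away"
    if kind == "contactable":
        return "decelerate_approach_then_turn"
    if kind == "non_contactable":
        return "decelerate_turn_before_contact"
    return "default_avoid"
```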
In the above embodiments in which the control module controls the self-moving device to adopt different motion strategies for different obstacles ahead, the control module should take the blind-zone distance A shown in fig. 4 into account when controlling the self-moving device to approach the obstacle, so that the self-moving device cuts as close to the obstacle as possible and the cuttable lawn area is cut completely.
In the process of identifying the obstacle in the environment image, the obstacle type may be obtained by performing image detection, image segmentation, image classification, and the like on the environment image. These means may extract features automatically based on a neural network, or extract manually designed features in the traditional way. The neural network model may run on the self-moving device to obtain the identification result, or be deployed in the cloud and obtain the result through cloud computation.
In this embodiment, the corresponding motion strategies are illustrated only with the above four obstacle types; in other embodiments, the control module may also define other obstacle types and corresponding motion strategies, or match the four obstacle types with other motion strategies. This application does not enumerate every case.
The technical features of the embodiments described above may be arbitrarily combined, and for the sake of brevity, all possible combinations of the technical features in the embodiments described above are not described, but should be considered as being within the scope of the present specification as long as there is no contradiction between the combinations of the technical features.
The above-mentioned embodiments only express several embodiments of the present invention, and the description thereof is more specific and detailed, but not construed as limiting the scope of the invention. It should be noted that, for a person skilled in the art, several variations and modifications can be made without departing from the inventive concept, which falls within the scope of the present invention. Therefore, the protection scope of the present patent shall be subject to the appended claims.

Claims (16)

1. A self-moving device for automatically moving and working within a working area, comprising: a housing;
a moving module, positioned below the housing, for driving the housing to move;
a working module, arranged on the housing, for executing a preset working task;
the image acquisition device is used for acquiring an environment image where the self-moving equipment is located;
a control module for autonomously controlling the moving module to drive the housing to move and autonomously controlling the working module to execute the preset working task;
the self-moving device comprises an edge mode in which the control module is configured to: analyzing whether the boundary of the working area exists in the environment image or not according to the environment image acquired by the image acquisition device; when the working area boundary does not exist in the environment image, controlling the moving module to move according to a preset edge finding logic so as to find the working area boundary; when the working area boundary exists in the environment image, dividing the environment image into N sub-images, analyzing whether the working area boundary exists in each sub-image, respectively fitting the working area boundary in each sub-image with the working area boundary into a straight line, generating parameters of the straight line, and controlling the self-moving equipment to move and work along the working area boundary according to the parameters, wherein N is more than or equal to 2.
2. The self-moving device of claim 1, wherein 2 ≤ N ≤ 8.
3. The self-moving device of claim 1 or 2, wherein in the edge mode, the control module is further configured to: when the working area boundary exists in the environment image, divide the environment image into N sub-images along the direction from far to near relative to the self-moving device, control the self-moving device to move and work according to the sub-image closest to the self-moving device, and predict the subsequent movement and work of the self-moving device according to the remaining sub-images.
4. The self-moving device of claim 3, wherein in the edge mode, the control module is further configured to: when the working area boundary exists in the environment image, divide the environment image into two sub-images along the direction away from the self-moving device, control the self-moving device to move and work according to the sub-image closer to the self-moving device, and predict the subsequent movement and work of the self-moving device according to the sub-image farther from the self-moving device.
5. The self-moving device of claim 1 or 2, wherein in the edge mode, the control module is further configured to: judge in real time whether the current working area boundary is lost; when the current working area boundary is lost, control the moving module to move automatically to search for the working area boundary; and when the current working area boundary is not lost, control the self-moving device to continue moving and working along the current working area boundary.
6. The self-moving device of claim 5, wherein said judging in real time whether the current working area boundary is lost comprises: counting the proportion of target objects to non-target objects on the two sides of the current working area boundary; when the proportion is within a preset range, judging that the current working area boundary is not lost; and when the proportion is not within the preset range, judging that the current working area boundary is lost.
7. The self-moving device of claim 5, wherein said controlling the moving module to move automatically to search for the working area boundary when the current working area boundary is lost comprises: when the current working area boundary is lost, controlling the moving module to rotate by a certain angle to search for the working area boundary; and if the working area boundary is not found by rotating, controlling the moving module to move according to the preset edge-finding logic so as to find the working area boundary.
8. The self-moving device of claim 1 or 2, wherein the self-moving device further comprises an edge-detection mode, in which the control module is configured to: control the image acquisition device to acquire the environment image, and judge, according to the environment image, whether the working area boundary exists in the environment image; when the working area boundary does not exist in the environment image, control the moving module to move and work according to a preset in-boundary movement logic; and when the working area boundary exists in the environment image, fit the working area boundary in the environment image into a straight line, generate parameters of the straight line, and control the self-moving device to move and work within the working area boundary according to the parameters.
9. The self-moving device of claim 8, wherein in the edge-detection mode, said controlling the self-moving device to move and work within the working area boundary according to the parameters when the working area boundary exists in the environment image comprises: when it is judged from the environment image that the self-moving device is close to the working area boundary, controlling the self-moving device to continue moving a certain distance in the original direction and then adjusting the moving direction of the self-moving device.
10. The self-moving device of claim 1 or 2, wherein the self-moving device is an automatic lawn mower that automatically moves and mows on a lawn, the working module is a mowing module for performing a mowing task, and the working area boundary is the lawn boundary.
11. The self-moving device of claim 1 or 2, wherein the image acquisition device comprises an edgewise image acquisition device and an edge-detection image acquisition device; in the edge mode, the control module controls the movement and cutting of the self-moving device according to the environment image acquired by the edgewise image acquisition device; and in the edge-detection mode, the control module controls the self-moving device to move and cut according to the environment image acquired by the edge-detection image acquisition device.
12. The self-moving device of claim 11, wherein, in the left-right direction of the self-moving device, the edgewise image acquisition device is disposed on the side of the self-moving device close to the working area boundary, and the edge-detection image acquisition device is disposed close to the center of the self-moving device.
13. The self-moving device of claim 11, wherein, in the front-rear direction, the edgewise image acquisition device and the edge-detection image acquisition device are both disposed toward the front of the self-moving device; in the height direction, the installation heights of the edgewise image acquisition device and the edge-detection image acquisition device are no more than 20 cm above the ground; and in the left-right direction, the distance S1 from the edgewise image acquisition device to the side of the self-moving device close to the working area boundary is in the range of 0-5 cm, and the distance S2 from the edge-detection image acquisition device to the central axis of the self-moving device is in the range of 0-4 cm.
14. The self-moving device as claimed in claim 1 or 2, further comprising a reference object detection module for detecting a reference object, wherein the control module controls the self-moving device to automatically move and cut within at least two sub-working areas according to information detected by the reference object detection module.
15. The self-moving device of claim 14, wherein, after the self-moving device finishes working in one sub-working area, the control module controls the self-moving device to move along the boundary to search for the reference object according to the environment image collected by the image acquisition device, and when the reference object detection module detects the reference object, the control module controls the self-moving device to enter the next sub-working area according to the information detected by the reference object detection module.
16. The self-moving device of claim 1, wherein, according to the type of obstacle identified in the environment image, the control module controls the self-moving device to perform the action corresponding to the current obstacle type, different obstacle types corresponding to different actions.
CN202010642119.4A 2020-01-06 2020-07-06 Self-moving equipment Active CN113156929B (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
PCT/CN2020/129821 WO2021139414A1 (en) 2020-01-06 2020-11-18 Self-moving device
PCT/CN2021/070477 WO2021139683A1 (en) 2020-01-06 2021-01-06 Self-moving device
CN202180005855.1A CN114868095A (en) 2020-01-06 2021-01-06 Self-moving equipment

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
CN2020100112002 2020-01-06
CN202010011200 2020-01-06
CN2020105658145 2020-06-19
CN202010565814 2020-06-19

Publications (2)

Publication Number Publication Date
CN113156929A true CN113156929A (en) 2021-07-23
CN113156929B CN113156929B (en) 2024-02-23

Family

ID=76882224

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010642119.4A Active CN113156929B (en) 2020-01-06 2020-07-06 Self-moving equipment

Country Status (1)

Country Link
CN (1) CN113156929B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116088533A (en) * 2022-03-24 2023-05-09 未岚大陆(北京)科技有限公司 Information determination method, remote terminal, device, mower and storage medium
WO2023198029A1 (en) * 2022-04-14 2023-10-19 苏州宝时得电动工具有限公司 Automatic mower

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104111651A (en) * 2013-04-22 2014-10-22 苏州宝时得电动工具有限公司 Automatic walking equipment and method for automatic walking equipment to return to stop station
CN106910198A (en) * 2017-02-21 2017-06-30 昂海松 A kind of boundary determining method of hay mower without electric wire fence
CN108629292A (en) * 2018-04-16 2018-10-09 海信集团有限公司 It is bent method for detecting lane lines, device and terminal
US20190155301A1 (en) * 2017-11-07 2019-05-23 Stocked Robotics, Inc. Camera based system for determining robot heading in indoor environments
CN110347153A (en) * 2019-06-26 2019-10-18 深圳拓邦股份有限公司 A kind of Boundary Recognition method, system and mobile robot
CN110580047A (en) * 2019-09-17 2019-12-17 尚科宁家(中国)科技有限公司 anti-falling traveling method of autonomous robot and autonomous robot

Also Published As

Publication number Publication date
CN113156929B (en) 2024-02-23

Similar Documents

Publication Publication Date Title
CN112584697B (en) Autonomous machine navigation and training using vision system
US11161235B2 (en) Self-moving robot
CN107402573B (en) Automatic working system, automatic moving equipment and control method thereof
CN102771246B (en) Intelligent mower system and intelligent mowing method thereof
CN102662400A (en) Path planning algorithm of mowing robot
CN113156929B (en) Self-moving equipment
US11853063B2 (en) Outdoor power equipment machine with presence detection
US20210185906A1 (en) Autonomous travel work machine
US20230270044A1 (en) Robotic mower having multiple operating modes
AU2024201982A1 (en) Barrier passage system for autonomous working machine
CN114868095A (en) Self-moving equipment
CN111413977A (en) Path planning system and method of mowing robot based on machine vision
JP7184920B2 (en) Autonomous work machine
CN110727270A (en) Automatic working system and method for establishing working area control map thereof
CN113068506B (en) Greenhouse orchard mowing robot and control method thereof
CN114937258B (en) Control method for mowing robot, and computer storage medium
CN211698708U (en) Automatic working system
WO2021139685A1 (en) Automatic operation system
CN117032227A (en) Self-moving equipment, obstacle detection method and obstacle detection module thereof
CN215302577U (en) Multifunctional obstacle-clearing weeding and pesticide-spraying crawler robot
CN114291083B (en) Self-moving device control method, device, system, medium and self-moving device
EP4332716A2 (en) Mapping objects encountered by a robotic garden tool
CN112486157B (en) Automatic working system, steering method thereof and self-moving equipment
KR20230105063A (en) Lawn mower system with markers to set working areas outdoors
WO2023146451A1 (en) Improved operation for a robotic work tool system

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant