WO2016000622A1 - Automatic walking device - Google Patents

Automatic walking device

Info

Publication number
WO2016000622A1
WO2016000622A1 (PCT/CN2015/083100)
Authority
WO
WIPO (PCT)
Prior art keywords
image
walking device
real
standard
automatic
Prior art date
Application number
PCT/CN2015/083100
Other languages
English (en)
French (fr)
Inventor
盛晓初
孙根
邵勇
Original Assignee
苏州宝时得电动工具有限公司
Priority date
Filing date
Publication date
Priority claimed from CN201410312406.3A
Priority claimed from CN201410311780.1A
Priority claimed from CN201410386482.9A
Priority claimed from CN201510003318.XA
Application filed by 苏州宝时得电动工具有限公司
Publication of WO2016000622A1

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course or altitude of land, water, air, or space vehicles, e.g. automatic pilot
    • G05D1/12 Target-seeking control
    • H ELECTRICITY
    • H02 GENERATION; CONVERSION OR DISTRIBUTION OF ELECTRIC POWER
    • H02J CIRCUIT ARRANGEMENTS OR SYSTEMS FOR SUPPLYING OR DISTRIBUTING ELECTRIC POWER; SYSTEMS FOR STORING ELECTRIC ENERGY
    • H02J7/00 Circuit arrangements for charging or depolarising batteries or for supplying loads from batteries

Definitions

  • the present invention relates to an automatic walking device, and more particularly to an automatic walking device capable of automatically aligning a target object, and to a docking system corresponding to the automatic walking device.
  • Automatic walking equipment, such as an automatic lawn mower or automatic vacuum cleaner, can work without manual operation, mowing or vacuuming while the user is at work or at leisure, which brings great convenience to the user.
  • Autonomous walking equipment typically travels within a predetermined work area and returns to a particular territory (such as a docking station) to replenish energy when the battery is low, or upon completion of work or when it rains.
  • Existing automatic walking equipment generally returns to the docking station along the boundary line in a predetermined direction, so the regression efficiency is low. Since the charging terminal of the docking station faces a specific direction, if the automatic walking device is simply controlled to head straight for the docking station from an arbitrary position, it may fail to dock successfully.
  • In one known positioning scheme, an intelligent lawn mower locates the charging station by means of colored strip marks arranged at certain positions on the station. The mower analyzes whether the image captured by its on-board camera contains matching colored strips. If so, the charging station is within the field of view; the relative position of the mower and the charging station is calculated, and motion control guides the mower into the station. If not, the charging station is not within visible range, and the mower continues to travel in its original mode.
  • This solution has certain drawbacks: recognition of the mark is most reliable when the camera's line of sight is perpendicular to the marking plane, and grows weaker the farther the line of sight deviates from that perpendicular. When the autonomous walking device is to the left of, to the right of, or behind the marking plane, the mark cannot be recognized effectively.
  • The present invention provides an automatic walking apparatus capable of automatically aligning with a target object and, by doing so, efficiently returning to a charging station bearing the target object, or to any territory bearing one.
  • An automatic walking device comprises: an image collecting device for collecting real-time images; and a control module connected to the image collecting device for controlling the operation of the automatic walking device.
  • The control module comprises: an image recognition unit, configured to identify whether a target object appears in the real-time image collected by the image collecting device; and an offset determining unit, which compares the real-time image of the target object collected by the image collecting device with the standard image of the target object that would be collected when the automatic walking device is in a standard direction, and determines the offset of the automatic walking device relative to the standard direction according to the change in shape of the real-time image relative to the standard image.
  • The standard direction is the direction perpendicular to the surface on which the target object is located.
  • The offset determining unit determines the change in shape of the real-time image relative to the standard image by extracting the shape features of the real-time image of the target object and of the standard image.
  • The shape feature includes the ratio between the length of a specific edge on one side of the polygonal image's outer contour and the length of a specific edge on the other side.
  • The determination is as follows: if the ratio of the length of the specific edge on one side of the real-time image's outer contour to that of the specific edge on the other side is smaller than the corresponding ratio in the standard image, the automatic walking device is offset toward the other side relative to the standard direction; if the ratio is greater, the device is offset toward the one side; if the ratio is equal, the device is in the standard direction.
  • In the standard image, the ratio of the two specific edge lengths equals 1. The determination then simplifies to: if the ratio in the real-time image is less than 1, the automatic walking device is offset toward the other side relative to the standard direction; if greater than 1, it is offset toward the one side; if equal to 1, the device is in the standard direction.
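As an illustrative sketch (not part of the disclosure), the ratio-based judgment above can be expressed in a few lines. The function name, the tolerance band, and the left/right labeling are assumptions; the patent only specifies the three-way comparison against the standard ratio.

```python
def offset_from_ratio(left_len: float, right_len: float,
                      standard_ratio: float = 1.0, tol: float = 0.05) -> str:
    """Classify the device's deviation from the standard direction.

    left_len / right_len are the pixel lengths of the two specific edges of
    the target's outer contour in the real-time image.  Viewed head-on, the
    ratio equals standard_ratio; perspective skew shrinks the edge that is
    farther from the camera.  The tolerance `tol` is an illustrative choice.
    """
    ratio = left_len / right_len
    if abs(ratio - standard_ratio) <= tol:
        return "on standard direction"
    if ratio < standard_ratio:
        # left edge appears shorter: device offset toward the right side
        return "offset toward right"
    return "offset toward left"
```

A control loop would call this on each frame and steer until the result is "on standard direction".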
  • the control module adjusts the walking direction of the automatic walking device until the walking direction thereof is aligned with the standard direction.
  • The control module controls the automatic walking device to walk toward the standard direction until it reaches the target object.
  • The control module further includes a distance calculating unit that calculates the distance between the automatic walking device and the target object; before the offset relative to the standard direction is determined, the control module controls the automatic walking device to walk to within a predetermined distance of the target object.
  • The control module further includes a regression direction confirming unit; before the offset determining unit performs the determination relative to the standard direction, the regression direction confirming unit determines and controls the orientation of the automatic walking device so that it points toward the target object.
  • the regression direction confirming unit determines that the walking direction of the automatic walking device is toward the target object.
  • The image recognition unit recognizes whether the real-time image collected by the image collecting device contains the target object by comparing the real-time image with the color features of the standard image of the target object.
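A minimal sketch of such a color-feature comparison, assuming coarsely quantized pixels and a simple coverage threshold (both illustrative choices not specified in the text):

```python
from collections import Counter

def matches_target(pixels, target_colors, threshold=0.3):
    """Decide whether a frame contains the target by its color signature.

    pixels: iterable of (r, g, b) tuples after coarse quantization;
    target_colors: the set of colors making up the target marker.
    Returns True when at least `threshold` of the pixels belong to the
    target's color set (a deliberately simple stand-in for the patent's
    color-feature comparison).
    """
    counts = Counter(pixels)
    total = sum(counts.values())
    hit = sum(n for c, n in counts.items() if c in target_colors)
    return total > 0 and hit / total >= threshold
```

In practice a histogram comparison in a perceptual color space would be more robust, but the thresholded coverage test captures the idea.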
  • An automatic walking device comprises: an image collecting device for collecting real-time images; a control module connected to the image collecting device for controlling the operation of the automatic walking device; the control module comprising: an image recognition unit for recognizing the image Whether the target object appears in the real-time image collected by the acquisition device; the offset determination unit, the shape feature of the real-time image of the target object collected by the image acquisition device and the target object that can be collected when the automatic walking device is in a standard direction The shape features of the standard image are compared, and the offset of the autonomous walking device with respect to the standard direction is judged based on the change of the shape of the real-time image with respect to the standard image.
  • the invention also provides a docking system for an automatic walking device, which can achieve fast and accurate docking.
  • A docking system for an automatic walking device comprises: a charging station for providing electric energy to the automatic walking device; a target object for guiding the automatic walking device back for charging; and the automatic walking device.
  • The automatic walking device comprises: an image collecting device for collecting real-time images; and a control module connected to the image collecting device for controlling the operation of the automatic walking device. The control module comprises: an image recognition unit, configured to identify whether the target object appears in the real-time image collected by the image collecting device; a regression direction confirming unit, used to ensure that the automatic walking device faces the target object; and an offset determining unit, which, when the image recognition unit recognizes that the real-time image captured by the image collecting device contains the target object, compares the shape features of the real-time image of the target object with the shape features of the standard image of the target object that would be collected when the automatic walking device is in a standard direction, and determines the offset of the automatic walking device relative to the standard direction according to the change in shape of the real-time image relative to the standard image.
  • The standard direction is the direction perpendicular to the surface on which the target object is located.
  • The image recognition unit identifies whether the real-time image collected by the image collecting device contains an image of the target object by comparing the real-time image with the color features of the standard image of the target object.
  • the outer contour shape of the target is rectangular and has at least two different colors.
  • The control module further includes a distance calculating unit, which calculates the distance between the automatic walking device and the target object and, before the determination relative to the standard direction is performed, controls the automatic walking device to walk to within a preset distance of the target object.
  • The shape feature includes the ratio between the length of a specific edge on one side of the polygonal image's outer contour and the length of a specific edge on the other side.
  • The determination is as follows: if the ratio of the length of the specific edge on one side of the real-time image's outer contour to that of the specific edge on the other side is smaller than the corresponding ratio in the standard image, the automatic walking device is offset toward the other side relative to the standard direction; if the ratio is greater, the device is offset toward the one side; if the ratio is equal, the device is in the standard direction.
  • the automatic walking device of the present invention can automatically align the target, thereby improving the efficiency of the automatic walking device returning to a specific territory.
  • the docking system of the automatic walking device of the invention can perform regression and docking quickly and accurately.
  • the invention also provides an automatic walking device control method and an automatic working system for improving the regression efficiency of the automatic walking device and realizing the effective docking between the automatic walking device and the docking station.
  • The technical solution of the present invention is an automatic walking device control method for controlling the automatic walking device to return to a docking station, wherein the docking station is provided with a direction identifier indicating the docking direction for docking between the automatic walking device and the docking station upon return.
  • The automatic walking device is provided with an image collecting device and a processor. The method comprises the following steps: Step S1: the processor identifies the direction identifier; Step S2: the processor determines the walking direction of the automatic walking device; Step S3: the processor controls the walking path of the automatic walking device so that its walking direction coincides with the docking direction indicated by the direction identifier; Step S4: the processor controls the automatic walking device to dock with the docking station along the docking direction indicated by the direction identifier.
  • The docking station is further provided with a docking station identifier having a specific shape or/and pattern, and the method further comprises, before step S1: Step S101: when the automatic walking device begins to return, the processor identifies the docking station identifier; Step S102: the processor determines whether the distance between the automatic walking device and the docking station identifier equals a first distance value L1; if so, the process proceeds to step S1, otherwise to step S103; Step S103: the processor controls the automatic walking device to advance toward the docking station identifier.
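Steps S101-S103 form a simple approach loop, which can be sketched as follows. The `device` interface (`see_station_marker`, `distance_to_marker`, `step_toward_marker`) is entirely hypothetical, introduced only to show the control flow:

```python
def return_to_station(device, L1):
    """Sketch of steps S101-S103: advance toward the station identifier
    until the measured distance equals the first distance value L1, then
    hand over to the alignment phase (step S1)."""
    while device.see_station_marker():          # S101: identifier in view
        if device.distance_to_marker() <= L1:   # S102: within L1 -> step S1
            return "align"
        device.step_toward_marker()             # S103: keep advancing
    return "search"                             # identifier lost: keep searching
```

The "align" return value stands for the hand-off to steps S1-S4.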
  • the docking station identifier is located at the top of the docking station, and the docking station identifier has a cylindrical shape.
  • In step S103, when the processor controls the automatic walking device to advance toward the docking station identifier, it controls the walking direction of the device so that the image corresponding to the docking station identifier always remains within a first specific area of the image acquired by the image collecting device.
  • The processor determines the distance between the automatic walking device and the docking station according to the area of the image corresponding to the docking station identifier within the acquired image.
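Under a pinhole-camera model (an assumption; the text does not name the model), the identifier's apparent area falls off with the square of the distance, so a single calibration pair fixes the scale:

```python
import math

def distance_from_area(area_px, ref_area_px, ref_distance):
    """Estimate the distance to the station from the pixel area of its
    identifier in the image.  ref_area_px is the area measured once at a
    known ref_distance; the inverse-square relation does the rest.  All
    constants are illustrative."""
    return ref_distance * math.sqrt(ref_area_px / area_px)
```

For example, an identifier that shrinks to a quarter of its calibration area is read as twice the calibration distance.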
  • Step S3 includes: Step S311: the processor controls the automatic walking device to rotate by a predetermined angle; Step S312: the processor controls the automatic walking device to travel a predetermined interval s; Step S313: the processor controls the automatic walking device to turn back toward the direction identifier; Step S314: the processor determines whether the walking direction of the automatic walking device coincides with the docking direction indicated by the direction identifier; if so, the process proceeds to step S4, otherwise it returns to step S311.
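The S311-S314 loop can be sketched as below. The `device` interface and the iteration cap are assumptions added for illustration:

```python
def align_by_stepping(device, s, max_iters=50):
    """Sketch of the S311-S314 loop: sidestep by the fixed interval s and
    turn back toward the direction identifier until the walking direction
    coincides with the docking direction."""
    for _ in range(max_iters):
        device.rotate_predetermined_angle()   # S311
        device.travel(s)                      # S312
        device.turn_toward_marker()           # S313
        if device.heading_matches_marker():   # S314: aligned -> dock (S4)
            return True
    return False                              # safety cap, not in the patent
```

Each pass moves the device sideways relative to the docking line, so repeated passes converge onto it.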
  • The docking station is further provided with a positioning identifier at the front end of the direction identifier in the docking direction, the positioning identifier having a specific shape or/and pattern.
  • Step S3 includes: Step S321: the processor identifies the positioning identifier and controls the automatic walking device to face it; Step S322: the image acquired by the image collecting device has a center line dividing the image into two parts, and the processor calculates a first angle α formed between the direction identifier and the center line in the image; Step S323: the processor calculates a second distance value L2 from the first distance value L1 and the first angle α; Step S324: the processor controls the automatic walking device to rotate by a predetermined angle; Step S325: the processor controls the automatic walking device to travel the second distance value L2; Step S326: the processor controls the automatic walking device to turn back toward the direction identifier.
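The text does not spell out the relation used in step S323, but one plausible reading is a right-triangle geometry in which L2 is the lateral leg opposite the angle α. Under that assumption:

```python
import math

def second_distance(L1, alpha_deg):
    """One possible reading of step S323: with the device at distance L1
    from the positioning identifier and the direction identifier seen at
    angle alpha to the image center line, a right triangle gives the
    lateral travel L2 = L1 * sin(alpha).  The exact relation is not stated
    in the text, so this geometry is an assumption."""
    return L1 * math.sin(math.radians(alpha_deg))
```

With L1 = 2 m and α = 30°, this reading gives L2 = 1 m of lateral travel before turning back toward the identifier.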
  • The docking station is further provided with a positioning identifier at the front end of the direction identifier in the docking direction, the positioning identifier having a specific shape or/and pattern.
  • Step S3 includes: Step S331: the processor identifies the positioning identifier and controls the automatic walking device to face it; Step S332: the image acquired by the image collecting device has a center line dividing the image into two parts, and the processor calculates a first angle α formed between the direction identifier and the center line in the image; Step S333: the processor constructs a specific triangle from the point where the automatic walking device is located, the positioning identifier, and the first angle α, and calculates from the first angle α and the first distance value L1 the other adjacent side Lx of the first angle α and the length of the opposite side L2; Step S334: the processor calculates the radius R of a circle tangent to the adjacent side Lx at a specific position and simultaneously tangent to the opposite side L2.
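The triangle in steps S333-S334 is not fully specified; a simple reading is a right triangle with legs Lx = L1·cos(α) and L2 = L1·sin(α). A circle tangent to the leg Lx at a point and tangent to the perpendicular leg L2 then has radius equal to that point's distance from the right-angle vertex. All of this geometry, including the tangent-point fraction, is an assumption for illustration:

```python
import math

def arc_plan(L1, alpha_deg, tangent_frac=0.5):
    """Sketch of steps S333-S334 under an assumed right-triangle geometry.

    Legs: Lx = L1*cos(alpha) (adjacent), L2 = L1*sin(alpha) (opposite).
    A circle tangent to Lx at distance tangent_frac*Lx from the vertex and
    tangent to the perpendicular side L2 has radius R equal to that
    distance, giving an arc the device can follow onto the docking line.
    """
    a = math.radians(alpha_deg)
    Lx = L1 * math.cos(a)
    L2 = L1 * math.sin(a)
    R = tangent_frac * Lx      # tangency point chosen at tangent_frac of Lx
    return Lx, L2, R
```

The arc of radius R lets the device merge smoothly onto the docking direction instead of turning in place.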
  • The direction identifier is a straight-line pattern, a rectangular pattern, or at least one arrow pattern.
  • The image acquired by the image collecting device has a center line dividing the image into two parts; the processor determines that the walking direction of the automatic walking device coincides with the docking direction indicated by the direction identifier when the center line overlaps or substantially overlaps the image of the direction identifier.
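The "overlaps or substantially overlaps" test reduces to a tolerance check on the identifier's horizontal position. The pixel tolerance here is an illustrative stand-in for "substantially":

```python
def heading_aligned(marker_x_px, image_width_px, tol_px=5):
    """The direction identifier overlaps the center line when its
    horizontal image position is within a small tolerance of the image
    center; tol_px models the patent's "substantially overlaps"."""
    return abs(marker_x_px - image_width_px / 2) <= tol_px
```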
  • The invention also provides an automatic working system comprising an automatic walking device and a docking station. The docking station is provided with a direction identifier indicating the docking direction in which the automatic walking device docks with the docking station upon return, and the automatic walking device is provided with an image collecting device and a processor. The processor includes: a direction identifier recognition module for identifying the direction identifier; a docking direction determining module for determining the walking direction of the automatic walking device; a regression path control module which, when the walking direction of the automatic walking device does not coincide with the docking direction indicated by the direction identifier, controls the walking path of the device so that its walking direction comes to coincide with that docking direction; and a second regression control module which, when the walking direction of the automatic walking device coincides with the docking direction indicated by the direction identifier, controls the device to dock with the docking station along that docking direction.
  • The docking station is further provided with a docking station identifier having a specific shape or/and pattern. The automatic working system further comprises: a docking station identifier recognition module for identifying the docking station identifier; a distance determining module for determining, while the automatic walking device advances toward the docking station identifier, whether the distance between the device and the identifier equals the first distance value L1; and a first regression control module for controlling the automatic walking device to advance toward the docking station identifier.
  • the docking station identifier is located at the top of the docking station, and the docking station identifier has a cylindrical shape.
  • The image acquired by the image collecting device has a first specific area. When the first regression control module controls the automatic walking device to advance toward the docking station identifier, it adjusts the walking direction of the device so that the image corresponding to the docking station identifier always remains within the first specific area of the image.
  • The distance determining module determines the distance between the automatic walking device and the docking station according to the area of the image corresponding to the docking station identifier.
  • The regression path control module includes: a first steering control module for controlling the automatic walking device to rotate by a predetermined angle in the direction indicated by the direction identifier in the image; a spacing control module for controlling the automatic walking device to travel a predetermined interval s; a second steering control module which, after the device has traveled the predetermined interval s, controls it to turn back toward the direction identifier; and a direction determining module for determining whether the walking direction of the automatic walking device coincides with the docking direction indicated by the direction identifier.
  • The docking station is further provided with a positioning identifier located at the front end of the direction identifier in the docking direction, the positioning identifier having a specific shape or/and pattern. The regression path control module comprises: a positioning identifier recognition module for identifying the positioning identifier and controlling the automatic walking device to face it; an angle calculation module which, given that the image acquired by the image collecting device has a center line dividing the image into left and right parts, calculates a first angle α formed between the direction identifier and the center line in the image; a distance calculation module for calculating the second distance value L2 from the first distance value L1 and the first angle α; a first steering control module for controlling the automatic walking device to rotate by a predetermined angle; a spacing control module for controlling the automatic walking device to travel the second distance value L2; and a second steering control module for controlling the automatic walking device to turn back toward the direction identifier.
  • The docking station is further provided with a positioning identifier located at the front end of the direction identifier in the docking direction, the positioning identifier having a specific shape or/and pattern. The regression path control module comprises: a positioning identifier recognition module for identifying the positioning identifier and controlling the automatic walking device to face it; an angle calculation module which, given that the image acquired by the image collecting device has a center line dividing the image into left and right parts, calculates a first angle α formed between the direction identifier and the center line in the image; a side length calculation module which constructs a specific triangle from the point where the automatic walking device is located, the positioning identifier, and the first angle α, and calculates from the first angle α and the first distance value L1 the other adjacent side Lx of the first angle α in the triangle and the length of the opposite side L2; a radius calculation module for calculating the radius R of a circle tangent to the adjacent side Lx at a specific position and simultaneously tangent to the opposite side L2; a first steering control module for controlling the automatic walking device to rotate by a predetermined angle toward the direction indicated by the direction identifier; and a tangent-point control module configured to control the automatic walking device along the tangent circle of radius R.
  • The direction identifier is a straight-line pattern, a rectangular pattern, or at least one arrow pattern.
  • the automatic walking device control method and the automatic working system of the invention can improve the regression efficiency of the automatic walking device and realize the effective docking between the automatic walking device and the docking station.
  • The invention also provides an image recognition-based positioning device and a locating method thereof for the automatic working system, so that the charging station is positioned more accurately.
  • the image recognition-based positioning device of the present invention is used for positioning a charging station, including:
  • the identifier is fixedly disposed on the charging station
  • An image acquisition module is disposed on the walking device, and configured to collect image information of the identifier;
  • An image and position corresponding module, configured to set the correspondence between image information of the identifier and position information of the charging station relative to the walking device;
  • The position determining module is configured to compare the image information of the identifier collected by the image acquisition module with the correspondence, set in the image and position corresponding module, between identifier image information and position information of the charging station relative to the walking device, and to determine the position information of the charging station relative to the walking device.
  • the image and location corresponding module includes a distance corresponding module and an orientation corresponding module
  • the distance corresponding module is configured to set a correspondence between image information of the identifier and distance information of the charging station with respect to the walking device;
  • the orientation corresponding module is configured to set a correspondence between image information of the identifier and orientation information of the charging station with respect to the walking device.
  • the location determination module includes a distance determination module and an orientation determination module
  • The distance determining module is configured to compare the image information of the identifier collected by the image acquisition module with the correspondence, set in the distance corresponding module, between identifier image information and distance information of the charging station relative to the walking device, and to determine the distance information of the charging station relative to the walking device;
  • The orientation determining module is configured to compare the image information of the identifier collected by the image acquisition module with the correspondence, set in the orientation corresponding module, between identifier image information and orientation information of the charging station relative to the walking device, and to determine the orientation information of the charging station relative to the walking device.
  • the identifier is a cylinder, and an image recognition area and an image content area are disposed on an outer surface of the identifier;
  • the image recognition area is for identifying and defining the image content area
  • the image content area includes image information content having a correspondence relationship with position information of the charging station with respect to the walking device.
  • the image content area includes different characters or different color combinations that correspond to location information of the charging station relative to the walking device.
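The character-to-position correspondence can be as simple as a lookup table. The characters and bearings below are purely illustrative; the patent only requires that each character or color combination map to a position of the station relative to the device:

```python
# Hypothetical image-content area: each character painted around the
# cylindrical identifier maps to the bearing of the charging station
# relative to the walking device.  Table entries are illustrative only.
CONTENT_TO_BEARING = {
    "A": "front", "B": "right", "C": "rear", "D": "left",
}

def bearing_from_content(recognized_char):
    """Look up the station's bearing from the recognized marker character;
    returns None when the character is not in the marker alphabet."""
    return CONTENT_TO_BEARING.get(recognized_char)
```

Because the identifier is cylindrical, the character visible to the camera directly encodes which side of the station the device is on.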
  • a positioning method based on an image recognition and positioning device includes the following steps:
  • the walking device collects image information of a marker disposed on the charging station
  • The preset correspondence includes a preset correspondence between identifier image information and distance information of the charging station relative to the walking device, and a preset correspondence between identifier image information and orientation information of the charging station relative to the walking device.
  • When the collected identifier image information is compared with the preset correspondence, it is compared with the preset correspondence between identifier image information and distance information of the charging station relative to the walking device, and the distance information of the charging station relative to the walking device is determined;
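One way to realize such a preset correspondence is a calibration table of (marker height in pixels, distance) pairs with interpolation between entries. The table values and the use of marker height as the image feature are assumptions for illustration:

```python
def distance_from_calibration(marker_height_px, table):
    """Estimate the station's distance by comparing the captured marker
    height against a preset correspondence table of (height_px, distance)
    pairs, interpolating linearly between the two nearest entries.  The
    table itself would come from a one-off calibration."""
    table = sorted(table)                  # ascending pixel height
    if marker_height_px <= table[0][0]:
        return table[0][1]
    if marker_height_px >= table[-1][0]:
        return table[-1][1]
    for (h0, d0), (h1, d1) in zip(table, table[1:]):
        if h0 <= marker_height_px <= h1:
            t = (marker_height_px - h0) / (h1 - h0)
            return d0 + t * (d1 - d0)
```

The orientation correspondence would use an analogous table keyed on the visible marker content rather than its size.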
  • The image recognition-based positioning device collects, through the image acquisition module, the image information of the identifier set on the charging station, and compares it, through the position determining module, with the correspondence between identifier image information and the position of the charging station relative to the walking device preset in the image and position corresponding module, thereby determining the position information of the charging station relative to the walking device.
  • the identifier in the invention is three-dimensional, so that the image acquisition module can collect image information of the marker from any direction over 360 degrees, allowing more accurate positioning.
  • the positioning method based on the image recognition positioning device of the invention can locate the charging station relative to the walking device from multiple directions, such as front, back, left, and right, with high accuracy.
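The preset-correspondence lookup described in this positioning method can be sketched as a simple table. The marker characters and the distance and orientation values below are hypothetical illustrations, not values from the specification:

```python
# Sketch of the preset correspondence between marker image content and
# the position of the charging station relative to the walking device.
# All keys and values here are hypothetical examples.
PRESET_CORRESPONDENCE = {
    # observed character in the image content area
    #   -> (distance in meters, orientation of the charging station)
    "A": (1.0, "front"),
    "B": (1.0, "left"),
    "C": (1.0, "rear"),
    "D": (1.0, "right"),
}

def locate_charging_station(observed_character):
    """Return (distance, orientation) for the collected marker image
    content, or None when no preset entry matches."""
    return PRESET_CORRESPONDENCE.get(observed_character)

print(locate_charging_station("B"))  # -> (1.0, 'left')
```

Different characters or color combinations in the image content area, as described above, would simply become additional keys in this table.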
  • Figure 1 is a schematic illustration of the automated working system of the present invention.
  • FIG. 2 is a schematic diagram of a stop sign, a first specific area or a second specific area, a direction mark, and a center line in an image acquired by the image capture device of FIG. 1.
  • Fig. 3 is a partially enlarged schematic view of Fig. 2;
  • Figure 4 is a block schematic diagram of the automated working system of the present invention.
  • Figure 5 is a schematic illustration of the operation of the first preferred embodiment of the automated working system of the present invention.
  • FIG. 6 is a block schematic diagram of a regression path control module in a first preferred embodiment of the automated working system of the present invention.
  • FIG. 7 is a block schematic diagram of a regression path control module in a second preferred embodiment of the automated working system of the present invention.
  • Figure 8 is a schematic view showing the operation of the second preferred embodiment of the automatic working system of the present invention.
  • FIG. 9 is a block diagram showing a regression path control module in a third preferred embodiment of the automatic working system of the present invention.
  • Figure 10 is a schematic view showing the operation of the third preferred embodiment of the automatic working system of the present invention.
  • Figure 11 is a flow chart showing the control method of the automatic walking device of the present invention.
  • FIG. 12 is a partial flow chart showing a first preferred embodiment of the method for controlling an automatic walking device according to the present invention.
  • Figure 13 is a partial flow chart showing a second preferred embodiment of the method for controlling an autonomous walking apparatus of the present invention.
  • Figure 14 is a partial flow diagram showing a third preferred embodiment of the automatic walking device control method of the present invention.
  • Figure 15 is a schematic illustration of another embodiment of the automated working system of the present invention.
  • Figure 16 is a perspective view of the docking station of the automatic working system shown in Figure 15.
  • Figure 17 is a front elevational view of the docking station shown in Figure 16.
  • Figure 18 is a front elevational view of the first design of the target.
  • Figure 19 is a front elevational view of a second design of the target.
  • Figure 20 is a front view of a third design of the target.
  • Figure 21 is a block diagram of the autonomous walking apparatus of the embodiment shown in Figure 15.
  • Figure 22 is a schematic view of the automatic walking device offsetting the left side of the target.
  • Fig. 23 is a view showing the imaging of the object in the imaging region in the case shown in Fig. 22.
  • Figure 24 is a schematic view of the automatic walking device when it is aimed at the target.
  • Fig. 25 is a view showing the imaging of the object in the imaging region in the case shown in Fig. 24.
  • Figure 26 is a schematic view of the automatic walking device offsetting the right side of the target.
  • Figure 27 is a schematic illustration of the imaging of the target in the imaging region in the case shown in Figure 26.
  • Figure 28 is a schematic illustration of the advancement direction of the autonomous walking apparatus not being the return direction.
  • Figure 29 is a schematic illustration of the imaging of the target in the imaging region in the case shown in Figure 28.
  • Figure 30 is a schematic illustration of the advancing direction of the autonomous walking apparatus coinciding with the return direction.
  • Figure 31 is a schematic illustration of the imaging of the target in the imaging region in the case shown in Figure 30.
  • Fig. 32 is another schematic view showing that the advancing direction of the autonomous traveling apparatus is not the returning direction.
  • Figure 33 is a view showing the imaging of the object in the imaging region in the case shown in Figure 32.
  • Figure 34 is a flow chart of the automatic walking device returning to the stop station of the embodiment shown in Figure 15.
  • Figure 35 is a block diagram of another embodiment of an automatic walking apparatus.
  • Figure 36 is a flow chart of another embodiment of an automatic walking device returning to a docking station.
  • FIG. 37 is a schematic block diagram of an image recognition based positioning device according to another embodiment of the automatic working system of the present invention.
  • Figure 38-1 is a schematic structural diagram of an embodiment of the identifier of the image recognition based positioning device shown in FIG. 37;
  • Figure 38-2 is a schematic structural diagram of still another embodiment of the identifier of the image recognition based positioning device shown in FIG. 37;
  • Figure 39-1 is a schematic diagram of the image captured by the image capture device when the automatic walking device is located directly in front of the marker shown in Figure 38-1;
  • Figure 39-2 is a schematic diagram of an image captured by the image capture device when the automatic walking device is located at the left front of the marker shown in Figure 38-1;
  • Figure 39-3 is a schematic diagram of an image collected by the image acquisition device when the automatic walking device is located to the left of the marker shown in Figure 38-1;
  • Figure 39-4 is a schematic diagram of an image captured by the image acquisition device when the automatic walking device is located at the left rear of the marker shown in Figure 38-1;
  • Figure 39-5 is a schematic diagram of an image captured by the image capture device when the automatic walking device is located directly behind the marker shown in Figure 38-1;
  • Figure 39-6 is a schematic diagram of an image collected by the image acquisition device when the automatic walking device is located at the right rear of the marker shown in FIG. 38-1;
  • Figure 39-7 is a schematic diagram of an image acquired by the image acquisition device when the automatic walking device is located to the right of the identifier shown in FIG. 38-1;
  • Figure 39-8 is a schematic diagram of an image captured by the image acquisition device when the automatic walking device is located at the right front of the marker shown in FIG. 38-1;
  • Figure 40 is a flow chart showing an embodiment of the positioning method based on the image recognition and positioning device shown in Figure 37;
  • Figure 41 is a flow chart showing still another embodiment of the positioning method based on the image recognition positioning device shown in FIG. 37;
  • Figure 42 is a flow chart showing still another embodiment of the positioning method based on the image recognition positioning device shown in Figure 37.
  • Reference numerals: stop station identifier; 132, first regression control module; 133, distance determination module; image recognition unit; 15, regression direction confirmation unit; 16, distance calculation unit; image acquisition module; 1111, image recognition area; 1112, image content area; 1120, image acquisition module.
  • an embodiment of the present invention provides an automatic working system and an automatic walking device control method.
  • the automated working system includes an autonomous walking device 100, such as an automatic lawn mower or an automatic vacuum cleaner, and a docking station 200.
  • the automatic walking device 100 walks within the work area 400 defined by the predetermined boundary 300, returns to the docking station 200 to replenish energy when its power is low, and returns to the docking station 200 when work is completed or when it rains.
  • the front portion of the automatic walking device 100 has at least two docking terminals (not shown), and the docking station 200 has at least two charging terminals 202.
  • the autonomous traveling device 100 is docked with the docking station 200, the docking terminal and the corresponding charging terminal 202 connection.
  • the docking station 200 is located on the boundary 300 of the work area 400, and the charging terminal 202 is disposed in a particular direction, such as toward the boundary 300 on the left or right side of the docking station 200.
  • the docking station 200 is provided with a direction indicator 220 indicating the docking direction when the automatic walking device 100 docks with the docking station 200, a docking station identifier 210, and a positioning identifier located in the docking direction at the front end of the direction indicator 220 (not shown).
  • the docking station identifier 210 can be located at any location on the docking station 200; the positioning identifier has a particular shape and/or pattern; for example, the positioning identifier may be the pointed end of the direction indicator 220.
  • the landing station identifier 210 is located at the front end of the direction identifier 220 in the docking direction, and the docking station identifier 210 serves as the positioning identifier.
  • the docking station sign 210 is disposed vertically at the top of the docking station 200.
  • the docking station identifier 210 is substantially cylindrical in shape, so that the area of the image formed by the docking station identifier 210 is the same regardless of the horizontal direction from which it is viewed.
  • the docking station identifier 210 is shaped to have a cylindrical top portion 212, a middle portion 214, and a bottom portion 216 from top to bottom; the top portion 212 has the same diameter as the bottom portion 216, and the middle portion 214 has a smaller diameter than the top portion 212 and the bottom portion 216.
  • the docking station logo 210 has a particular pattern, such as: the top 212 has the same first color as the outer perimeter of the bottom 216, and the middle portion 214 has a second color that is significantly different from the first color.
  • a regular first stripe may be provided on the outer circumference of the top portion 212 and the bottom portion 216, and a regular second stripe may be provided in the middle portion 214 or no stripe may be provided.
  • the docking station 200 has a plate 230 for the autonomous walking device to dock, and the plate 230 is laid flat on the ground.
  • the direction mark 220 is located on the upper surface of the flat plate 230, and the direction mark 220 has a straight line pattern, a rectangular pattern, or at least one arrow pattern parallel to the charging terminal 202.
  • the direction indicator 220 is a plurality of end-to-end arrow patterns. Except for the last one arrow pattern, the other arrow patterns only have a diagonal line portion indicating the direction, and no extended straight line portion. All of the arrow patterns point to the charging terminal 202.
  • the automatic walking device 100 includes a housing 110, a plurality of wheels 120 at the bottom of the housing 110, a power system (not shown) inside the housing 110 for driving the wheels 120, a processor 130 located inside the housing 110, and an image capture device 140 located on the housing 110.
  • the power system includes a battery pack, a transmission mechanism, and the like.
  • the image acquisition device 140 is configured to acquire an image of the docking station 200, and the processor 130 is configured to process and analyze the image acquired by the image acquisition device 140 and to control the walking of the automatic walking device 100.
  • the image capture device 140 is a camera.
  • the processor 130 of the automatic working system of the present invention has the following working modules: a docking station identification module 131, a first regression control module 132, a distance determining module 133, a direction identifier recognition module 134, a docking direction determining module 135, a second regression control module 136, and a regression path control module 137.
  • the docking station identification module 131 is configured to identify the docking station identifier 210 from the image of the docking station 200 acquired by the image capturing device 140.
  • the docking station identification module 131 can identify whether an object is the docking station identifier 210 based on the shape, the pattern, or both the shape and the pattern of the object in the image.
  • the processor 130 stores a first preset pattern corresponding to the pattern of the docking station identifier 210, and further stores a first preset shape corresponding to the shape of the docking station identifier 210. The docking station identifier recognition module 131 compares the shape of the object in the image with the first preset shape to determine whether they match, and compares the pattern of the object with the first preset pattern to determine whether they match.
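A minimal sketch of this two-stage check, assuming the object in the image has already been reduced to a simple shape descriptor (here an aspect ratio) and a pattern descriptor (here a top-to-bottom color sequence); both descriptors and the preset values are hypothetical:

```python
# Hypothetical presets standing in for the stored first preset shape
# and first preset pattern of the docking station identifier 210.
FIRST_PRESET_SHAPE = {"aspect_ratio": 0.5}              # width / height
FIRST_PRESET_PATTERN = ("color1", "color2", "color1")   # top, middle, bottom

def is_docking_station_identifier(shape, pattern, tolerance=0.1):
    """Match both the shape and the pattern of a candidate object
    against the presets, as module 131 is described to do."""
    shape_ok = abs(shape["aspect_ratio"]
                   - FIRST_PRESET_SHAPE["aspect_ratio"]) <= tolerance
    pattern_ok = tuple(pattern) == FIRST_PRESET_PATTERN
    return shape_ok and pattern_ok

print(is_docking_station_identifier(
    {"aspect_ratio": 0.52}, ["color1", "color2", "color1"]))  # True
```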
  • the first regression control module 132 is configured to control the automatic walking device 100 to advance toward the docking station identifier 210, thereby guiding the automatic walking device 100 from a position farther from the docking station 200 to a closer position, so as to facilitate the subsequent identification of the direction indicator 220 and control of the walking path of the automatic walking device 100.
  • the image 142 acquired by the image acquisition device 140 has a first specific area 144.
  • the first regression control module 132, when controlling the automatic walking device 100 to advance toward the docking station identifier 210, adjusts the traveling direction of the automatic walking device 100 so that the image corresponding to the docking station identifier 210 always remains in the first specific area 144 of the image 142. In this way, the automatic walking device 100 is prevented from deviating, and the return efficiency is improved.
  • the distance judging module 133 is configured to determine whether the distance between the automatic walking device 100 and the docking station 200 is less than or equal to the first distance value L1. The distance judging module 133 judges whether the distance between the automatic walking device 100 and the docking station 200 is the first distance value L1 based on the side length or the area value of the docking station identifier 210 in the image 142.
  • the processor 130 stores a predetermined length value; the processor 130 calculates the side length of at least one side of the docking station identifier 210 in the image 142 and compares the calculated side length with the predetermined length value; when the calculated side length reaches the predetermined length value, the distance between the automatic walking device 100 and the docking station 200 is judged to be the first distance value L1.
  • the processor 130 may also store a predetermined area value, and the processor 130 calculates an area value of at least a portion of the stop station identifier 210 in the image 142, and compares the calculated area value with a predetermined area value, when the calculated area value reaches a predetermined area value. At this time, it is judged that the distance between the autonomous traveling apparatus 100 and the docking station 200 is the first distance value L1.
  • the processor 130 connects the four end points of the docking station identifier 210 graphic with connections 218 to form a rectangle. Since, at a given distance, the docking station identifier 210 presents the same projection regardless of the viewing direction in a horizontal plane, the length and width of the rectangle do not change with direction, so the area of the rectangle can be used as the area value of the docking station identifier 210.
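The area of the rectangle formed by connecting the four end points can be computed with the shoelace formula; the pixel coordinates and the predetermined area threshold below are hypothetical:

```python
def polygon_area(points):
    """Shoelace formula: area of a polygon given as ordered (x, y)
    vertices."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x1, y1 = points[i]
        x2, y2 = points[(i + 1) % n]
        area += x1 * y2 - x2 * y1
    return abs(area) / 2.0

PREDETERMINED_AREA = 2000.0  # pixels^2, hypothetical threshold for L1

# Hypothetical pixel coordinates of the four connected end points 218.
corners = [(100, 50), (150, 50), (150, 90), (100, 90)]
area = polygon_area(corners)
print(area, area >= PREDETERMINED_AREA)  # 2000.0 True
```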
  • the image 142 acquired by the image capture device 140 has a second specific region that matches the shape of the docking station identifier 210. When the docking station identifier 210 in the image 142 substantially overlaps the second specific region, the distance determination module 133 determines that the distance between the automatic walking device 100 and the docking station 200 is the first distance value L1.
  • the processor 130 connects the four endpoints of the stop station identifier 210 graphic with a connection 218 to form a rectangle, and the distance determination module 133 determines whether the rectangle overlaps with the second specific area.
  • the second specific area may be the same as the first specific area 144.
  • the direction identifier recognition module 134 identifies the direction identifier 220 based on the image 142 of the docking station 200 acquired by the image capture device 140.
  • the direction identifier 220 has a specific pattern, and the processor 130 stores a corresponding second preset pattern.
  • the direction identifier recognition module 134 compares the pattern in the image 142 with the second preset pattern; if the image 142 contains a pattern matching the second preset pattern, that pattern is identified as the direction indicator 220.
  • the image acquired by the image capture device has a center line 146 that divides the image into two parts, and the docking direction determination module 135 determines whether the center line 146 coincides or substantially coincides with the docking direction indicated by the direction indicator 220.
  • the second regression control module 136 controls the automatic walking device 100 to dock with the docking station in the docking direction indicated by the direction indicator 220.
  • the regression path control module 137 controls the walking path of the automatic walking device 100 such that the walking direction of the automatic walking device 100 and the docking direction indicated by the direction indicator 220 coincide or substantially coincide.
  • the autonomous walking device is then controlled by the second regression control module 136.
  • the processor 130 can also directly recognize the direction indicator 220 and then determine whether the walking direction of the automatic walking device 100 coincides with the docking direction indicated by the direction indicator 220. If they coincide, the automatic walking device 100 is controlled to dock with the docking station in the docking direction indicated by the direction indicator 220; if not, the walking path of the automatic walking device 100 is controlled so that its walking direction coincides with the docking direction indicated by the direction indicator 220.
  • the regression path control module 137 includes: a positioning identifier recognition module 137a, an angle calculation module 1371, a distance calculation module 1372, a first steering control module 1373, a distance control module 1374, and a second steering control module 1375.
  • the positioning identifier recognition module 137a identifies the positioning identifier and controls the automatic walking device to face the positioning identifier.
  • the location identifier identifying module 137a is the docking station identifier identifying module 131.
  • the angle calculation module 1371 calculates a first angle α formed between the direction indicator 220 and the center line 146, with the positioning identifier as the base point.
  • the distance calculation module 1372 is configured to calculate the second distance value L2 based on the first distance value L1 and the first angle α.
  • when the first angle α is a right angle, the angle at the point where the automatic walking device 100 is located is a specific acute angle θ1, such as 60 degrees; a right triangle is formed by the extension line of the center line 146, the direction indicator 220, and the first angle α, whose hypotenuse length is the second distance value L2, and the distance calculation module 1372 calculates the second distance value L2 based on the first distance value L1 and the acute angle θ1.
  • when the first angle α is an obtuse angle, an isosceles triangle is constructed with a side equal in length to the first distance value L1 on the side where the direction indicator 220 is located; the angle bisector of the first angle α is perpendicular to the side L2 opposite the first angle α, and the length of the opposite side L2 is calculated from the first angle α.
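The isosceles construction above reduces to elementary trigonometry: with two sides of length L1 enclosing the first angle α, the bisector of α meets the opposite side perpendicularly, giving L2 = 2·L1·sin(α/2), and each of the two identical base angles is (180° - α)/2. A sketch with illustrative numbers:

```python
import math

def second_distance(l1, alpha_deg):
    """Side opposite the first angle in an isosceles triangle whose two
    equal sides have length l1 and enclose alpha_deg degrees:
    L2 = 2 * L1 * sin(alpha / 2)."""
    return 2.0 * l1 * math.sin(math.radians(alpha_deg) / 2.0)

def base_angle(alpha_deg):
    """Each of the two identical base angles of that isosceles triangle."""
    return (180.0 - alpha_deg) / 2.0

# Illustrative values only: L1 = 2 m, first angle alpha = 120 degrees.
print(round(second_distance(2.0, 120.0), 3))  # 3.464
print(base_angle(120.0))                      # 30.0
```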
  • the first steering control module 1373 is configured to control the automatic walking device 100 to rotate a predetermined second angle.
  • since the automatic walking device 100 previously always faces the docking station identifier 210, the direction indicator 220 must be located entirely on one side of the center line 146, as shown on the left side of FIG. 5.
  • the first steering control module 1373 is used to control the automatic walking device 100 to rotate by the second angle toward the side indicated by the direction indicator 220 in the image 142, for example, to the left.
  • the second angle is 90 degrees.
  • the second angle is a specific acute angle θ1, such as 60 degrees.
  • when the first angle α is an obtuse angle, the other two identical base angles of the isosceles triangle are calculated, and that angle is taken as the second angle.
  • the distance control module 1374 is for controlling the automatic walking device 100 to travel the second distance value L2.
  • the second steering control module 1375 is used to control the steering of the automatic walking device and to face the direction indicator 220.
  • the second steering control module 1375 controls the automatic walking device 100 to rotate to the left or right until the direction indicated by the direction indicator 220 in the image 142 coincides or substantially coincides with the center line 146, that is, until the traveling direction of the automatic walking device 100 coincides with the docking direction indicated by the direction indicator 220.
  • the autonomous walking apparatus 100 travels under the control of the processor 130 along the path indicated by the single arrow mark in the figure.
  • the regression path control module 137 includes: a first steering control module 1373, a spacing control module 1376, a second steering control module 1375, and a direction determination module 137b.
  • the first steering control module 1373 is configured to control the automatic walking device 100 to rotate a predetermined second angle toward the side indicated by the direction indicator 220 in the image.
  • the second angle is 90 degrees.
  • the spacing control module 1376 is for controlling the automatic walking device 100 to travel a predetermined pitch s.
  • the second steering control module 1375 controls the automatic walking device 100 to rotate after the automatic walking device 100 travels a predetermined distance s, and is oriented toward the direction indicator 220.
  • the direction determining module 137b is configured to determine whether the traveling direction of the automatic traveling device 100 coincides with the docking direction indicated by the direction indicator 220.
  • the second steering control module 1375 controls the automatic walking device 100 to rotate left or right until the direction indicator 220 is located in the middle of the image.
  • the regression path control module 137 may further include a first regression control module 132, a distance determination module 133, and a positioning identifier recognition module 137a.
  • the positioning identification module 137a identifies the positioning identifier and controls the automatic walking device to face the positioning identifier, and then the first regression control module 132 further controls the automatic walking device 100 to face the positioning identifier.
  • the distance determination module 133 determines whether the distance between the automatic traveling device 100 and the docking station 200 is the first distance value L1, and then the second steering control module 1375 controls the automatic walking device 100 to rotate and is oriented toward the direction indicator 220.
  • the location identifier identifying module 137a is the docking station identifier identifying module 131.
  • the automatic walking device 100, under the control of the processor 130, travels along the path indicated by the single-arrow marks in the figure.
  • the regression path control module 137 includes: a positioning identifier identifying module 137a, an angle calculating module 1371, a side length calculating module 1377, and a radius calculating module 1378.
  • the positioning identifier recognition module 137a identifies the positioning identifier and controls the automatic walking device to face the positioning identifier.
  • the location identifier identifying module 137a is the docking station identifier identifying module 131.
  • the angle calculation module 1371 calculates a first angle α formed between the direction indicator 220 and the center line 146, with the positioning identifier as the base point.
  • the side length calculation module 1377 calculates the lengths of the other right-angle side L2 and the hypotenuse Lx of the right triangle according to the first angle α and the first distance value L1.
  • the radius calculation module 1378 constructs an imaginary inscribed circle that is tangent to the hypotenuse Lx at a point D, located on the hypotenuse Lx at a predetermined third distance L3 from the docking station identifier 210, and that is also tangent to the right-angle side L2.
  • the radius calculation module 1378 further calculates the radius R of the inscribed circle (centered at point O).
  • the second angle β at vertex C is calculated in the right triangle ABC; the intersection point O of the angle bisector CO of the second angle β and the line DO perpendicular to the hypotenuse Lx at point D is the center of the inscribed circle.
  • the length of the CD segment is calculated from the hypotenuse Lx and the third distance L3; then the length of the OD segment in triangle CDO is calculated, and the length of the OD segment is the radius R of the inscribed circle.
  • the first steering control module 1373 is used to control the automatic walking device 100 to rotate 90 degrees toward the side indicated by the direction indicator in the image. At this time, the state of the automatic walking device 100 is as shown in FIG.
  • the tangent point control module 1379 is configured to control the automatic walking device 100 to advance a fourth distance L4 to reach the tangent point E of the inscribed circle and the right-angle side L2. Since the length of the CD segment is equal to the length of the CE segment, the fourth distance L4 is the length of the AE segment, that is, the difference between the length of the right-angle side L2 and the CE segment.
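Taking the description of modules 1377 to 1379 together, and assuming the right angle sits at the device's turn point A with the docking station identifier at the far end of the hypotenuse, the quantities follow from right-triangle trigonometry: Lx = sqrt(L1² + L2²), tan β = L1/L2, CD = Lx - L3, R = CD·tan(β/2), and L4 = L2 - CD. A sketch under those assumptions, with illustrative numbers:

```python
import math

def return_path_geometry(l1, l2, l3):
    """Arc-approach geometry sketched from the description: legs l1 and
    l2 with the right angle at the turn point A, the identifier at the
    far end of the hypotenuse, and tangent point D on the hypotenuse at
    distance l3 from the identifier."""
    lx = math.hypot(l1, l2)              # hypotenuse Lx
    beta = math.atan2(l1, l2)            # second angle beta at vertex C
    cd = lx - l3                         # CD = CE (equal tangent lengths)
    radius = cd * math.tan(beta / 2.0)   # R = OD in triangle CDO
    l4 = l2 - cd                         # AE: straight run before the arc
    return lx, math.degrees(beta), radius, l4

# Illustrative values only: L1 = 3 m, L2 = 4 m, L3 = 2 m.
lx, beta_deg, radius, l4 = return_path_geometry(3.0, 4.0, 2.0)
print(round(lx, 3), round(beta_deg, 2), round(radius, 3), round(l4, 3))
# 5.0 36.87 1.0 1.0
```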
  • the arc path control module 1370 is configured to calculate the rotation speed ratio of the left and right wheels according to the radius R of the inscribed circle and the wheel spacing 2d of the left and right wheels, and controls the automatic walking device 100 so that the left and right wheels maintain this specific rotation speed ratio, whereby the automatic walking device 100 travels along a predetermined arc path until its walking direction coincides with the docking direction indicated by the direction indicator 220.
  • the left wheel of the automatic walking device 100 travels on an outer circle of radius R+d about the center O;
  • the right wheel travels on an inner circle of radius R−d about the center O;
  • the rotational speed ratio of the left and right wheels is equal to the ratio of the outer radius R+d to the inner radius R−d.
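The differential-drive relationship described here can be sketched directly: for an arc of radius R about center O and half-track d, the outer (left) wheel follows radius R+d and the inner (right) wheel R-d, so their speed ratio is (R+d)/(R-d). The commanded base speed below is illustrative:

```python
def wheel_speeds(radius, half_track, inner_speed):
    """Left/right wheel speeds for an arc of the given radius about the
    center O, with the wheels spaced 2 * half_track apart. inner_speed
    is an illustrative commanded speed for the inner wheel."""
    if radius <= half_track:
        raise ValueError("arc radius must exceed half the wheel track")
    ratio = (radius + half_track) / (radius - half_track)
    return inner_speed * ratio, inner_speed, ratio

# Illustrative values only: R = 1.0 m, wheel spacing 2d = 0.4 m.
outer, inner, ratio = wheel_speeds(radius=1.0, half_track=0.2, inner_speed=0.4)
print(round(ratio, 2), round(outer, 2), round(inner, 2))  # 1.5 0.6 0.4
```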
  • when the first angle α is a right angle, the angle at the point where the automatic walking device 100 is located is a specific acute angle θ1, such as 60 degrees, and a right triangle ABC is formed by the center line 146, the extension line of the direction indicator 220, and the acute angle θ1.
  • the length of each side of the right triangle ABC, the radius of the inscribed circle, and the like are then calculated; the specific method is similar to the case where the first angle α is an acute angle.
  • when the first angle α is an obtuse angle, an isosceles triangle is constructed with a side equal in length to the first distance value L1 on the side where the direction indicator 220 is located, and the angle bisector of the first angle α is perpendicular to the side L2 opposite the first angle α, thereby forming two right triangles. Then the side lengths of the two right triangles, the radius of the inscribed circle tangent to the two sides of the triangle on which the direction indicator 220 lies, and the like are calculated; the specific method is similar to the case where the first angle α is an acute angle.
  • the automatic walking device control method provided by the embodiment of the present invention includes the following steps:
  • Step S101 When the automatic walking device 100 starts the regression, the processor 130 identifies the stop station identifier 210 according to the image of the docking station 200 acquired by the image capturing device 140.
  • the processor 130 may identify the docking station identifier 210 based on the shape, the pattern, or both the shape and the pattern of the object in the image.
  • Step S102 The processor 130 determines whether the distance between the automatic walking device 100 and the stopping station 200 is less than or equal to the first distance value L1. If yes, the process proceeds to step S1, otherwise, the process proceeds to step S103.
  • Step S103 The processor 130 controls the automatic walking device 100 to advance toward the stop station identifier 210.
  • the image 142 acquired by the image capture device 140 has a first specific area 144.
  • when the processor 130 controls the automatic walking device 100 to advance toward the docking station identifier 210, the processor 130 continuously adjusts the walking direction of the automatic walking device 100 so that the docking station identifier 210 always remains in the first specific area 144 of the image 142. In this way, the automatic walking device 100 is prevented from deviating, and the return efficiency is improved.
  • the processor 130 determines, according to the area value of the docking station identifier 210 in the image 142, whether the distance between the automatic walking device 100 and the docking station 200 is the first distance value L1. This guides the automatic walking device 100 from a position farther from the docking station 200 to a position closer to it.
  • the processor 130 stores a predetermined area value; the processor 130 calculates the area value of the image corresponding to the docking station identifier 210 and compares the calculated area value with the predetermined area value; when the area value of the image corresponding to the docking station identifier 210 reaches the predetermined area value, it is determined that the distance between the automatic walking device 100 and the docking station 200 is the first distance value L1.
  • the image 142 acquired by the image capture device 140 has a second specific region that matches the shape of the docking station identifier 210; when the docking station identifier 210 in the image 142 substantially overlaps the second specific region, the processor 130 determines that the distance between the automatic walking device 100 and the docking station 200 is the first distance value L1.
  • the second specific area may be the same as the first specific area.
  • Step S1 The processor 130 recognizes the direction identifier 220 according to the image of the docking station 200 acquired by the image capturing device 140.
  • the direction identifier 220 has a specific pattern, and the processor 130 stores a corresponding second preset pattern, and compares the pattern in the image with the second preset pattern. If the pattern in the image matches the second preset pattern, the The pattern is the direction indicator 220.
  • Step S2 The processor 130 determines whether the walking direction of the automatic traveling device 100 coincides with the docking direction indicated by the direction indicator 220.
  • the image 142 acquired by the image acquisition device 140 has a center line 146 that divides the image 142 into two parts, and the processor 130 compares the positional relationship between the center line 146 and the direction indicator 220 in the image to determine whether the walking direction coincides with the docking direction.
  • Step S3 When the processor 130 determines that the center line 146 does not coincide with the direction indicator 220, it controls the walking path of the automatic walking device 100 such that the walking direction of the automatic walking device 100 coincides with the docking direction indicated by the direction indicator.
  • Step S4 The processor 130 controls the automatic walking device 100 to dock with the docking station 200 in the docking direction indicated by the direction identifier.
  • steps S101, S102, and S103 can also be omitted; that is, the processor 130 can directly identify the direction identifier 220 and determine whether the walking direction of the automatic walking device 100 coincides with the docking direction indicated by the direction identifier 220. If they coincide, the automatic walking device 100 is controlled to dock with the docking station in the docking direction indicated by the direction identifier 220; if not, the walking path of the automatic walking device 100 is controlled so that its walking direction coincides with the docking direction indicated by the direction identifier 220.
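Steps S2 and S3 above reduce to comparing the direction identifier's horizontal position with the image center line 146. A minimal sketch, assuming pixel coordinates and an illustrative tolerance:

```python
# Sketch of Steps S2-S3 (pixel values and tolerance are assumptions).

def walking_direction_coincides(indicator_x, image_width, tolerance_px=2):
    """Step S2: the walking direction coincides with the docking
    direction when the direction identifier lies on (or within a
    tolerance of) the image center line."""
    center_line = image_width / 2
    return abs(indicator_x - center_line) <= tolerance_px

def steering_correction(indicator_x, image_width, tolerance_px=2):
    """Step S3: if they do not coincide, report which way to adjust
    the walking path so the identifier moves toward the center line."""
    if walking_direction_coincides(indicator_x, image_width, tolerance_px):
        return "dock"  # proceed to Step S4
    center_line = image_width / 2
    return "steer_right" if indicator_x > center_line else "steer_left"
```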
  • step S3 includes:
  • Step S311 The processor controls the automatic walking device to rotate a predetermined angle;
  • Step S312 The processor controls the automatic walking device to travel a predetermined interval s;
  • Step S313 The processor controls the automatic walking device to turn and face the docking station identifier;
  • Step S314 The processor determines whether the walking direction of the automatic walking device coincides with the docking direction indicated by the direction identifier, if yes, the process proceeds to step S4, otherwise returns to step S311.
  • step S3 includes:
  • Step S321 The processor identifies the positioning identifier and controls the automatic walking device to face the positioning identifier.
  • the docking station identifier 210 can also be located in the docking direction and serve as the positioning identifier.
  • Step S322 The image acquired by the image acquisition device has a center line dividing the image into two parts, and the processor calculates a first angle α formed between the direction identifier and the center line;
  • Step S323 The processor calculates a second distance value L2 according to the first distance value L1 and the first angle α;
  • Step S324 The processor controls the automatic walking device to rotate a predetermined angle
  • Step S325 the processor controls the automatic walking device to travel a second distance value L2;
  • Step S326 The processor controls the automatic walking device to turn toward the direction identifier.
  • alternatively, step S3 includes:
  • Step S331 The processor identifies the positioning identifier and controls the automatic walking device to face the positioning identifier.
  • the docking station identifier 210 can also be located in the docking direction and serve as the positioning identifier.
  • Step S332 The image acquired by the image acquisition device has a center line dividing the image into two parts, and the processor calculates a first angle α formed between the direction identifier and the center line;
  • Step S333 The processor constructs a specific triangle according to the point where the automatic walking device is located, the positioning identifier, and the first angle α, and calculates, according to the first angle α and the first distance value L1, the length of the other adjacent side Lx of the first angle α of the triangle and the length of the opposite side L2;
  • Step S334 The processor calculates a radius R of an inscribed circle tangent to the adjacent side Lx at a specific position on the adjacent side Lx and tangent to the opposite side L2;
  • Step S335 The processor controls the automatic walking device to rotate a predetermined angle toward the direction indicated by the direction identifier;
  • Step S336 the processor controls the automatic walking device to advance to the tangent point of the inscribed circle and the opposite side L2;
  • Step S337 The processor calculates a rotation speed ratio of the wheels on the left and right sides according to the radius R of the inscribed circle and the wheel spacing 2d of the left and right wheels; the processor controls the automatic walking device so that the left and right wheels run at this specific rotation speed ratio, causing the automatic walking device to travel along a predetermined arcuate path until the walking direction of the automatic walking device coincides with the docking direction indicated by the direction identifier.
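The rotation speed ratio in step S337 follows from standard differential-drive kinematics: on an arc of radius R (measured midway between wheels spaced 2d apart), the outer wheel runs on radius R + d and the inner wheel on R - d. A sketch under that assumption:

```python
# Differential-drive speed ratio for Step S337.
# Assumption: R is the arc radius at the midpoint between the wheels,
# consistent with the stated wheel spacing 2d.

def wheel_speed_ratio(R, d):
    """Return the outer/inner wheel speed ratio needed to follow an
    arc of radius R with wheels spaced 2d apart: (R + d) / (R - d)."""
    if R <= d:
        raise ValueError("arc radius must exceed half the wheel spacing")
    return (R + d) / (R - d)
```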
  • the automatic walking device control method and the automatic working system of the present embodiment have the beneficial effects of improving the return efficiency of the automatic traveling device 100 and realizing the effective docking of the automatic traveling device 100 and the docking station 200.
  • FIG. 15 is a schematic illustration of another embodiment of the automated working system of the present invention.
  • the front portion of the automatic walking device 100 has docking terminals (not shown) for performing energy transmission, and the number of docking terminals corresponds to the number of charging terminals 202 with which the docking station 200 supplies energy; in this embodiment, the number of docking terminals is at least two, and the docking station 200 has at least two charging terminals 202.
  • when the automatic walking device 100 is docked with the docking station 200, the docking terminals are connected to the corresponding charging terminals 202.
  • the docking station 200 is located on the boundary 300 of the working area 400, and the charging terminal 202 is disposed in a specific direction, such as toward the working area 400.
  • FIG. 16 is a perspective view of the docking station 200 in the present embodiment
  • FIG. 17 is a front view of the docking station 200 in the present embodiment
  • the docking station 200 is provided with the target object 21.
  • the target 21 is vertically disposed at one side of the docking station 200 and the charging terminal 202 is vertically disposed on the target 21 such that the charging terminal 202 is perpendicular to the plane of the target 21.
  • a standard direction is defined, which is the docking direction when the automatic walking device 100 docks with the docking station 200. From the position of the target object 21, it can be seen that the standard direction is the perpendicular bisector of the surface on which the target object 21 is located.
  • the surface of the object 21 may be designed as a plane, or as a convex or concave surface with a certain curvature, so that the surface of the object 21 has slight arches or dimples.
  • the shape of the object 21 is a polygonal pattern, and thus, when viewed from different directions in a certain horizontal plane, the observed real-time image of the object 21 may be differently deformed.
  • the automatic walking device 100 can determine its orientation relative to the target 21 according to the observed deformation of the real-time image of the target object 21, that is, whether the automatic walking device 100 is offset from the standard direction and the orientation of the offset.
  • it is preferable that the shape of the object 21 is a rectangle.
  • the shape of the object 21 is not limited to a rectangle, and not even to a polygon; it only needs to deform when viewed from different directions, so that the orientation of the automatic walking device 100 relative to the target 21 can be judged from the deformation. For example, the shape of the object 21 can also be set to a circular shape.
  • the target 21 has a specific color; for example, the target 21 is divided into upper and lower portions, the first portion 211 has a first color such as blue, and the second portion 212 has a second color different from the first color, such as red.
  • the target 21 can also be divided into two parts, left and right, or two parts of the inner and outer rings, and different parts have different colors.
  • the target 21 can also be set only as an area having two or more different colors.
  • the object 21 has a specific color in order to improve the matching rate when the image recognition unit 14 recognizes the image of the object 21; this is a preferred embodiment, and the object 21 may instead have only one color.
  • the object 21 is disposed as two upper and lower portions, and the first portion 211 has a first color and the second portion 212 has a second color different from the first color.
  • the docking station 200 has a flat panel 230 for the automatic walking equipment to be docked, and the flat panel 230 is laid flat on the ground or on the grass.
  • when the automatic walking device 100 is positioned entirely on the flat plate 230, uneven ground or grass is prevented from skewing the automatic walking device 100 such that the docking terminal cannot dock with the charging terminal 202.
  • the automatic walking device 100 includes a housing 110, a plurality of wheels 120 at the bottom of the housing 110, a power system (not shown) inside the housing 110 for driving the wheels 120, a control module 13 located inside the housing 110, and An image capture device 140 is located on the housing 110.
  • the power system includes a battery pack or gasoline, a transmission mechanism, and the like.
  • the image acquisition device 140 is configured to acquire an image of the object 21, and the control module 13 is configured to process and analyze the image acquired by the image acquisition device 140 and control the walking and working of the automatic walking device 100.
  • the image capture device 140 is a camera.
  • the control module 13 of the automatic walking device 100 of the present invention includes the following working units: an image recognition unit 14, an offset determination unit 17, and a control unit 18.
  • the control unit 18 is for controlling whether the automatic walking device 100 is activated and the selection of the operating mode, and controlling the speed of the wheel 120, steering, and the like.
  • the image recognition unit 14 is configured to determine whether the object 21 appears in the real-time image acquired by the image acquisition device 140. The image recognition unit 14 recognizes whether it is the target 21 based on the shape, pattern, color of the object in the real-time image, or a combination of its shape and pattern, or shape and color.
  • the process by which the image recognition unit 14 determines whether the object 21 is present in the real-time image acquired by the image collection device 140 is specifically as follows: the control module 13 stores a pixel value corresponding to the color of the object 21; as described above, if the object 21 has the first portion 211 and the second portion 212, the control module 13 stores a pixel value of the first color corresponding to the first portion 211 and a pixel value of the second color corresponding to the second portion 212.
  • the image recognition unit 14 scans the real-time image collected by the image collection device 140; if a partial image whose pixel values match the stored pixel values is found, the image recognition unit 14 judges that the partial image is a real-time image of the object 21.
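The scan-and-match recognition just described can be sketched as follows; the tiny label-matrix image and the single-character color encoding are illustrative assumptions, not the actual pixel-value format:

```python
# Sketch of the image recognition unit's scan (assumed encoding:
# each pixel is a color label, e.g. "B" blue, "R" red, "g" grass).

def find_target(image, first_color, second_color):
    """Scan a row-major image for a column where `first_color` appears
    directly above `second_color`, mimicking the two-part target
    described above. Returns (row, col) of the first match or None."""
    for r in range(len(image) - 1):
        for c in range(len(image[r])):
            if image[r][c] == first_color and image[r + 1][c] == second_color:
                return (r, c)
    return None

frame = [
    ["g", "g", "g", "g"],
    ["g", "B", "B", "g"],  # first portion 211: blue
    ["g", "R", "R", "g"],  # second portion 212: red
    ["g", "g", "g", "g"],
]
```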
  • the offset judging unit 17 compares the real-time image 21' of the object 21 collected by the image capturing device 140 with the standard image 21" of the object 21, and judges the offset of the automatic walking device 100 with respect to the standard direction according to the shape change of the real-time image 21' relative to the standard image 21"; that is, it can judge whether the automatic walking device 100 is in the left azimuth, the right azimuth, or the aligned position of the object 21.
  • FIGS. 22 to 27 show the imaging situation of the target real-time image 21' in the imaging area A when the automatic walking device 100 is located at different orientations relative to the object 21.
  • when the automatic walking device 100 is aimed at the target 21 (i.e., the advancing direction of the automatic walking device 100 is the standard direction), as shown in FIG. 25, the shape of the real-time image 21' of the object 21 is not deformed relative to the shape of the target 21.
  • the side length ratio a/b of the specific side a of the outer contour of the target 21 to the specific side b of the other side is equal to the side length ratio a'/b' of the specific side a' of the outer contour of the real-time image 21' to the specific side b' of the other side.
  • the real-time image 21' of the object 21 in the imaging area A is the standard image 21" of the object 21, and the standard image 21" is not deformed in shape with respect to the object 21.
  • the standard image 21" of the target is an image of the object acquired when the autonomous walking apparatus 100 is in the standard direction.
  • when the automatic walking device 100 is located on the left side of the object 21, as shown in FIG. 23, the shape of the real-time image 21' of the object 21 in the imaging area changes with respect to the shape of the object 21.
  • the side length ratio a/b of the specific side a of the outer contour of the target 21 to the specific side b of the other side is larger than the side length ratio a'/b' of the specific side a' of the outer contour of the real-time image 21' to the specific side b' of the other side (i.e., a/b > a'/b'). This is mainly because, due to the visual difference when the image capturing device 140 collects the object from different positions, i.e., from different directions in a certain horizontal plane, the observed real-time image of the object 21 is deformed differently.
  • when the automatic walking device 100 is located on the right side of the object 21, as shown in FIG. 27, the shape of the real-time image 21' of the object 21 in the imaging area changes with respect to the shape of the object 21.
  • the side length ratio a/b of the specific side a of the outer contour of the target 21 to the specific side b of the other side is smaller than the side length ratio a'/b' of the specific side a' of the outer contour of the real-time image 21' to the specific side b' of the other side (i.e., a/b < a'/b'). This is mainly because, due to the visual difference when the image capturing device 140 collects the object from different positions, i.e., from different directions in a certain horizontal plane, the observed real-time image of the object 21 is deformed differently.
  • the offset determination unit 17 determines the offset of the automatic walking device 100 with respect to the standard direction based on the shape change of the real-time image 21' relative to the object 21; that is, it determines whether the automatic walking device 100 is in the left orientation, the right orientation, or the aligned position of the target 21.
  • a side length ratio a/b of the specific side a of the outer contour of the target standard image 21" to the specific side b of the other side is prestored. When making the judgment, the offset determining unit 17 compares the side length ratio a'/b' of the specific side a' of the outer contour of the real-time image 21' to the specific side b' of the other side with the prestored side length ratio a/b.
  • if the comparison result is that a/b is greater than a'/b', the offset judging unit 17 judges that the automatic walking device 100 is biased to the side of the specific side a'; if the comparison result is that a/b is smaller than a'/b', the offset judging unit 17 judges that the automatic walking device 100 is biased to the side of the specific side b'; if the comparison result is that a/b equals a'/b', the offset judging unit 17 judges that the automatic walking device 100 is in the standard direction.
  • the shape of the object 21 is a rectangle, so that the side length ratio a/b of the specific side a of the outer contour of the target standard image 21" to the specific side b of the other side is equal to 1. When judging, the offset judging unit 17 can therefore directly compare the side length ratio a'/b' of the specific side a' of the outer contour of the real-time image 21' to the specific side b' of the other side with 1.
  • in this embodiment, the shape of the object 21 is selected as a rectangle, but the shape of the object 21 is not limited to a rectangle.
  • if the object 21 is selected as another polygon (such as a parallelogram, a trapezoid, a hexagon, etc.), it is only necessary to set a specific side of one side of the outer contour and a specific side of the other side, and the above-described comparison method is still suitable for judging the offset of the automatic walking device 100 with respect to the standard direction.
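The ratio comparison described above can be sketched in a few lines; the return labels and the default standard ratio of 1 (the rectangular target) are illustrative:

```python
# Sketch of the offset judging unit's comparison: the prestored
# standard ratio a/b versus the observed real-time ratio a'/b'.
# Label names are assumptions for illustration.

def judge_offset(a_prime, b_prime, a=1.0, b=1.0):
    """Return 'side_a' when a/b > a'/b' (device biased toward side a'),
    'side_b' when a/b < a'/b', and 'standard' when they are equal."""
    standard = a / b
    observed = a_prime / b_prime
    if standard > observed:
        return "side_a"
    if standard < observed:
        return "side_b"
    return "standard"
```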
  • the control unit 18 in the control module 13 adjusts the traveling direction of the autonomous traveling apparatus 100 until it is aligned with the standard direction. Thereby, the automatic traveling apparatus 100 can advance in the standard direction and travel to the docking station 200 in which the object 21 is provided.
  • the control unit 18 calculates the deviation angle of the automatic walking device 100 from the standard direction according to the change between the side length ratio a'/b' of the specific side a' of the current real-time image outer contour to the specific side b' of the other side and the prestored side length ratio a/b of the specific side a of the outer contour of the target standard image 21" to the specific side b of the other side, and thereby controls the automatic walking device 100, through a specific left and right wheel speed differential, to travel along an arc corresponding to the deviation angle and directly enter the docking station.
  • the control module 13 of the automatic walking device 100 of the present invention may further include a return direction confirming unit 15.
  • the return direction confirming unit 15 is for adjusting the orientation of the automatic walking device 100 such that, in the course of advancing toward the object 21 (i.e., in the process of returning to the docking station 200), the object 21 does not leave the imaging area range of the image capture device 140. Therefore, it is ensured that the automatic walking device 100 does not lose the target while advancing or while judging whether it is offset from the standard direction.
  • the imaging area A of the image capturing device 140 is a symmetrical rectangle, and a central area ΔH is formed in a range on the left and right sides of the symmetry line; ΔH can occupy 1% to 40% of the length of the entire imaging area, the range being determined according to the size of the wide-angle range (imaging area) of the image capturing device. Generally, the larger the wide-angle range (imaging area), the larger the ratio of ΔH to the length of the entire imaging area.
  • as shown in FIG. 29, if the real-time image 21' of the object 21 is outside the range of the ΔH region, that is, the object 21 is not in or near the middle of the imaging area of the image capturing device 140, then as the automatic walking device 100 walks forward in the current direction, the target 21 will leave the imaging area of the image capturing device 140. The automatic walking device 100 then needs to search for the target 21 again, and the image recognition unit 14 needs to judge again whether the real-time image 21' of the object 21 appears in the imaging area, which greatly reduces the efficiency of the automatic walking device 100 returning to the docking station 200.
  • if the real-time image 21' of the object 21 is within the range of the ΔH region, that is, the object 21 is in or near the middle of the imaging area of the image capturing device 140, then as the automatic walking device 100 walks forward in the predetermined direction, the target 21 will always be within the imaging range of the image acquisition device.
  • the return direction confirming unit 15 determines whether the current traveling direction of the automatic walking device 100 is the return direction based on whether the real-time image 21' of the object 21 is within the central area ΔH of the imaging area A.
  • if the real-time image 21' is within the central area ΔH, the return direction confirming unit 15 determines that the current traveling direction of the automatic walking device 100 is the return direction; that is, as the automatic walking device 100 walks in the current traveling direction, the target object 21 always stays within the imaging range A of the image capturing device 140.
  • if the real-time image 21' is not within the central area ΔH, the return direction confirming unit 15 determines that the current traveling direction of the automatic walking device 100 is not the return direction; that is, if the automatic walking device 100 walks in the current traveling direction, the target 21 will deviate out of the imaging range A of the image capture device 140.
  • the control unit 18 adjusts the orientation of the automatic walking device 100 according to whether the real-time image 21' is on the left side or the right side of the central area ΔH.
  • when the real-time image 21' is on the right side of the central area ΔH, the control unit 18 controls the orientation of the automatic walking device to shift rightward until the real-time image 21' is in the central area ΔH; when the real-time image 21' is on the left side of the central area ΔH, the control unit 18 controls the orientation of the automatic walking device to shift leftward until the real-time image 21' is in the central area ΔH.
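The central-area ΔH check and the left/right correction can be sketched as follows; the pixel coordinates and the 20% band width (one value inside the 1%-40% range stated above) are illustrative assumptions:

```python
# Sketch of the return direction confirming unit: test whether the
# target's image lies within a central band DeltaH around the
# symmetry line, and otherwise report the steering correction.

def return_direction_adjustment(image_x, image_width, delta_h_fraction=0.2):
    """Return 'keep' when the target's horizontal position image_x is
    within the central area (a band of delta_h_fraction of the image
    width around the center line), else 'right' or 'left'."""
    center = image_width / 2
    half_band = image_width * delta_h_fraction / 2
    if abs(image_x - center) <= half_band:
        return "keep"  # current direction is the return direction
    return "right" if image_x > center else "left"
```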
  • the flow chart of the automatic walking device 100 of this embodiment of the present invention returning to the docking station 200, that is, of the entire docking process of the docking system of the automatic walking device, is as follows.
  • Step S11 The image acquisition device 140 acquires an image, and the image recognition unit 14 recognizes whether or not there is a real-time image 21' of the object 21; after recognizing the presence of the real-time image 21' in the image formation region A, the process proceeds to step S12.
  • Step S12 The regression direction confirming unit 15 determines whether or not the automatic traveling apparatus 100 is oriented in the return direction based on the position of the target real-time image 21' in the imaging area A. If the autonomous walking device 100 is not facing the returning direction, the control unit 18 adjusts the orientation of the autonomous walking device until it faces the returning direction. After the autonomous traveling apparatus 100 faces the returning direction, the process proceeds to step S13.
  • Step S13 The offset determination unit 17 determines the offset of the automatic walking device with respect to the standard direction according to the shape change between the target real-time image 21' and the target standard image 21", and according to the determination result adjusts the automatic walking device 100 to walk to the target 21 (i.e., return to the docking station 200).
  • the control module 13 of the autonomous vehicle 100 further includes a distance calculation unit 16.
  • the distance calculation unit 16 is for calculating the distance between the autonomous traveling apparatus 100 and the object 21, and the control unit 18 controls the automatic walking apparatus 100 to travel in the return direction within a predetermined distance range from the object 21.
  • the control module 13 stores a predetermined length value; the control module 13 calculates the side length of at least one specific side of the real-time image 21', and compares the calculated side length with the predetermined length value. When the calculated side length of the specific side reaches the predetermined length value, it is judged that the distance between the automatic walking device 100 and the docking station 200 is within the preset distance range.
  • the control module 13 can also store a predetermined area value; the control module 13 calculates the area value of at least part of the real-time image 21', and compares the calculated area value with the predetermined area value. When the calculated area value reaches the predetermined area value, it is determined that the distance between the automatic walking device 100 and the docking station 200 is within the preset distance range.
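The description only specifies threshold comparisons, but a distance calculating unit can also be realized with the standard pinhole-camera relation distance = f * H / h. This sketch is an assumption layered on the text, with illustrative numbers:

```python
# Sketch of a possible distance calculating unit 16 (an assumption:
# the patent text only specifies threshold comparison). Pinhole model:
# distance = focal_length_px * real_height_m / image_height_px.

def estimate_distance(real_height_m, image_height_px, focal_length_px):
    """Estimate the distance to the target from its known real height
    and its apparent height in pixels."""
    return focal_length_px * real_height_m / image_height_px

def within_preset_range(distance_m, preset_min=0.5, preset_max=1.5):
    """True when the device is within the preset distance range
    (the bounds here are illustrative)."""
    return preset_min <= distance_m <= preset_max
```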
  • FIG. 36 is a flow chart of the automatic walking device 100 returning to the docking station 200 in another embodiment of the present invention, that is, of the entire docking process of the docking system of the automatic walking device.
  • Step S21 The image acquisition device 140 acquires an image, and the image recognition unit 14 recognizes whether or not there is a real-time image 21' of the object 21; after recognizing the presence of the real-time image 21' in the image formation region A, the process proceeds to step S22.
  • Step S22 The regression direction confirming unit 15 determines whether the automatic traveling apparatus 100 is oriented in the return direction based on the position of the target real-time image 21' in the imaging area A. If the autonomous walking device 100 is not facing the returning direction, the control unit 18 adjusts the orientation of the autonomous walking device until it faces the returning direction. After the autonomous traveling apparatus 100 faces the returning direction, the process proceeds to step S23.
  • Step S23 The automatic walking device 100 advances in the adjusted return direction, and the distance calculating unit 16 calculates the distance between the automatic walking device 100 and the target object 21 in real time; when the automatic walking device 100 walks to within the preset distance range, the control unit 18 controls the automatic walking device 100 to stop advancing, and the process proceeds to step S24.
  • Step S24 The offset determination unit 17 determines the offset of the automatic walking device with respect to the standard direction according to the shape change between the target real-time image 21' and the target standard image 21", and according to the determination result adjusts the automatic walking device 100 to walk to the target 21 (i.e., return to the docking station 200).
  • FIG. 37 shows an image-recognition-based positioning apparatus 1100 according to another embodiment of the automatic working system of the present invention, for positioning a charging station, including: a marker 1110, an image acquisition module 1120, an image and position corresponding module 1130, and a position determining module 1140.
  • the identifier 1110 is stereoscopically fixed on the charging station.
  • the image acquisition module 1120 is disposed on the walking device for collecting image information of the identifier 1110.
  • the image and position corresponding module 1130 is configured to set a correspondence relationship between the image information of the identifier 1110 and the position information of the charging station with respect to the traveling device.
  • the location determining module 1140 is configured to perform image correspondence between the identifier 1110 collected by the image capturing module 1120 and the image information of the identifier 1110 set in the image and location corresponding module 1130 and the location information of the charging station relative to the traveling device. In comparison, the position information of the charging station relative to the traveling device is determined.
  • the location of the identifier 1110 on the charging station must be such that the image acquisition module 1120 can collect the image information of the identifier 1110 through 360 degrees, and the identifier 1110 should be as close as possible to the height of the image acquisition module 1120, to ensure the accuracy of the information of the identifier 1110 collected by the image acquisition module 1120.
  • the identifier 1110 is fixedly disposed above the charging column of the charging station.
  • the image acquisition module 1120 is disposed on the walking device and can be implemented by the camera device 1121, such as a camera.
  • the image and position corresponding module 1130 includes an orientation corresponding module 1132 and a distance corresponding module 1131.
  • the distance corresponding module 1131 is configured to set a correspondence between the image information of the identifier 1110 and the distance information of the charging station relative to the walking device.
  • the orientation corresponding module 1132 is configured to set a correspondence between the image information of the identifier 1110 and the orientation information of the charging station with respect to the walking device.
  • different image information of the identifier 1110 is collected according to different angles, and the distance and orientation information of the different image information and the charging station relative to the walking device are determined, thereby realizing accurate positioning of the charging device by the walking device.
  • FIG. 38-1 shows the image information of the identifier 1110 collected by the image acquisition module 1120 on the walking device.
  • the images of the marker 1110 collected from different orientations have different blue-to-yellow proportional relationships, which correspond to the orientation information of the charging station relative to the walking device.
  • the size of the blue and yellow regions in the image of the marker 1110 collected at different distances from the charging station corresponds to the distance information of the charging station relative to the walking device.
  • the correlation between the blue and yellow ratios in the image and the orientation information of the charging station relative to the walking device can be obtained through experiments.
  • the correspondence between the size and relative distance of various color modes can be obtained.
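The experimentally obtained correspondences can be sketched as a nearest-match lookup; the calibration table values and the single-character color labels are illustrative stand-ins for real experimental data:

```python
# Sketch of the orientation lookup from the blue/yellow proportional
# relationship. The calibration ratios below are illustrative, not
# experimental values from the patent.

def color_stats(image):
    """Count blue ('b') and yellow ('y') pixels in the marker image."""
    blue = sum(row.count("b") for row in image)
    yellow = sum(row.count("y") for row in image)
    return blue, yellow

def nearest_orientation(ratio, table):
    """Return the orientation whose recorded blue/yellow ratio is
    closest to the observed one."""
    return min(table, key=lambda k: abs(table[k] - ratio))

calibration = {"front": 1.0, "left": 1.5, "right": 0.67}

img = [["b", "b", "y"],
       ["b", "y", "y"]]  # 3 blue pixels, 3 yellow pixels
```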
  • the position determination module 1140 includes a distance determination module 1141 and an orientation determination module 1142.
  • the distance determination module 1141 compares the image information of the identifier 1110 collected by the image acquisition module 1120 with the correspondence, stored in the distance corresponding module 1131, between the identifier's image information and the distance of the charging station relative to the walking device, and thereby determines the distance information of the charging station relative to the walking device.
  • the orientation determination module 1142 compares the image information of the identifier 1110 collected by the image acquisition module 1120 with the correspondence, stored in the orientation corresponding module 1132, between the image information and the orientation of the charging station relative to the walking device, and thereby determines the orientation information of the charging station relative to the walking device.
  • FIGS. 39-1 to 39-8 show the images collected when the walking device is directly in front of, front-left of, directly to the left of, rear-left of, directly behind, rear-right of, directly to the right of, and front-right of the charging station, respectively.
  • the image information is analyzed to determine the color proportional relationship, which is compared with the orientation correspondence stored in the position determining module 1140 to locate the orientation of the charging station relative to the walking device.
  • the image information is analyzed to determine the size of the color pattern, which is compared with the distance correspondence stored in the position determining module 1140 to locate the distance of the charging station relative to the walking device.
  • the marker 1110 is a cylinder, and an image recognition area 1111 and an image content area 1112 are provided on the outer surface of the marker 1110.
  • the image recognition area 1111 is used to identify and delimit the image content area 1112; the content of the image content area 1112 is collected and analyzed only after the image acquisition module 1120 has detected the image recognition area 1111.
  • the image content area 1112 contains image information that corresponds to the position information of the charging station relative to the walking device, such as different characters or different color combinations.
  • the identifier 1110 is a cylinder whose red upper and lower portions form the image recognition area 1111; a blue area and a yellow area are disposed around the cylinder, each occupying half of its circumference and extending along the cylindrical axis.
  • the image of the marker 1110 in FIG. 39-1 is defined as the view from directly in front of the walking device. It should be noted that the position and color design of the image recognition area 1111 on the marker 1110 are not limited to the case shown in FIG. 38-1; various designs may be chosen as long as the recognition function is satisfied.
  • the identifier 1110 is a cylinder, and the upper and lower parts of the cylinder are image recognition areas 1111, and may be coated with various colors or identification symbols for identifying images.
  • the contents of the image content area 1112 are the characters "L" and "R". It should be noted that the location and content of the image content area 1112 on the identifier 1110 are not limited to the cases shown in FIG. 38-1 and FIG. 38-2; various designs may be chosen as long as they allow different positions of the charging station relative to the walking device to be distinguished.
  • FIG. 40 is a flowchart of a positioning method based on the image recognition positioning device according to the present invention; the positioning method includes the following steps:
  • S210: preset the correspondence between the identifier image information on the charging station and the position information of the charging station relative to the walking device.
  • this step further includes: S211: preset the correspondence between the identifier image information and the distance information of the charging station relative to the walking device.
  • S212: preset the correspondence between the identifier image information and the orientation information of the charging station relative to the walking device.
  • in the image of the marker 1110 collected from different orientations, the proportional relationship between the blue and yellow regions corresponds to the orientation information of the charging station relative to the walking device.
  • the size of the blue and yellow regions in the image of the marker 1110 collected at different distances from the charging station corresponds to the distance information of the charging station relative to the walking device.
  • the S220 walking device collects image information of the marker set on the charging station.
  • S230: compare the identifier image information collected by the walking device with the preset correspondences, and determine the position information of the charging station relative to the walking device.
  • this step further includes: S231: compare the collected identifier image information with the correspondence between the identifier image information and the distance information of the charging station relative to the walking device, and determine the distance information of the charging station relative to the walking device; S232: compare the collected identifier image information with the correspondence between the identifier image information and the orientation information of the charging station relative to the walking device, and determine the orientation information of the charging station relative to the walking device.
  • FIGS. 39-1 to 39-8 show the image information of the marker collected when the walking device is directly in front of, front-left of, directly to the left of, rear-left of, directly behind, rear-right of, directly to the right of, and front-right of the charging station, respectively.
  • the image information is analyzed to determine the size of the color pattern, which is compared with the distance correspondence in the position determining module to locate the distance of the charging station relative to the walking device.
  • the marker 1110 is a cylinder, and an image recognition area 1111 and an image content area 1112 are provided on the outer surface of the marker 1110.
  • the image recognition area 1111 is used to identify and define the image content area 1112.
  • the content of the image content area 1112 is collected and analyzed only after the image acquisition module 1120 has detected the image recognition area 1111.
  • the image content area 1112 includes image information content having a correspondence relationship with the position information of the charging station with respect to the walking device.
  • Image content area 1112 includes different characters or different color combinations that correspond to location information of the charging station relative to the traveling device.
  • the identifier 1110 is a cylinder whose red upper and lower portions form the image recognition area 1111; a blue area and a yellow area are disposed around the cylinder, each occupying half of its circumference and extending along the cylindrical axis.
  • the image of the marker 1110 in FIG. 39-1 is defined as the view from directly in front of the walking device. It should be noted that the position and color design of the image recognition area 1111 on the marker 1110 are not limited to the case shown in FIG. 38-1; various designs may be chosen as long as the recognition function is satisfied.
  • the identifier 1110 is a cylinder, and the upper and lower parts of the cylinder are image recognition areas 1111, and may be coated with various colors or identification symbols for identifying images.
  • the contents of the image content area 1112 are the characters "L" and "R". It should be noted that the location and content of the image content area 1112 on the identifier 1110 are not limited to the cases shown in FIG. 38-1 and FIG. 38-2; various designs may be chosen as long as they allow different positions of the charging station relative to the walking device to be distinguished.
  • the image recognition-based positioning device 1100 collects the image information of the identifier 1110 disposed on the charging station through the image acquisition module 1120, and the position determination module 1140 compares it with the correspondence, preset in the image and position corresponding module 1130, between the identifier's image information and the position of the charging station relative to the walking device, thereby determining the position information of the charging station relative to the walking device.
  • the identifier 1110 in the present invention is three-dimensional, so that the image acquisition module 1120 can collect its image information from all directions over 360 degrees, enabling more accurate positioning.
  • the positioning method based on the image recognition positioning device of the present invention can position the charging station relative to the walking device from multiple directions (front, rear, left, and right) with high accuracy.
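The orientation-by-color-ratio and distance-by-size comparisons described above can be sketched as a nearest-neighbor lookup against experimentally calibrated tables. The calibration values and region names below are hypothetical placeholders, not values from the disclosure:

```python
# Hypothetical calibration tables (the disclosure says these correspondences
# are obtained through experiments): blue/yellow area ratio -> orientation,
# and total colored-region size in pixels -> distance in meters.
ORIENTATION_TABLE = {4.0: "front", 1.0: "front-left", 0.25: "left", 0.1: "rear-left"}
DISTANCE_TABLE = {12000: 0.5, 6000: 1.0, 3000: 2.0, 1500: 4.0}

def nearest(table, measured):
    """Return the value whose table key is closest to the measurement."""
    key = min(table, key=lambda k: abs(k - measured))
    return table[key]

def locate(blue_pixels, yellow_pixels):
    """Estimate orientation and distance of the station from one marker image."""
    ratio = blue_pixels / yellow_pixels   # color proportional relationship
    size = blue_pixels + yellow_pixels    # color pattern size
    return nearest(ORIENTATION_TABLE, ratio), nearest(DISTANCE_TABLE, size)

print(locate(blue_pixels=4800, yellow_pixels=1200))  # ('front', 1.0)
```

In practice the blue and yellow pixel counts would come from color segmentation of the camera image; denser calibration tables, or interpolation between entries, would give finer position estimates.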

Abstract

An automatic walking device (100) and a corresponding docking system. The automatic walking device (100) comprises an image acquisition device (140) for capturing a real-time image (21′) and a control module (13). The control module (13) comprises an image recognition unit (14) for recognizing whether a target object (21) appears in the real-time image (21′), and an offset judgment unit (17). The offset judgment unit (17) compares the real-time image (21′) with a standard image (21″) of the target object (21) captured when the automatic walking device (100) is in a standard direction, and judges the offset of the automatic walking device (100) relative to the standard direction according to the change in shape of the real-time image (21′) relative to the standard image (21″).

Description

Automatic Walking Device — Technical Field
The present invention relates to an automatic walking device, and more particularly to an automatic walking device capable of automatically aligning with a target object; the present invention also relates to a docking system corresponding to the automatic walking device.
Background Art
An automatic walking device, such as a robotic lawn mower or a robotic vacuum cleaner, can work without manual operation, mowing grass or removing dust while the user is at work or at leisure, which brings great convenience to the user. An automatic walking device usually walks within a predetermined working area and returns to a particular territory (such as a docking station) to replenish energy when its battery is low, or when it has finished working or it starts to rain.
Existing automatic walking devices generally return to the docking station along the boundary in a predetermined direction, so the return efficiency is low. Since the charging terminals of the docking station face a specific direction, controlling the automatic walking device to head straight toward the docking station from an arbitrary location may cause it to fail to dock with the station successfully.
When a robotic lawn mower automatically returns for charging, it needs to locate the charging station and move toward it for automatic charging. In current solutions, colored bar markers are arranged at certain positions on the charging station, and the images captured by the mower's onboard camera are analyzed for matching colored bars: if a match is found, the charging station is within the field of view, the relative position of the mower and the charging station is calculated, and motion control guides the mower into the station; if not, no charging station has been found within the visible range and the mower continues in its original mode.
This solution has certain drawbacks. Recognition of the marker is most reliable when the camera's line of sight is perpendicular to the marker plane; the farther the line of sight deviates from that perpendicular, the weaker the recognition becomes. When the automatic walking device is to the left of, to the right of, or behind the marker plane, the marker cannot be effectively recognized.
发明内容
本发明提供一种自动行走设备,该自动行走设备能够自动对准目标物,通过自动对准目标物从而可以高效地回归至带有目标物的充电站或者带有目标物的领地。
为实现上述目的,本发明的技术方案是:一种自动行走设备,包括:图像采集装置,用于采集实时图像;控制模块,连接所述图像采集装置,用于控制 自动行走设备工作;所述控制模块包括:图像识别单元,用于识别图像采集装置所采集到的实时图像中是否出现目标物;偏移判断单元,将图像采集装置采集到的目标物的实时图像与自动行走设备位于一标准方向时所能采集到的目标物的标准图像进行比对,根据实时图像相对于标准图像的形状的改变,判断自动行走设备相对于标准方向的偏移。
优选的,所述标准方向为所述目标物所在表面的中垂线方向。
优选的,所述偏移判断单元通过提取目标物的实时图像和标准图像的形状特征进行比对,判断实时图像相对于标准图像的形状改变。
优选的,所述目标物为多边形图案时,所述形状特征包括多边形图像外轮廓一侧的特定边与另一侧的特定边的边长比例关系。
优选的,所述判断包括:实时图像外轮廓一侧的特定边与另一侧的特定边的边长比值小于标准图像外轮廓一侧的特定边与另一侧的特定边的边长比值,自动行走设备相对于标准方向偏向所述另一侧;实时图像外轮廓一侧的特定边与另一侧的特定边的边长比值大于标准图像外轮廓一侧的特定边与另一侧的特定边的边长比值,自动行走设备相对于标准方向偏向所述一侧;实时图像外轮廓一侧的特定边与另一侧的特定边的边长比值等于标准图像外轮廓一侧的特定边与另一侧的特定边的边长比值,自动行走设备位于标准方向。
优选的,所述多边形为矩形时,标准图像外轮廓一侧的特定边与另一侧的特定边的边长比值等于1;所述判断包括:实时图像外轮廓一侧的特定边与另一侧的特定边的边长比值小于1,自动行走设备相对于标准方向偏向所述另一侧;实时图像外轮廓一侧的特定边与另一侧的特定边的边长比值大于1,自动行走设备相对于标准方向偏向所述一侧;实时图像外轮廓一侧的特定边与另一侧的特定边的边长比值等于1,自动行走设备位于标准方向。
优选的,所述偏移判断单元若判断自动行走设备相对于标准方向偏移,所述控制模块调整自动行走设备的行走方向,直至其行走方向对准标准方向。
优选的,调整自动行走设备的行走方向对准标准方向后,控制模块控制自动行走设备朝向标准方向行走,直至行走至目标物处。
优选的,所述控制模块还包括距离计算单元,所述偏移判断单元在进行相对标准方向偏移判断之前,距离计算单元计算自动行走设备与目标物之间的距离,并且由控制模块控制自动行走设备行走至离目标物预设的距离范围之内。
优选的,所述控制模块还包括回归方向确认单元,所述偏移判断单元在进 行相对标准方向偏移判断之前,回归方向确认单元判断并且控制自动行走设备的朝向指向目标物。
优选的,当图像采集装置采集到的目标物的实时图像位于图像采集装置成像区域的预设范围内时,所述回归方向确认单元确定自动行走设备的行走方向朝向目标物。
优选的,图像识别单元通过比对图像采集装置采集到的实时图像和目标物的标准图像的颜色特征,识别图像采集装置采集到的实时图像是包含目标物的图像。
一种自动行走设备,包括:图像采集装置,用于采集实时图像;控制模块,连接所述图像采集装置,用于控制自动行走设备工作;所述控制模块包括:图像识别单元,用于识别图像采集装置所采集到的实时图像中是否出现目标物;偏移判断单元,将图像采集装置采集到的目标物的实时图像的形状特征与自动行走设备位于一标准方向时所能采集到的目标物的标准图像的形状特征进行比对,根据实时图像相对于标准图像的形状的改变,判断自动行走设备相对于标准方向的偏移。
本发明还提供一种自动行走设备的对接系统,该自动行走设备的对接系统可以实现快速且准确的对接。
为实现上述目标,本发明的技术方案为:一种自动行走设备的对接系统,包括:充电站,用于给自动行走设备提供电能,设有用于引导自动行走设备回归充电的目标物;自动行走设备,包括:图像采集装置,用于采集实时图像;控制模块,连接所述图像采集装置,用于控制自动行走设备工作;所述控制模块包括:图像识别单元,用于识别图像采集装置所采集到的实时图像中是否出现目标物;回归方向确认单元,用于确保自动行走设备朝向目标物;偏移判断单元,将图像采集装置采集到的目标物的实时图像的形状特征与自动行走设备位于一标准方向时所能采集到的目标物的标准图像的形状特征进行比对,根据实时图像相对于标准图像的形状的改变,判断自动行走设备相对于标准方向的偏移;当图像识别单元识别出图像采集装置采集到的实时图像包含目标物图像之后,回归方向确认单元控制自动行走设备朝向目标物,在回归方向确认单元确认自动行走设备朝向目标物后,偏移判断单元启动判断自动行走设备相对标准方向的偏移,从而根据判断结果控制自动行走设备对接至充电站。
优选的,所述标准方向为所述目标物所在表面的中垂线方向。
优选的,图像识别单元通过比对图像采集装置采集到的实时图像和目标物的标准图像的颜色特征,识别图像采集装置采集到的实时图像是否包含目标物的图像。
优选的,所述目标物的外轮廓形状为矩形,且具有至少两种不同的颜色。
优选的,所述控制模块还包括距离计算单元,所述偏移判断单元在进行相对于标准方向偏移判断之前,距离计算单元计算自动行走设备与目标物之间的距离,并且控制自动行走设备行走至离目标物预设的距离范围之内。
优选的,所述目标物为多边形时,形状特征包括多边形图像外轮廓一侧的特定边与另一侧的特定边的边长比例关系。
优选的,其特征在于,所述判断包括:实时图像外轮廓一侧的特定边与另一侧的特定边的边长比值小于标准图像外轮廓一侧的特定边与另一侧的特定边的边长比值,自动行走设备相对于标准方向偏向所述另一侧;实时图像外轮廓一侧的特定边与另一侧的特定边的边长比值大于标准图像外轮廓一侧的特定边与另一侧的特定边的边长比值,自动行走设备相对于标准方向偏向所述一侧;实时图像外轮廓一侧的特定边与另一侧的特定边的边长比值等于标准图像外轮廓一侧的特定边与另一侧的特定边的边长比值,自动行走设备处于标准方向。
与现有技术相比,本发明自动行走设备能够自动对准目标物,从而提高自动行走设备回归至特定领地的效率。本发明自动行走设备的对接系统,能够快速且准确地进行回归对接。
本发明还提供一种提高自动行走设备的回归效率,且实现自动行走设备与停靠站的有效对接的自动行走设备控制方法及自动工作系统。
为实现上述目的,本发明的技术方案是:一种自动行走设备控制方法,用于控制所述自动行走设备返回停靠站,所述停靠站上设有表示自动行走设备与停靠站回归对接时的对接方向的方向标识,所述自动行走设备上设有图像采集装置及处理器,所述方法包括以下步骤:步骤S1:所述处理器识别所述方向标识;步骤S2:所述处理器判断所述自动行走设备的行走方向;步骤S3:所述处理器控制所述自动行走设备的行走路径,使得自动行走设备的行走方向与所述方向标识所示的对接方向重合;步骤S4:所述处理器控制所述自动行走设备沿所述方向标识所示的对接方向与所述停靠站对接。
优选的,所述停靠站上还设有停靠站标识,所述停靠站标识具有特定的形状或/及图案,在上述步骤S1之前还包括以下步骤:步骤S101:所述自动行走 设备启动回归时,所述处理器识别所述停靠站标识;步骤S102:所述处理器判断所述自动行走设备与停靠站标识之间的距离是否等于第一距离值L1,是则进入步骤S1,否则进入步骤S103;步骤S103:所述处理器控制所述自动行走设备朝向所述停靠站标识前进。
优选的,所述停靠站标识位于所述停靠站顶部,所述停靠站标识的形状为圆柱形。
优选的,在步骤S103中,所述处理器控制所述自动行走设备朝向所述停靠站标识前进时,控制所述自动行走设备的行走方向,使得所述停靠站标识对应的图像始终位于所述图像采集装置获取的图像中的第一特定区域中。
优选的,在步骤S103中,所述处理器根据所述图像中的停靠站标识对应的图像的面积的大小判断所述自动行走设备与停靠站之间的距离。
优选的,上述步骤S3包括:步骤S311:所述处理器控制所述自动行走设备旋转预定的角度;步骤S312:所述处理器控制所述自动行走设备行走预定的间距s;步骤S313:所述处理器控制所述自动行走设备转向并朝向所述方向标识;步骤S314:所述处理器判断所述自动行走设备的行走方向是否与所述方向标识所示的对接方向重合,是则进入步骤S4,否则返回步骤S311。
优选的,所述停靠站上还设有在对接方向上位于所述方向标识前端的定位标识,所述定位标识具有特定的形状或/及图案,上述步骤S3包括:步骤S321:所述处理器识别所述定位标识,并控制所述自动行走设备朝向所述定位标识;步骤S322:所述图像采集装置获取的图像具有将所述图像划分左右两部分的中线,所述处理器计算图像中的方向标识与所述中线之间形成的第一角度α;步骤S323:所述处理器根据所述第一距离值L1及第一角度α计算第二距离值L2;步骤S324:所述处理器控制所述自动行走设备旋转预定的角度;步骤S325:所述处理器控制所述自动行走设备行走第二距离值L2;步骤S326:所述处理器控制所述自动行走设备转向并朝向所述方向标识。
优选的,所述停靠站上还设有在对接方向上位于所述方向标识前端的定位标识,所述定位标识具有特定的形状或/及图案,上述步骤S3包括:步骤S331:所述处理器识别所述定位标识,并控制所述自动行走设备朝向所述定位标识;步骤S332:所述图像采集装置获取的图像具有将所述图像划分左右两部分的中线,所述处理器计算图像中的方向标识与所述中线之间形成的第一角度α;步骤S333:所述处理器根据所述自动行走设备所处的点、所述定位标识及第一角 度α构建出特定的三角形,根据第一角度α、第一距离值L1计算三角形中的第一角度α的另一条邻边Lx及对边L2的边长;步骤S334:所述处理器计算在邻边Lx上特定的位置处与邻边Lx内切,同时与对边L2相切的内切圆的半径R;步骤S335:所述处理器控制所述自动行走设备朝所述方向标识所指示的方向旋转预定的角度;步骤S336:所述处理器控制自动行走设备前进到内切圆与对边L2的切点处;步骤S337:所述处理器根据内切圆的半径R及左右轮子的轮间距2d计算左右两侧的轮子的转速比,所述处理器控制自动行走设备使得左右轮子具有特定的转速比,从而使得自动行走设备沿预定的弧形路径行走,直到自动行走设备的行走方向与所述方向标识所示的对接方向一致。
优选的,所述方向标识为直线图案、长方形图案或者至少一个箭头图案。
优选的,在步骤S2中,所述图像采集装置获取的图像具有将所述图像划分左右两部分的中线,所述中线与方向标识的图像重叠或基本重叠时,所述处理器判断所述自动行走设备的行走方向与所述方向标识所示的对接方向重合。
本发明还提供一种自动工作系统,包括自动行走设备及停靠站,所述停靠站上设有表示自动行走设备与停靠站回归对接时的对接方向的方向标识,所述自动行走设备上设有图像采集装置及处理器,所述处理器包括:方向标识识别模块,用于识别所述方向标识;对接方向判断模块,用于判断所述自动行走设备的行走方向;回归路径控制模块,当所述自动行走设备的行走方向与所述方向标识表示的对接方向不重合时,所述回归路径控制模块控制所述自动行走设备的行走路径,使得自动行走设备的行走方向与所述方向标识所示的对接方向重合;第二回归控制模块,当所述自动行走设备的行走方向与所述方向标识表示的对接方向重合时,所述第二回归控制模块控制所述自动行走设备沿所述方向标识所示的对接方向与所述停靠站对接。
优选的,所述停靠站上还设有停靠站标识,所述停靠站标识具有特定的形状或/及图案,所述自动工作系统还包括:停靠站标识识别模块,用于识别所述停靠站标识;距离判断模块,用于在自动行走设备朝向所述停靠站标识前进的过程中判断所述自动行走设备与停靠站标识之间的距离是否等于第一距离值L1;第一回归控制模块,用于控制所述自动行走设备朝向所述停靠站标识前进。
优选的,所述停靠站标识位于所述停靠站顶部,所述停靠站标识的形状为圆柱形。
优选的,所述图像采集装置获取的图像具有第一特定区域,第一回归控制 模块在控制自动行走设备朝向停靠站标识前进时,调整自动行走设备的行走方向,使得停靠站标识对应的图像始终位于图像的第一特定区域中。
优选的,所述距离判断模块根据停靠站标识对应的图像的面积值的大小判断自动行走设备与停靠站之间的距离。
优选的,所述回归路径控制模块包括:第一转向控制模块,用于控制自动行走设备朝方向标识在图像中所指示的方向旋转预定的角度;间距控制模块,用于控制所述自动行走设备行走预定的间距s;第二转向控制模块,在自动行走设备行走预定的间距s后控制所述自动行走设备转动朝向方向标识;方向判断模块,用于判断所述自动行走设备的行走方向是否与所述方向标识所示的对接方向重合。
优选的,所述停靠站上还设有在对接方向上位于所述方向标识前端的定位标识,所述定位标识具有特定的形状或/及图案,所述回归路径控制模块包括:定位标识识别模块,用于识别所述定位标识并控制所述自动行走设备朝向所述定位标识;角度计算模块,所述图像采集装置获取的图像具有将所述图像划分左右两部分的中线,所述角度计算模块计算图像中的方向标识与中线之间形成的第一角度α;距离计算模块,用于根据所述第一距离值L1及第一角度α计算第二距离值L2;第一转向控制模块,用于控制所述自动行走设备旋转预定的角度;距离控制模块,用于控制所述自动行走设备行走第二距离值L2;第二转向控制模块,用于控制所述自动行走设备转向并朝向所述方向标识。
优选的,所述停靠站上还设有在对接方向上位于所述方向标识前端的定位标识,所述定位标识具有特定的形状或/及图案,所述回归路径控制模块包括:定位标识识别模块,用于识别所述定位标识并控制所述自动行走设备朝向所述定位标识;角度计算模块,所述图像采集装置获取的图像具有将所述图像划分左右两部分的中线,所述角度计算模块计算图像中的方向标识与中线之间形成的第一角度α;边长计算模块,根据所述自动行走设备所处的点、所述定位标识及第一角度α构建出特定的三角形,根据第一角度α、第一距离值L1计算三角形中的第一角度α的另一条邻边Lx及对边L2的边长;半径计算模块,用于计算在邻边Lx上特定的位置处与邻边Lx内切,同时与对边L2相切的内切圆的半径R;第一转向控制模块,用于控制所述自动行走设备朝所述方向标识所指示的方向旋转预定的角度;切点控制模块,用于控制自动行走设备前进到内切圆与对边L2的切点处;弧形路径控制模块,用于根据内切圆的半径R及 左右轮子的轮间距2d计算左右两侧的轮子的转速比,并控制自动行走设备使得左右轮子具有特定的转速比,从而使得自动行走设备沿预定的弧形路径行走,直到自动行走设备的行走方向与所述方向标识所示的对接方向一致。
优选的,所述方向标识为直线图案、长方形图案或者至少一个箭头图案。
与现有技术相比,本发明自动行走设备控制方法及自动工作系统能够提高自动行走设备的回归效率,且实现自动行走设备与停靠站的有效对接。
本发明还提供一种自动工作系统基于图像识别的定位装置及其定位方法,对充电站进行更准确定位。
本发明所述的基于图像识别的定位装置,用于对充电站的定位,包括:
标识物,所述标识物为立体状固定设置在所述充电站上;
图像采集模块,所述图像采集模块设置在行走设备上,用于采集所述标识物的图像信息;
图像与位置对应模块,所述图像与位置对应模块用于设置所述标识物的图像信息与所述充电站相对于所述行走设备的位置信息之间的对应关系;
位置判定模块,所述位置判定模块用于将所述图像采集模块采集到的标识物的图像信息,与所述图像与位置对应模块中设置的所述标识物图像信息与所述充电站相对于所述行走设备的位置信息的对应关系进行比较,判定出所述充电站相对于所述行走设备的位置信息。
在其中一个实施例中,所述图像与位置对应模块包括距离对应模块和方位对应模块;
所述距离对应模块用于设置所述标识物的图像信息与所述充电站相对于所述行走设备的距离信息之间的对应关系;
所述方位对应模块用于设置所述标识物的图像信息与所述充电站相对于所述行走设备的方位信息之间的对应关系。
在其中一个实施例中,所述位置判定模块包括距离判定模块和方位判定模块;
所述距离判定模块用于将所述图像采集模块采集到的所述标识物的图像信息,与所述距离对应模块中所述标识物的图像信息与所述充电站相对于所述行走设备的距离信息之间的对应关系进行比较,判定出所述充电站相对于所述行走设备的距离信息;
所述方位判定模块用于将所述图像采集模块采集到的所述标识物的图像信 息,与所述方位对应模块中所述图像信息与所述充电站相对于所述行走设备的方位信息之间的对应关系进行比较,判定出所述充电站相对于所述行走设备的方位信息。
在其中一个实施例中,所述标识物为圆柱体,在所述标识物的外表面上设置图像识别区域和图像内容区域;
所述图像识别区域用于识别和定义所述图像内容区域;
所述图像内容区域包括与所述充电站相对于所述行走设备的位置信息具有对应关系的图像信息内容。
在其中一个实施例中,所述图像内容区域包括不同的字符或不同的色彩组合,与所述充电站相对于所述行走设备的位置信息进行对应。
一种基于图像识别定位装置的定位方法,包括如下步骤:
预设充电站上的标识物图像信息与所述充电站相对于行走设备的位置信息之间的对应关系;
所述行走设备采集设置在充电站上的标识物的图像信息;
将所述行走设备采集到的标识物图像信息,与所述预设的对应关系进行比较,判定出所述充电站相对于所述行走设备的位置信息。
在其中一个实施例中,所述预设的对应关系,包括预设所述标识物图像信息与所述充电站相对于所述行走设备的距离信息之间的对应关系,及预设所述标识物图像信息与所述充电站相对于所述行走设备的方位信息之间的对应关系。
在其中一个实施例中,在所述将采集到的标识物图像信息与所述预设的对应关系进行比较时,包括将采集到的标识物图像信息与所述标识物图像信息和所述充电站相对于所述行走设备的距离信息之间的对应关系进行比较,判定出所述充电站相对于所述行走设备的距离信息;
及将采集到的标识物图像信息与所述标识物图像信息和所述充电站相对于所述行走设备的方位信息之间的对应关系进行比较,判定出所述充电站相对于所述行走设备的方位信息。
在其中一个实施例中,所述标识物为圆柱体,在所述标识物的外表面上设置图像识别区域和图像内容区域;
所述图像识别区域用于识别和定义所述图像内容区域;
所述图像内容区域包括与所述充电站相对于所述行走设备的位置信息具有对应关系的图像信息内容。
在其中一个实施例中,所述图像内容区域包括不同的字符或不同的色彩组合,与所述充电站相对于所述行走设备的位置信息进行对应。
上述基于图像识别的定位装置,通过图像采集模块采集设置在充电站上的标识物图像信息,并通过位置判断模块,与预先设置在图像与位置对应模块中的标识物图像信息和充电站相对行走设备的位置关系进行比较,从而判定出充电站相对行走设备的位置信息。本发明中的标识物为立体状,使得图像采集模块可以360度全方位采集标识物的图像信息,从而更加准确的定位。本发明基于图像识别定位装置的定位方法,可从前、后、左、右等多方位、高准确度地对充电站相对行走设备的位置进行定位。
附图说明
下面结合附图和实施方式对本发明作进一步说明。
图1是本发明自动工作系统的示意图。
图2是图1中图像采集装置获取的图像中的停靠站标识、第一特定区域或第二特定区域,方向标识及中线的示意图。
图3是图2的局部放大示意图。
图4是本发明自动工作系统的方框示意图。
图5是本发明自动工作系统的第一较佳实施方式的工作示意图。
图6是本发明自动工作系统的第一较佳实施方式中回归路径控制模块的方框示意图。
图7是本发明自动工作系统的第二较佳实施方式中回归路径控制模块的方框示意图。
图8是本发明自动工作系统的第二较佳实施方式的工作示意图。
图9是本发明自动工作系统的第三较佳实施方式中回归路径控制模块的方框示意图。
图10是本发明自动工作系统的第三较佳实施方式的工作示意图。
图11是本发明自动行走设备控制方法的流程示意图。
图12是本发明自动行走设备控制方法的第一较佳实施方式的部分流程示意图。
图13是本发明自动行走设备控制方法的第二较佳实施方式的部分流程示意图。
图14是本发明自动行走设备控制方法的第三较佳实施方式的部分流程示意图。
图15是本发明自动工作系统另一实施方式的示意图。
图16是图15所示的自动工作系统的停靠站的立体图。
图17是图16所示停靠站的主视图。
图18是目标物的第一种设计的主视图。
图19是目标物的第二种设计的主视图。
图20是目标物的第三种设计的主视图。
图21是图15所示实施方式的自动行走设备的框图。
图22是自动行走设备偏移目标物左侧时的示意图。
图23是图22所示情况下目标物在成像区域中的成像示意图。
图24是自动行走设备对准目标物时的示意图。
图25是图24所示情况下目标物在成像区域中的成像示意图。
图26是自动行走设备偏移目标物右侧时的示意图。
图27是图26所示情况下目标物在成像区域中的成像示意图。
图28是自动行走设备的前进方向不是回归方向的一种示意图。
图29是图28所示情况下目标物在成像区域中的成像示意图。
图30是自动行走设备的前进方向是回归方向的一种示意图。
图31是图30所示情况下目标物在成像区域中的成像示意图。
图32是自动行走设备的前进方向不是回归方向的另一种示意图。
图33是图32所示情况下目标物在成像区域中的成像示意图。
图34是图15所示实施方式的自动行走设备回归停靠站的流程图。
图35是另一实施方式自动行走设备的框图。
图36是另一实施方式自动行走设备回归停靠站的流程图。
图37是本发明自动工作系统另一实施方式的基于图像识别的定位装置的原理框图;
图38-1是图37所示的基于图像识别定位装置的标识物一实施例的结构示意图;
图38-2是图37所示的基于图像识别定位装置的标识物又一实施例的结构示意图;
图39-1是自动行走设备位于图38-1中所示标识物正前方时,图像采集装置采集到的图像示意图;
图39-2是自动行走设备位于图38-1中所示标识物左前方时,图像采集装置采集到的图像示意图;
图39-3是自动行走设备位于图38-1中所示标识物正左方时,图像采集装置采集到的图像示意图;
图39-4是自动行走设备位于图38-1中所示标识物左后方时,图像采集装置采集到的图像示意图;
图39-5是自动行走设备位于图38-1中所示标识物正后方时,图像采集装置采集到的图像示意图;
图39-6是自动行走设备位于图38-1中所示标识物右后方时,图像采集装置采集到的图像示意图;
图39-7是自动行走设备位于图38-1中所示标识物正右方时,图像采集装置采集到的图像示意图;
图39-8是自动行走设备位于图38-1中所示标识物右前方时,图像采集装置采集到的图像示意图;
图40是图37所示的基于图像识别定位装置的定位方法其中一实施例的流程图;
图41是图37所示的基于图像识别定位装置的定位方法其中又一实施例的流程图;
图42是图37所示的基于图像识别定位装置的定位方法其中又一实施例的流程图。
其中,
100、自动行走设备;200、停靠站;300、边界;
400、工作区域;202、充电端子;210、停靠站标识;
220、方向标识;212、顶部;214、中部;
216、底部;230、平板;110、壳体;
120、轮子;130、处理器;140、图像采集装置;
131、停靠站标识识别模块;132、第一回归控制模块;133、距离判断模块;
134、方向标识识别模块;135、对接方向判断模块;136、第二回归控制模块;
137、回归路径控制模块;1371、角度计算模块;1372、距离计算模块;
1373、第一转向控制模块;1374、距离控制模块;1375、第二转向控制模块;
1376、间距控制模块;1377、边长计算模块;1378、半径计算模块;
142、图像;144、第一特定区域;218、连线;
146、中线;1379、切点控制模块;1370、弧形路径控制模块;
137a、定位标识识别模块;137b、方向判断模块;21、目标物;
211、第一部分;212、第二部分;13、控制模块;
14、图像识别单元;15、回归方向确认单元;16、距离计算单元;
17、偏移判断单元;18、控制单元;21′、目标物的实时图像;
21″、目标物的标准图像;△H、中心区域;A、成像区域;
a、目标物一侧的边长;b、目标物另一侧的边长;a′、实时图像一侧的边长;
b′、实时图像另一侧的边长;1100、基于图像识别的定位装置;1110、标识物;
1111、图像识别区域;1112、图像内容区域;1120、图像采集模块;
1121、摄像装置;1130、图像与位置对应模块;1131、距离对应模块;
1132、方位对应模块;1140、位置判定模块;1141、距离判定模块;
1142、方位判定模块。
具体实施方式
请参考图1,本发明实施方式提供一种自动工作系统及自动行走设备控制方法。
该自动工作系统包括自动行走设备100,如自动割草机或自动吸尘器,及停靠站200。自动行走设备100在预定的边界300所限定的工作区域400内行走,并在电量较低时返回停靠站200补充能量,在完成工作或下雨时返回停靠站200。
自动行走设备100的前部具有至少两个对接端子(图未示),停靠站200具有至少两个充电端子202,当自动行走设备100与停靠站200对接时,对接端子与对应的充电端子202连接。
本实施方式中,停靠站200位于工作区域400的边界300上,充电端子202朝向特定的方向设置,如朝向停靠站200左侧或右侧的边界300设置。
请同时参考图2及图3,停靠站200上设有表示自动行走设备100与停靠站200对接时的对接方向的方向标识220、停靠站标识210及位于对接方向上且位于方向标识220前端的定位标识(图未示)。所述停靠站标识210可位于所述停靠站200的任意位置,所述定位标识具有特定的形状或/及图案,如所述定位标识即为方向标识220的指向末端。
本实施方式中,停靠站标识210在对接方向上位于方向标识220的前端,且停靠站标识210同时作为定位标识。
停靠站标识210竖直设置于停靠站200的顶部,停靠站标识210的形状基本为圆柱形,如此在一定水平面内不管从哪个方向观察,停靠站标识210所形成的图形的面积都相同。
具体地,停靠站标识210的形状为:从上到下具有圆柱形的顶部212、中部214及底部216;顶部212与底部216的直径相同,中部214的直径小于顶部212及底部216的直径。
进一步地,停靠站标识210具有特定的图案,如:顶部212与底部216的外周具有相同的第一颜色,中部214具有与第一颜色明显不同的第二颜色。其他实施方式中,也可在顶部212与底部216的外周设有规则的第一条纹,在中部214设有规则的第二条纹或者不设条纹。
停靠站200具有供自动行走设备停靠的平板230,平板230平铺于地面上。自动行走设备100整体位于平板230上时,可以防止地面不平导致自动行走设备100歪斜使得对接端子无法与充电端子202对接。方向标识220位于平板230的上表面,方向标识220具有与充电端子202平行的直线图案、长方形图案或者至少一个箭头图案。本实施方式中,方向标识220为若干首尾相连的箭头图案,除最末尾的一个箭头图案外,其他的箭头图案仅具有指示方向的斜线部分,而没有延伸的直线部分。所有的箭头图案均指向充电端子202。
自动行走设备100包括壳体110、位于壳体110底部的若干轮子120,位于壳体110内部、用于驱动轮子120的动力系统(图未示),位于壳体110内部的处理器130,及位于壳体110上的图像采集装置140。动力系统包括电池包、传动机构等。
图像采集装置140用于获取停靠站200的图像,处理器130用于对图像采 集装置140获取的图像进行处理分析并控制自动行走设备100行走。本实施方式中,图像采集装置140为摄像机。
请参考图4,本发明自动工作系统的处理器130具有以下工作模块:停靠站标识识别模块131、第一回归控制模块132、距离判断模块133、方向标识识别模块134、对接方向判断模块135、第二回归控制模块136及回归路径控制模块137。
停靠站标识识别模块131用于根据图像采集装置140获取停靠站200的图像识别停靠站标识210。
停靠站标识识别模块131可根据图像中物体的形状、图案,或者综合其形状与图案,以识别其是否为停靠站标识210。
处理器130存储有与停靠站标识210的图案对应的第一预设图案,处理器130还存储有与停靠站标识210的形状对应的第一预设形状;停靠站标识识别模块131比较图像中的物体的形状与第一预设形状,判断该物体的形状是否与第一预设形状匹配;停靠站标识识别模块131比较物体的图案与第一预设图案,判断该物体的图案是否与第一预设图案匹配。
第一回归控制模块132用于控制自动行走设备100朝向停靠站标识210前进。从而引导自动行走设备100从距离停靠站200较远的位置移动到距离停靠站200较近的位置,方便后续识别方向标识220及控制自动行走设备100的路径。
请再次参考图2及图3,图像采集装置140获取的图像142具有第一特定区域144,第一回归控制模块132在控制自动行走设备100朝向停靠站标识210前进时,调整自动行走设备100的行走方向,使得停靠站标识210对应的图像始终位于图像142的第一特定区域144中。如此可防止自动行走设备100走偏,提高了回归效率。
距离判断模块133用于判断自动行走设备100与停靠站200之间的距离是否小于或等于第一距离值L1。距离判断模块133根据图像142中的停靠站标识210的侧边长或者面积值的大小判断自动行走设备100与停靠站200之间的距离是否为第一距离值L1。
具体地,处理器130中存储有预定长度值,处理器130计算图像142中停靠站标识210的至少一个侧边的侧边长,比较计算的侧边长与预定长度值,当计算的侧边长达到预定长度值时,即判断自动行走设备100与停靠站200之间 的距离为第一距离值L1。
具体地,处理器130也可存储预定面积值,处理器130计算图像142中停靠站标识210的至少部分的面积值,比较计算的面积值与预定面积值,当计算的面积值达到预定面积值时,即判断自动行走设备100与停靠站200之间的距离为第一距离值L1。
本实施方式中,为了简化处理,处理器130用连线218连接停靠站标识210图形的四个端点,形成矩形。由于在一定水平面内不管从哪个方向观察,若与停靠站标识210距离一定,则该矩形的长和宽均相同,因此可将该矩形的面积作为停靠站标识210图形的面积值。
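The area-based distance check in this paragraph (connect the four endpoints of the docking-station marker into a rectangle and compare its area with a stored preset value) can be sketched as follows; the corner coordinates and the preset area are illustrative assumptions:

```python
def rect_area(points):
    """Bounding-rectangle area of the four detected endpoints of the marker."""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

def within_first_distance(points, preset_area):
    """True once the marker's image area reaches the stored preset area,
    i.e. the device has come within the first distance L1 of the station."""
    return rect_area(points) >= preset_area

corners = [(100, 80), (180, 80), (100, 220), (180, 220)]
print(rect_area(corners))                     # 11200
print(within_first_distance(corners, 10000))  # True
```

Because the marker is rotationally symmetric, this rectangle has the same width and height from any viewing direction at a given distance, which is what makes its area a usable distance proxy.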
在另一实施方式中,图像采集装置140获取的图像142具有与停靠站标识210的形状匹配的第二特定区域,当图像142中的停靠站标识210与第二特定区域基本重叠时,距离判断模块133判断自动行走设备100与停靠站200之间的距离为第一距离值L1。为了简化处理,处理器130用连线218连接停靠站标识210图形的四个端点,形成矩形,距离判断模块133判断该矩形是否与第二特定区域重叠。第二特定区域可与第一特定区域144相同。
请参考图5,方向标识识别模块134根据图像采集装置140获取的停靠站200的图像142识别方向标识220。方向标识220具有特定的图案,处理器130存储有对应的第二预设图案,方向标识识别模块134比较图像142中的图案与第二预设图案,若图像142中具有与第二预设图案匹配的图案,则识别该图案为方向标识220。
图像采集装置获取的图像具有将图像划分左右两部分的中线146,对接方向判断模块135判断中线146是否与所述方向标识220表示的对接方向重合或基本重合。
若中线146与方向标识220表示的对接方向重合或基本重合,第二回归控制模块136则控制自动行走设备100沿方向标识220所示的对接方向与停靠站对接。
若中线146与方向标识220表示的对接方向不重合或不基本重合时,回归路径控制模块137控制自动行走设备100的行走路径,使得自动行走设备100的行走方向与方向标识220所示的对接方向重合或基本重合。之后由第二回归控制模块136控制自动行走设备。
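The midline test in the preceding paragraphs reduces to checking whether the direction marker's image is horizontally centered; a minimal sketch, assuming the marker is represented by the x-coordinates of its detected pixels and a small pixel tolerance (both are illustrative assumptions):

```python
def heading_aligned(marker_xs, image_width, tol=5):
    """Judge whether the direction marker lies on the image midline, i.e.
    whether the walking direction coincides with the docking direction."""
    midline = image_width / 2
    marker_center = sum(marker_xs) / len(marker_xs)
    return abs(marker_center - midline) <= tol

print(heading_aligned([310, 320, 330], image_width=640))  # True
print(heading_aligned([100, 120, 140], image_width=640))  # False
```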
可以理解的是,处理器130也可直接识别方向标识220,进而判断自动行走 设备100的行走方向是否与所述方向标识220表示的对接方向重合,若重合则控制自动行走设备100沿方向标识220所示的对接方向与停靠站对接,若不重合则控制自动行走设备100的行走路径,使得自动行走设备100的行走方向与方向标识220所示的对接方向重合。
请同时参考图5及图6,本发明自动工作系统的第一较佳实施方式中,回归路径控制模块137包括:定位标识识别模块137a、角度计算模块1371、距离计算模块1372、第一转向控制模块1373、距离控制模块1374及第二转向控制模块1375。
定位标识识别模块137a识别定位标识,并控制所述自动行走设备朝向所述定位标识。特别地,当停靠站标识210位于对接方向上且同时作为定位标识时,定位标识识别模块137a即为停靠站标识识别模块131。
角度计算模块1371以定位标识为基点计算方向标识220与中线146之间形成的第一角度α。
距离计算模块1372用于根据所述第一距离值L1及第一角度α计算第二距离值L2。
若第一角度α为锐角,则以自动行走设备100所处的点为直角,由中线146、方向标识220的延长线及第一角度α形成一个直角三角形,另一直角边的长度即为第二距离值L2,距离计算模块1372根据第一距离值L1及第一角度α计算第二距离值L2。第一距离值L1、第二距离值L2及第一角度α满足关系:tanα=L2/L1。
若第一角度α为直角,则以自动行走设备100所处的点为特定的锐角α1,如60度,由中线146、方向标识220的延长线及第一角度α形成一个直角三角形,斜边的长度即为第二距离值L2,距离计算模块1372根据第一距离值L1及锐角α1计算第二距离值L2。第一距离值L1、第二距离值L2及锐角α1满足关系:cosα1=L1/L2。
若第一角度α为钝角,在方向标识220所在的边取与第一距离值L1相同的长度构建等腰三角形,第一角度α的角平分线与第一角度α的对边L2垂直,根据第一角度α可计算出对边L2的长度。
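The three cases above for deriving the second distance L2 from the first distance L1 and the first angle α can be collected into one helper: tan α = L2/L1 for an acute α; cos α1 = L1/L2 for a right angle, with a chosen acute angle α1 such as 60°; and an isosceles triangle with equal sides L1 and apex angle α for an obtuse α. This is a sketch of the stated trigonometry only:

```python
import math

def second_distance(L1, alpha_deg, alpha1_deg=60.0):
    """Compute the second distance L2 from the first distance L1 and the
    first angle alpha, following the three cases in the text; alpha1 is the
    chosen acute angle used when alpha is a right angle."""
    a = math.radians(alpha_deg)
    if alpha_deg < 90:                       # tan(alpha) = L2 / L1
        return L1 * math.tan(a)
    if alpha_deg == 90:                      # cos(alpha1) = L1 / L2
        return L1 / math.cos(math.radians(alpha1_deg))
    return 2 * L1 * math.sin(a / 2)          # isosceles triangle, apex angle alpha

print(round(second_distance(1.0, 45), 3))   # 1.0
print(round(second_distance(1.0, 90), 3))   # 2.0
print(round(second_distance(1.0, 120), 3))  # 1.732
```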
第一转向控制模块1373用于控制自动行走设备100旋转预定的第二角度。本实施方式中,由于之前自动行走设备100一直朝向停靠站标识210,则方向标识220必然完全地位于中线146的一侧,如图2所示的左侧。第一转向控制模块1373用于控制自动行走设备100朝图像142中方向标识220所指示的方向,如向左,旋转第二角度。
若第一角度α为锐角,则第二角度为90度。
若第一角度α为直角,则第二角度为特定的锐角α1,如60度。
若第一角度α为钝角,则计算出等腰三角形的另两个相同的角,且以该角作为第二角度。
距离控制模块1374用于控制自动行走设备100行走第二距离值L2。
自动行走设备100行走第二距离值L2后,第二转向控制模块1375用于控制自动行走设备转向,并朝向方向标识220。本实施方式中,第二回归控制模块1375控制自动行走设备100向左或向右旋转,直到图像142中的方向标识220所示的方向与所述中线146重合或基本重合,即使得自动行走设备100的行走方向与方向标识220所示的对接方向一致。
综上,请参考图1,在本发明自动工作系统的第一较佳实施方式中,自动行走设备100在处理器130的控制下沿图中的单箭头标记所示的路径行走。
请参考图7及图8,本发明自动工作系统的第二较佳实施方式中,回归路径控制模块137包括:第一转向控制模块1373、间距控制模块1376、第二转向控制模块1375及方向判断模块137b。
第一转向控制模块1373用于控制自动行走设备100朝图像中方向标识220所指示的方向旋转预定的第二角度。本实施方式中,第二角度为90度。
间距控制模块1376用于控制自动行走设备100行走预定的间距s。
第二转向控制模块1375在自动行走设备100行走预定的间距s后控制自动行走设备100转动,并朝向方向标识220。
方向判断模块137b,用于判断自动行走设备100的行走方向是否与方向标识220所示的对接方向重合。
本实施方式中,第二回归控制模块1375控制自动行走设备100向左或向右旋转,直到方向标识220位于图像的中部。
优选地,回归路径控制模块137还可包括第一回归控制模块132、距离判断模块133及定位标识识别模块137a。在自动行走设备100行走预定的间距s后,定位标识识别模块137a识别定位标识,并控制所述自动行走设备朝向所述定位标识,之后第一回归控制模块132还控制自动行走设备100朝向定位标识前进 一段距离,距离判断模块133判断自动行走设备100与停靠站200之间的距离是否为第一距离值L1,之后由第二转向控制模块1375控制自动行走设备100转动,并朝向方向标识220。
特别地,当停靠站标识210位于对接方向上且同时作为定位标识时,定位标识识别模块137a即为停靠站标识识别模块131。
综上,请参考图8,在本发明自动工作系统的第二较佳实施方式中,自动行走设备100到达距离停靠站第一距离值L1后,在处理器130的控制下沿图中的单箭头标记所示的路径行走。
请参考图9及图10,本发明自动工作系统的第三较佳实施方式中,回归路径控制模块137包括:定位标识识别模块137a、角度计算模块1371、边长计算模块1377、半径计算模块1378、第一转向控制模块1373、切点控制模块1379、弧形路径控制模块1370。
定位标识识别模块137a识别定位标识,并控制所述自动行走设备朝向所述定位标识。特别地,当停靠站标识210位于对接方向上且同时作为定位标识时,定位标识识别模块137a即为停靠站标识识别模块131。
角度计算模块1371以定位标识为基点计算方向标识220与中线146之间形成的第一角度α。
若第一角度α为锐角,则以自动行走设备100所处的点A为直角,由中线146、方向标识220的延长线及第一角度α形成一个直角三角形ABC。边长计算模块1377根据第一角度α及第一距离值L1计算直角三角形的另外一条直角边L2及斜边Lx的长度。
半径计算模块1378虚构出一个在斜边Lx上距离停靠站标识210预定的第三距离L3处(D点)与斜边Lx相切、且与直角边L2相切的内切圆。半径计算模块1378进一步计算该内切圆(以O点为圆心)的半径R。
具体地,在直角三角形ABC中计算出角C的第二角度β,第二角度β的角平分线CO与D点处斜边Lx的垂线DO的焦点O即为内切圆的圆心。根据斜边Lx及第三距离L3计算出CD段长度,进而计算三角形CDO中的OD段的长度,OD段的长度即为内切圆的半径R。
第一转向控制模块1373用于控制自动行走设备100朝图像中方向标识所指示的方向旋转90度。此时自动行走设备100的状态如图10所示。
切点控制模块1379用于控制自动行走设备100前进第四距离L4,到达内 切圆与直角边L2的切点E处。由于CD段长度等于CE段的长度,第四距离L4即为AE段长度,第四距离L4为直角边L2与CE段的长度之差。
弧形路径控制模块1370用于根据内切圆的半径R及左右轮子的轮间距2d计算左右两侧的轮子的转速比,并控制自动行走设备100使得左右轮子具有特定的转速比,从而使得自动行走设备100沿预定的弧形路径行走,直到自动行走设备100的行走方向与方向标识220所示的对接方向一致。
具体地,图10中自动行走设备100的左轮距离圆心O的外半径为R+d,右轮距离圆心O的内半径为R-d,左右两侧的轮子的转速比等于外半径R+d与内半径R-d的比值。
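Under one consistent reading of the geometry above (right angle at the device's point A, acute first angle α at the station, legs L1 and L2, hypotenuse Lx, the angle β between the hypotenuse and leg L2 at their shared vertex, and a tangent point a third distance L3 from that vertex along the hypotenuse), the inscribed-circle radius and the outer/inner wheel speed ratio (R+d):(R−d) for a wheel spacing of 2d can be sketched as follows; this is an assumed reconstruction, not the disclosure's exact derivation:

```python
import math

def arc_wheel_speeds(L1, alpha_deg, L3, d):
    """Radius of the circle tangent to hypotenuse Lx and leg L2, plus the
    left/right wheel speed ratio while driving the resulting arc."""
    alpha = math.radians(alpha_deg)
    L2 = L1 * math.tan(alpha)        # tan(alpha) = L2 / L1
    Lx = L1 / math.cos(alpha)        # hypotenuse (shown for completeness)
    beta = math.pi / 2 - alpha       # angle between Lx and L2
    R = L3 * math.tan(beta / 2)      # right triangle at the tangent point
    ratio = (R + d) / (R - d)        # outer wheel speed : inner wheel speed
    return R, ratio

R, ratio = arc_wheel_speeds(L1=2.0, alpha_deg=45, L3=1.0, d=0.15)
print(round(R, 3), round(ratio, 3))  # 0.414 2.135
```

The ratio follows from both wheels sharing the same angular velocity about the circle's center O, with turning radii R+d (outer wheel) and R−d (inner wheel).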
若第一角度α为直角,则以自动行走设备100所处的点为特定的锐角α1,如60度,由中线146、方向标识220的延长线及锐角α1形成一个直角三角形ABC。之后计算出直角三角形ABC的各边长、内切圆的半径等,具体方法与第一角度α为锐角时相似。
若第一角度α为钝角,在方向标识220所在的边取与第一距离值L1相同的长度构建等腰三角形,第一角度α的角平分线与第一角度α的对边L2垂直,从而形成两个直角三角形。之后计算出两个直角三角形的各边长、与方向标识220所在的三角形的两条边相切的内切圆的半径等,具体方法与第一角度α为锐角时相似。
请参考图11,本发明实施例提供的自动行走设备控制方法的包括以下步骤:
步骤S101:自动行走设备100启动回归时,处理器130根据图像采集装置140获取的停靠站200的图像识别停靠站标识210。
处理器130可根据图像中物体的形状、图案,或者综合其形状与图案,以识别该停靠站标识210。
步骤S102:处理器130判断自动行走设备100与停靠站200之间的距离是否小于或等于第一距离值L1,是则进入步骤S1,否则进入步骤S103。
步骤S103:处理器130控制自动行走设备100朝向停靠站标识210前进。
图像采集装置140获取的图像142具有第一特定区域144,处理器130在控制自动行走设备100朝向停靠站标识210前进时,不断调整自动行走设备100的行走方向,使得停靠站标识210始终位于图像142的第一特定区域144中。如此可防止自动行走设备100走偏,提高了回归效率。
处理器130根据图像142中的停靠站标识210的面积值的大小判断自动行 走设备100与停靠站200之间的距离是否为第一距离值L1。如此使得自动行走设备100从距离停靠站200较远的位置移动到距离停靠站200较近的位置。
具体地,处理器130中存储有预定面积值,处理器130计算停靠站标识210对应的图像的面积值,比较计算的停靠站标识210对应的图像的面积值与预定面积值;当计算的停靠站标识210对应的图像的面积值达到预定面积值时,即判断自动行走设备100与停靠站200之间的距离为第一距离值L1。
在另一实施方式中,图像采集装置140获取的图像142具有与停靠站标识210的形状相匹配的第二特定区域,当图像142中的停靠站标识210与第二特定区域基本重叠时,处理器130即判断自动行走设备100与停靠站200之间的距离为第一距离值L1。本实施方式中,第二特定区域可与第一特定区域相同。
步骤S1:所述处理器130根据图像采集装置140获取的停靠站200的图像识别方向标识220。
方向标识220具有特定的图案,处理器130存储有对应的第二预设图案,并比较图像中的图案与第二预设图案,若图像中的图案与第二预设图案匹配,则识别该图案为方向标识220。
步骤S2:处理器130判断自动行走设备100的行走方向是否与方向标识220所示的对接方向重合。具体地,图像采集装置140获取的图像142具有将图像142划分左右两部分的中线146,处理器130比较图像中的中线146与方向标识220的位置关系判断行走方向是否与对接方向重合。
步骤S3:处理器130判断中线146与方向标识220不重合时,则控制自动行走设备100的行走路径,使得自动行走设备100的行走方向与方向标识所示的对接方向重合。
步骤S4:处理器130控制自动行走设备100沿方向标识所示的对接方向与停靠站200对接。
可以理解的是,上述步骤S101、S102、S103也可省略,即处理器130也可直接识别方向标识220,进而判断自动行走设备100的行走方向是否与所述方向标识220表示的对接方向重合,若重合则控制自动行走设备100沿方向标识220所示的对接方向与停靠站对接,若不重合则控制自动行走设备100的行走路径,使得自动行走设备100的行走方向与方向标识220所示的对接方向重合。
请参考图12,本发明自动行走设备控制方法的第一较佳实施方式中,步骤S3包括:
步骤S311:所述处理器控制所述自动行走设备旋转预定的角度;
步骤S312:所述处理器控制所述自动行走设备行走预定的间距s;
步骤S313:所述处理器控制所述自动行走设备转向并朝向停靠站标识;
步骤S314:所述处理器判断所述自动行走设备的行走方向是否与所述方向标识所示的对接方向重合,是则进入步骤S4,否则返回步骤S311。
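The loop of steps S311–S314 is a stepwise sidestep-and-recheck procedure. A schematic control loop with the robot's motion and sensing primitives stubbed out might look like the following; the FakeRobot class is a toy stand-in for illustration, not part of the disclosure:

```python
def return_by_stepping(robot, step_s, max_iters=50):
    """Schematic of steps S311-S314: rotate a fixed angle, advance a fixed
    spacing s, turn back toward the marker, and repeat until the walking
    direction coincides with the docking direction."""
    for _ in range(max_iters):
        if robot.heading_matches_docking_direction():  # step S314 check
            return True
        robot.rotate_fixed_angle()                     # step S311
        robot.advance(step_s)                          # step S312
        robot.turn_toward_marker()                     # step S313
    return False

class FakeRobot:
    """Toy stand-in: becomes aligned after two sidesteps."""
    def __init__(self): self.offset = 2
    def heading_matches_docking_direction(self): return self.offset == 0
    def rotate_fixed_angle(self): pass
    def advance(self, s): self.offset -= 1
    def turn_toward_marker(self): pass

print(return_by_stepping(FakeRobot(), step_s=0.3))  # True
```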
请参考图13,本发明自动行走设备控制方法的第二较佳实施方式中,步骤S3包括:
步骤S321:所述处理器识别所述定位标识,并控制所述自动行走设备朝向所述定位标识。特别地,停靠站标识210也可位于对接方向上并作为定位标识。
步骤S322:所述图像采集装置获取的图像具有将所述图像划分左右两部分的中线,所述处理器计算所述方向标识与所述中线之间形成的第一角度α;
步骤S323:所述处理器根据所述第一距离值L1及第一角度α计算第二距离值L2;
步骤S324:所述处理器控制所述自动行走设备旋转预定的角度;
步骤S325:所述处理器控制所述自动行走设备行走第二距离值L2;
步骤S326:所述处理器控制所述自动行走设备转向并朝向所述方向标识。
请参考图14,本发明自动行走设备控制方法的第三较佳实施方式中,步骤S3包括:
步骤S331:所述处理器识别所述定位标识,并控制所述自动行走设备朝向所述定位标识。特别地,停靠站标识210也可位于对接方向上并作为定位标识。
步骤S332:所述图像采集装置获取的图像具有将所述图像划分左右两部分的中线,所述处理器计算所述方向标识与所述中线之间形成的第一角度α;
步骤S333:所述处理器根据所述自动行走设备所处的点、所述定位标识及第一角度α构建出特定的三角形,根据第一角度α、第一距离值L1计算三角形中的第一角度α的另一条邻边Lx及对边L2边长;
步骤S334:所述处理器计算在邻边Lx上特定的位置处与邻边Lx内切、同时与对边L2相切的内切圆的半径R;
步骤S335:所述处理器控制所述自动行走设备朝所述方向标识所指示的方向旋转预定的角度;
步骤S336:所述处理器控制自动行走设备前进到内切圆与对边L2的切点处;
步骤S337:所述处理器根据内切圆的半径R及左右轮子的轮间距2d计算左右两侧的轮子的转速比,所述处理器控制自动行走设备使得左右轮子具有特定的转速比,从而使得自动行走设备沿预定的弧形路径行走,直到自动行走设备的行走方向与所述方向标识所示的对接方向一致。
本实施方式的自动行走设备控制方法及自动工作系统的有益效果是:能够提高自动行走设备100的回归效率,且实现自动行走设备100与停靠站200的有效对接。
图15是本发明自动工作系统另一实施方式的示意图。自动行走设备100的前部具有用于进行能量传输的对接端子(图未示),所述对接端子的数量和停靠站200供给能量的充电端子202的数量相对应,在本实施例中该对接端子的数量至少两个,停靠站200具有至少两个充电端子202,当自动行走设备100与停靠站200对接时,对接端子与对应的充电端子202连接。
本实施方式中,停靠站200位于工作区域400的边界300上,充电端子202朝向特定的方向设置,如朝向工作区域400设置。
图16为本实施方式中停靠站200的立体图,图17为本实施方式中的停靠站200的主视图,该停靠站设有目标物21。所述目标物21竖直设置于停靠站200的一侧且所述充电端子202垂直地设置于目标物21上,从而所述充电端子202垂直于所述目标物21的所在平面。在本实施方式中,定义了标准方向,所述标准方向即为自动行走设备100与停靠站200进行对接时的对接方向。由上述目标物21的位置设定方式可知,所述标准方向为目标物21的所在表面的中垂线。
目标物21的表面可以设计为平面,也可以设计成带有一定弧度的凸面或者凹面,从而目标物21的表面具有微拱或者微凹。
目标物21形状为多边形图案,如此,在一定水平面内从不同的方向进行观察,所观察到的目标物21的实时图像会产生不同的形变。自动行走设备100可以根据所观察到的目标物21的实时图像的形变不同,而判断自动行走设备100相对于目标物21所处的方位,即自动行走设备100相对于标准方向是否偏移及所偏移的方位。在本实施例中,优选目标物21的形状为矩形。当然,在本发明具体的实施过程中,目标物21的形状不仅仅限于矩形,也不仅仅限于多边形,只需要从不同的方向进行观察时,目标物21能够发生一定的形变,可以根据该形变判断出自动行走设备100相对于目标物21所处的方位即可,例如目标 物21的形状也可设置成圆形。
进一步地,目标物21具有特定的颜色,如:将目标物21分为上下两个部分,第一部分211具有第一颜色如蓝色,第二部分212具有不同于第一颜色的第二颜色如红色。如图18、图19、图20所示,展示了目标物21的三种不同的设计。目标物21也可分为左右两部分,或者内外圈两部分,不同的部分具有不同的颜色。当然,也可以将目标物分为大于2个部分的多个部分,每个部分具有不同的颜色,或者仅仅是相邻的部分具有不同的颜色。当然,在目标物21的不同部分内,也可以设置有标识图像或者是文字或者仅仅是条纹。目标物21也可以仅仅设置为具有两种或者多种不同颜色的区域。目标物21具有特定的颜色,是为了提高图像识别单元14识别目标物21图像的匹配率,实际上是一种优选的实施方式,目标物21也可以仅仅只具有一种颜色。在本实施例中,目标物21设置为上下两个部分,且第一部分211具有第一颜色,第二部分212具有不同于第一颜色的第二颜色。
停靠站200具有供自动行走设备停靠的平板230,平板230平铺于地面或者草地上。自动行走设备100整体位于平板230上时,可以防止地面或者草地不平导致自动行走设备100歪斜使得对接端子无法与充电端子202对接。
自动行走设备100包括壳体110、位于壳体110底部的若干轮子120,位于壳体110内部、用于驱动轮子120的动力系统(图未示),位于壳体110内部的控制模块13,及位于壳体110上的图像采集装置140。动力系统包括电池包或者汽油、传动机构等。
图像采集装置140用于获取目标物21的图像,控制模块13用于对图像采集装置140获取的图像进行处理分析并控制自动行走设备100行走和工作。本实施方式中,图像采集装置140为摄像机。
请参考图21,本发明自动行走设备100的控制模块包括以下工作单元:图像识别单元14,偏移判断单元17及控制单元18。
控制单元18用于控制自动行走设备100工作是否启动以及工作模式的选择,和控制轮子120的速度以及转向等。
图像识别单元14用于判断图像采集装置140获取的实时图像中是否出现目标物21。图像识别单元14根据实时图像中物体的形状、图案、颜色,或者综合其形状与图案、或者形状与颜色等,以识别其是否为目标物21。
在本实施例中,图像识别单元14判断图像采集装置140获取的实时图像中 是否出现目标物21的过程具体如下:控制模块13存储有与目标物21的颜色相对应的像素值,如上所述,若目标物21具有第一部分211和第二部分212,则控制模块13中存储有与第一部分211对应的第一颜色的像素值,和与第二部分212对应的第二颜色的像素值。图像识别单元14对图像采集装置140所采集到的实时图像进行扫描,若扫描出实时图像中有一部分图像具有一部分和第一颜色像素值匹配,另一部分和第二颜色像素值匹配,则识别单元14判断该部分图像即为目标物21的实时图像。
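The two-color target check described above (scan the real-time image for a region whose one part matches the first color's pixel values and whose other part matches the second) can be sketched on a toy grid of color labels; the grid, the labels, and the simple upper/lower half-split are illustrative simplifications:

```python
def find_target(image, color1, color2):
    """Scan a small image (2-D grid of color labels) for a column whose
    upper half matches the first color and lower half the second, mimicking
    the two-part target-object check described above."""
    h = len(image)
    for x in range(len(image[0])):
        column = [image[y][x] for y in range(h)]
        top, bottom = column[: h // 2], column[h // 2:]
        if all(c == color1 for c in top) and all(c == color2 for c in bottom):
            return x  # column where the target is seen
    return None

grid = [
    ["g", "B", "B", "g"],
    ["g", "B", "B", "g"],
    ["g", "R", "R", "g"],
    ["g", "R", "R", "g"],
]
print(find_target(grid, "B", "R"))  # 1
```

A real implementation would match pixel values within a tolerance band (lighting varies outdoors) and require a minimum region size rather than exact per-pixel equality.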
The offset judgment unit 17 compares the real-time image 21′ of the target object 21 captured by the image capture device 140 with a standard image 21″ of the target object 21, and from the change in shape of the real-time image 21′ relative to the standard image 21″ judges the offset of the automatic walking device 100 from the standard direction, thereby determining whether the automatic walking device 100 lies to the left of, to the right of, or in alignment with the target object 21.
Figs. 22 to 27 show how the real-time image 21′ of the target object appears in the imaging area A when the automatic walking device 100 lies at different bearings relative to the target object 21.
In Fig. 24 the automatic walking device 100 is aligned with the target object 21 (its direction of travel lies on the standard direction); as Fig. 25 shows, the shape of the real-time image 21′ is then undeformed relative to the shape of the target object 21. The ratio a/b of a specific edge a on one side of the outline of the target object 21 to a specific edge b on the other side equals the ratio a′/b′ of the corresponding specific edges a′ and b′ of the outline of the real-time image 21′ (that is, a/b = a′/b′). In this case the real-time image 21′ in the imaging area A is the standard image 21″ of the target object 21: the standard image 21″ shows no shape distortion relative to the target object 21, being the image of the target object captured when the automatic walking device 100 lies on the standard direction.
In Fig. 22 the automatic walking device 100 lies to the left of the target object 21; as Fig. 23 shows, the shape of the real-time image 21′ in the imaging area then differs from the shape of the target object 21: the ratio a/b of the specific edge a on one side of the target object's outline to the specific edge b on the other side is greater than the ratio a′/b′ of the corresponding specific edges a′ and b′ of the real-time image 21′ (that is, a/b > a′/b′). This is because the image capture device 140 sees an object differently from different positions: observed from different directions within a given horizontal plane, the real-time image of the target object 21 deforms differently.
In Fig. 26 the automatic walking device 100 lies to the right of the target object 21; as Fig. 27 shows, the shape of the real-time image 21′ in the imaging area again differs from that of the target object 21, but now the ratio a/b is less than the ratio a′/b′ (that is, a/b < a′/b′), for the same reason: viewed from different directions within a given horizontal plane, the real-time image of the target object 21 deforms differently.
The offset judgment unit 17 judges the offset of the automatic walking device 100 from the standard direction (left of, right of, or aligned with the target object 21) from the change in shape of the real-time image 21′ relative to the target object 21. The control module 13 pre-stores the edge-length ratio a/b of the specific edges a and b on either side of the outline of the standard image 21″. When judging, the offset judgment unit 17 compares the ratio a′/b′ of the specific edges a′ and b′ of the outline of the real-time image 21′ with the pre-stored ratio a/b. If a/b is greater than a′/b′, the offset judgment unit 17 judges that the automatic walking device 100 has deviated toward the side of edge a′; if a/b is less than a′/b′, that it has deviated toward the side of edge b′; and if a/b equals a′/b′, that the automatic walking device 100 lies on the standard direction.
In this embodiment the target object 21 is rectangular, so the ratio a/b of the specific edges on either side of the outline of the standard image 21″ equals 1, and the offset judgment unit 17 can simply compare the ratio a′/b′ of the specific edges of the real-time image 21′ with 1. Although the rectangle is chosen here, the shape of the target object 21 is not thereby limited to a rectangle: for other polygons (a parallelogram, trapezoid, hexagon and so on) one need only designate a specific edge on one side of the outline and a specific edge on the other, and the comparison described above still serves to judge the offset of the automatic walking device 100 from the standard direction.
Once the offset judgment unit 17 has judged the offset of the automatic walking device 100 from the standard direction, the control unit 18 of the control module 13 adjusts the walking direction of the automatic walking device 100 until it is aligned with the standard direction, so that the device can advance along the standard direction and walk into the docking station 200 bearing the target object 21. Alternatively, from how the ratio a′/b′ of the specific edges of the current real-time image differs from the pre-stored ratio a/b of the specific edges of the standard image 21″, the control unit 18 computes the offset angle of the automatic walking device 100 from the standard direction, and then drives the device directly into the docking station along an arc matching that offset angle by applying a corresponding speed difference between the left and right wheels.
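For the rectangular case, the comparison against the stored ratio of 1 reduces to a tiny decision function. This is a sketch under stated assumptions, not the patent's implementation: the function name, the return labels, and the small tolerance `eps` (the text compares the ratios exactly) are invented for the example.

```python
# Sketch: judge the offset from the standard direction by comparing the
# real-time edge ratio a'/b' with the stored standard-image ratio a/b
# (equal to 1.0 for a rectangular target).

def judge_offset(a_prime, b_prime, stored_ratio=1.0, eps=0.02):
    """Return which side the device has drifted toward.

    stored a/b > a'/b'  -> device lies toward the side of edge a'
    stored a/b < a'/b'  -> device lies toward the side of edge b'
    equal               -> device lies on the standard direction
    """
    live_ratio = a_prime / b_prime
    if stored_ratio > live_ratio + eps:
        return "toward a'"
    if stored_ratio < live_ratio - eps:
        return "toward b'"
    return "on standard direction"
```

For a non-rectangular polygon the same function applies; only `stored_ratio` changes to the pre-stored a/b of the chosen edges.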
Referring again to Fig. 21, the control module 13 of the automatic walking device 100 of the present invention may further include a return direction confirmation unit 15.
The return direction confirmation unit 15 adjusts the heading of the automatic walking device 100 so that, while the device advances toward the target object 21 (that is, while it returns to the docking station 200), the target object 21 does not drift out of the imaging area of the image capture device 140. This ensures that the automatic walking device 100 does not lose the target object while advancing or while judging its offset from the standard direction.
Figs. 28 to 33 show where the real-time image of the target object 21 falls within the imaging area A of the image capture device 140 for different headings of the automatic walking device 100 toward the target object 21. The imaging area A is a symmetric rectangle; a central region ΔH is formed over a span on either side of its line of symmetry. ΔH may occupy 1% to 40% of the width of the imaging area, the exact proportion depending on the wide-angle range (imaging area) of the image capture device 140: in general, the larger the wide-angle range, the larger the proportion of the imaging area that ΔH occupies. When, as in Fig. 29 or Fig. 33, the real-time image 21′ of the target object 21 lies outside the region ΔH, that is, the target object 21 is not at or near the middle of the imaging area of the image capture device 140, then as the automatic walking device 100 walks on along its present course the target object 21 will drift out of the imaging area. The automatic walking device 100 would then have to search for the target object 21 afresh, with the image recognition unit 14 once again judging whether the real-time image 21′ appears in the imaging area, which greatly reduces the efficiency of the return of the automatic walking device 100 to the docking station 200. When, as in Fig. 31, the real-time image 21′ lies within the region ΔH, that is, the target object 21 is at or near the middle of the imaging area, the target object 21 stays within the imaging range of the image capture device as the automatic walking device 100 walks on along its present course.
The return direction confirmation unit 15 decides whether the current heading of the automatic walking device 100 is the return direction according to whether the real-time image 21′ of the target object 21 lies within the central region ΔH of the imaging area A. When the real-time image 21′ lies within ΔH, the unit judges the current heading to be the return direction: walking on along it, the device keeps the target object 21 within the imaging range A of the image capture device 140. When the real-time image 21′ lies outside ΔH, the unit judges the current heading not to be the return direction: were the device to walk on along it, the target object 21 would drift out of the imaging range A. Once the return direction confirmation unit 15 has judged the current heading not to be the return direction, the control unit 18 adjusts the heading of the automatic walking device 100 according to whether the real-time image 21′ lies to the left or to the right of ΔH. In Fig. 29 the real-time image 21′ lies to the right of the central region ΔH, so the control unit 18 steers the automatic walking device to the right until the real-time image 21′ lies within ΔH; correspondingly, in Fig. 33 the real-time image 21′ lies to the left of ΔH, so the control unit 18 steers the automatic walking device to the left until the real-time image 21′ lies within ΔH.
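The ΔH test above is, in effect, a dead-band controller on the horizontal position of the target's image. The sketch below assumes the image centre and the imaging-area width are given in pixels, and picks a band width of 20% of the area width (an arbitrary value inside the stated 1%-40% range); the function and label names are invented for the example.

```python
# Sketch: decide whether the current heading is the return direction by
# testing whether the target's image centre falls inside the central
# band ΔH of the imaging area A, and which way to steer if not.

def heading_correction(image_center_x, area_width, band_fraction=0.20):
    """Return 'keep', 'turn right' or 'turn left' for the current heading."""
    mid = area_width / 2.0
    half_band = area_width * band_fraction / 2.0
    if image_center_x > mid + half_band:
        return "turn right"   # image right of ΔH: steer right (cf. Fig. 29)
    if image_center_x < mid - half_band:
        return "turn left"    # image left of ΔH: steer left (cf. Fig. 33)
    return "keep"             # inside ΔH: heading is the return direction
```

Widening `band_fraction` (a larger wide-angle lens) makes the controller tolerate more drift before correcting, matching the remark that a larger imaging area warrants a larger ΔH.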
Fig. 34 is a flowchart of the return of the automatic walking device 100 of this embodiment of the invention to the docking station 200, that is, of the complete docking process of the docking system of the automatic walking device.
Step S11: the image capture device 140 captures images and the image recognition unit 14 checks whether the real-time image 21′ of the target object 21 is present; once the real-time image 21′ is found in the imaging area A, go to step S12.
Step S12: from the position of the target object's real-time image 21′ in the imaging area A, the return direction confirmation unit 15 judges whether the automatic walking device 100 is headed in the return direction. If it is not, the control unit 18 adjusts the device's heading until it is. Once the automatic walking device 100 is headed in the return direction, go to step S13.
Step S13: from the change in shape of the target object's real-time image 21′ relative to the standard image 21″, the offset judgment unit 17 judges the offset of the automatic walking device from the standard direction, and according to the result steers the automatic walking device 100 to the target object 21 (that is, back to the docking station 200).
Fig. 35 shows another embodiment of the invention. Here the control module 13 of the automatic walking device 100 further includes a distance calculation unit 16, which calculates the distance between the automatic walking device 100 and the target object 21; the control unit 18 drives the automatic walking device 100 along the return direction until it is within a preset distance range of the target object 21.
Specifically, the control module 13 stores a predetermined length value, calculates the length of at least one specific edge of the real-time image 21′, and compares the calculated edge length with the predetermined length value; when the calculated edge length reaches the predetermined length value, the distance between the automatic walking device 100 and the docking station 200 is judged to be within the preset range.
Alternatively, the control module 13 may store a predetermined area value, calculate the area of at least part of the real-time image 21′, and compare the calculated area with the predetermined area value; when the calculated area reaches the predetermined area value, the distance between the automatic walking device 100 and the docking station 200 is judged to be within the preset range.
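Both distance tests exploit the fact that the target's image grows as the device approaches, so each is a single threshold comparison. The sketch below assumes pixel measurements and invented threshold values; in practice the predetermined length and area values would be calibrated for the chosen camera and target size.

```python
# Sketch: the two distance tests of the distance calculation unit 16.
# A measured edge length (or area) of the real-time image 21' reaching
# a stored threshold means the device is within the preset range.

EDGE_THRESHOLD_PX = 120    # assumed predetermined length value, in pixels
AREA_THRESHOLD_PX = 9000   # assumed predetermined area value, in pixels

def within_range_by_edge(edge_px):
    """True once a measured specific edge of 21' reaches the threshold."""
    return edge_px >= EDGE_THRESHOLD_PX

def within_range_by_area(area_px):
    """True once the measured (partial) area of 21' reaches the threshold."""
    return area_px >= AREA_THRESHOLD_PX
```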
Fig. 36 is a flowchart of the return of the automatic walking device 100 to the docking station 200 in this further embodiment of the invention, that is, of the complete docking process of the docking system of the automatic walking device.
Step S21: the image capture device 140 captures images and the image recognition unit 14 checks whether the real-time image 21′ of the target object 21 is present; once the real-time image 21′ is found in the imaging area A, go to step S22.
Step S22: from the position of the target object's real-time image 21′ in the imaging area A, the return direction confirmation unit 15 judges whether the automatic walking device 100 is headed in the return direction. If it is not, the control unit 18 adjusts the device's heading until it is. Once the automatic walking device 100 is headed in the return direction, go to step S23.
Step S23: the automatic walking device 100 advances along the adjusted return direction while the distance calculation unit 16 calculates the distance between the automatic walking device 100 and the target object 21 in real time; when the device comes within the preset distance range, the control unit 18 stops the automatic walking device 100 and goes to step S24.
Step S24: from the change in shape of the target object's real-time image 21′ relative to the standard image 21″, the offset judgment unit 17 judges the offset of the automatic walking device from the standard direction, and according to the result steers the automatic walking device 100 to the target object 21 (that is, back to the docking station 200).
Fig. 37 shows, for another embodiment of the automatic working system of the invention, an image-recognition-based positioning device 1100 for locating a charging station. It comprises a marker 1110, an image acquisition module 1120, an image-position correspondence module 1130 and a position determination module 1140. The marker 1110 is a three-dimensional body fixed on the charging station. The image acquisition module 1120 is mounted on the walking device and captures image information of the marker 1110. The image-position correspondence module 1130 holds the correspondence between image information of the marker 1110 and the position of the charging station relative to the walking device. The position determination module 1140 compares the marker image information captured by the image acquisition module 1120 with the correspondence, held in the image-position correspondence module 1130, between marker image information and the position of the charging station relative to the walking device, and so determines the position of the charging station relative to the walking device.
In this embodiment the marker 1110 must be placed on the charging station where the image acquisition module 1120 can capture its image information from all directions through 360 degrees, and as far as possible at about the same height as the image acquisition module 1120, to ensure the accuracy of the captured marker information; in practice the marker 1110 is fixed above the charging post of the charging station. The image acquisition module 1120, mounted on the walking device, can be realised by an imaging device 1121, for example a camera.
As Fig. 37 shows, in one embodiment the image-position correspondence module 1130 includes an azimuth correspondence module 1132 and a distance correspondence module 1131. The distance correspondence module 1131 holds the correspondence between image information of the marker 1110 and the distance of the charging station from the walking device; the azimuth correspondence module 1132 holds the correspondence between image information of the marker 1110 and the azimuth of the charging station relative to the walking device.
In this embodiment, different image information of the marker 1110 is captured from different angles, and this differing image information fixes the distance and azimuth of the charging station relative to the walking device, enabling the walking device to locate the charging station accurately. Specifically, take the marker 1110 of Fig. 38-1 as an example: Figs. 39-1 to 39-8 illustrate the image information of that marker captured by the image acquisition module 1120 on the walking device. The proportions of blue and yellow contained in the marker images captured from different azimuths are put in correspondence with the azimuth of the charging station relative to the walking device, and the sizes of the blue and yellow regions in the marker images captured at different distances from the charging station are put in correspondence with the distance of the charging station from the walking device. A person of ordinary skill in the art can, using the prior art, establish by experiment the correspondence between the blue-yellow proportion in the image and the azimuth of the charging station relative to the walking device; likewise, the correspondence between the size of each colour pattern and the relative distance can be obtained by experiment.
As Fig. 37 shows, in one embodiment the position determination module 1140 includes a distance determination module 1141 and an azimuth determination module 1142. The distance determination module 1141 compares the marker image information captured by the image acquisition module 1120 with the correspondence, held in the distance correspondence module 1131, between marker image information and the distance of the charging station from the walking device, and determines that distance. The azimuth determination module 1142 compares the captured image information with the correspondence, held in the azimuth correspondence module 1132, between image information and the azimuth of the charging station relative to the walking device, and determines that azimuth.
In this embodiment, Figs. 39-1 to 39-8 show the image information of the marker 1110 captured when the image acquisition module 1120 is directly in front of, front-left of, directly left of, rear-left of, directly behind, rear-right of, directly right of and front-right of the charging station, respectively. Analysing these images to obtain the colour proportions and comparing them with the azimuth correspondence in the position determination module 1140 locates the direction of the automatic walking device relative to the charging station; analysing the same images to obtain the sizes of the colour patterns and comparing them with the distance correspondence in the position determination module 1140 yields the distance of the charging station from the walking device.
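The azimuth look-up amounts to nearest-neighbour matching of a measured colour proportion against an experimentally built table. The sketch below is an assumption-laden illustration: the table values (blue fraction per azimuth) are invented, since the patent obtains the real correspondence by experiment for a specific marker; the function name is likewise hypothetical.

```python
# Sketch: match the measured blue:total proportion of the marker image
# against a preset table mapping proportions to azimuths, as the azimuth
# determination module 1142 is described as doing.

AZIMUTH_TABLE = {
    1.00: "front",        # only blue visible (assumed, cf. Fig. 39-1)
    0.75: "front-left",
    0.50: "left",
    0.25: "rear-left",
    0.00: "rear",         # only yellow visible (assumed)
}

def nearest_azimuth(blue_fraction):
    """Return the azimuth whose stored blue fraction is closest to the
    measured blue fraction of the captured marker image."""
    key = min(AZIMUTH_TABLE, key=lambda k: abs(k - blue_fraction))
    return AZIMUTH_TABLE[key]
```

The distance look-up works the same way with a table of colour-pattern sizes instead of proportions.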
As Figs. 38-1 and 38-2 show, in one embodiment the marker 1110 is a cylinder on whose outer surface an image identification region 1111 and an image content region 1112 are provided. The image identification region 1111 serves to identify and delimit the image content region 1112: only when the image acquisition module has captured the image identification region 1111 does it go on to capture and analyse the content of the image content region 1112. The image content region 1112 carries the image content that corresponds to the position of the charging station relative to the walking device, using different characters or different colour combinations to encode that position.
As Fig. 38-1 shows, in this embodiment the marker 1110 is a cylinder whose upper and lower red bands form the image identification region 1111, while blue and yellow regions wrap around the cylinder, each occupying half of its circumferential surface in the axial-section direction. Preferably, when the marker 1110 is mounted on the charging station, the marker image of Fig. 39-1 is set as the view from directly in front of the walking device. It should be noted that the position and colour of the image identification region 1111 on the marker 1110 are not limited to the case of Fig. 38-1; any choice that fulfils the identification function may be made.
As Fig. 38-2 shows, in this embodiment the marker 1110 is a cylinder whose upper and lower parts form the image identification region 1111 and may be painted with various colours or identification symbols for identifying the content information of the image content region 1112, which here consists of the characters "L" and "R". It should be noted that the position and content information of the image content region 1112 on the marker 1110 are not limited to the cases of Figs. 38-1 and 38-2; any design that distinguishes the different positions of the charging station relative to the walking device may be chosen.
Fig. 40 is a flowchart of the positioning method of the image-recognition-based positioning device of the invention; the positioning method comprises the following steps.
S210: preset the correspondence between image information of the marker on the charging station and the position of the charging station relative to the walking device.
As Fig. 41 shows, this step further includes: S211, preset the correspondence between marker image information and the distance of the charging station from the walking device; and S212, preset the correspondence between marker image information and the azimuth of the charging station relative to the walking device. Specifically, with reference to Fig. 38-1 and Figs. 39-1 to 39-8, for the marker 1110 of Fig. 38-1 the proportions of blue and yellow contained in the marker images captured from different azimuths are put in correspondence with the azimuth of the charging station relative to the walking device, and the sizes of the blue and yellow regions in the marker images captured at different distances from the charging station are put in correspondence with the distance of the charging station from the walking device.
S220: the walking device captures image information of the marker provided on the charging station.
S230: compare the marker image information captured by the walking device with the preset correspondence and determine the position of the charging station relative to the walking device.
As Fig. 42 shows, this step further includes: S231, compare the captured marker image information with the correspondence between marker image information and the distance of the charging station from the walking device, and determine that distance; and S232, compare the captured marker image information with the correspondence between marker image information and the azimuth of the charging station relative to the walking device, and determine that azimuth.
In this embodiment, Figs. 39-1 to 39-8 show the marker image information captured from directly in front of, front-left of, directly left of, rear-left of, directly behind, rear-right of, directly right of and front-right of the charging station, respectively. Analysing these images to obtain the colour proportions and comparing them with the azimuth correspondence in the position determination module locates the direction of the automatic walking device relative to the charging station; analysing the same images to obtain the sizes of the colour patterns and comparing them with the distance correspondence in the position determination module yields the distance of the charging station from the walking device.
The image-recognition-based positioning device 1100 described above captures, through the image acquisition module 1120, image information of the marker 1110 mounted on the charging station, and through the position determination module compares it with the correspondence, preset in the image-position correspondence module 1130, between marker image information and the position of the charging station relative to the walking device, thereby determining the position of the charging station relative to the walking device. Because the marker 1110 of the invention is three-dimensional, the image acquisition module 1120 can capture its image information from all directions through 360 degrees, making the positioning more accurate. The positioning method of the image-recognition-based positioning device of the invention can locate the charging station relative to the walking device from the front, rear, left, right and other directions with high accuracy.
Those skilled in the art will appreciate that the invention admits of other implementations; any implementation whose technical essence is the same as or similar to that of the invention, and any variation or substitution based on the invention, falls within the scope of protection of the invention.

Claims (20)

  1. An automatic walking device, comprising:
    an image capture device, for capturing real-time images; and
    a control module, connected to the image capture device, for controlling the operation of the automatic walking device;
    characterized in that the control module comprises:
    an image recognition unit, for identifying whether a target object appears in the real-time image captured by the image capture device; and
    an offset judgment unit, which compares the real-time image of the target object captured by the image capture device with a standard image of the target object as capturable with the automatic walking device lying on a standard direction, and judges the offset of the automatic walking device from the standard direction according to the change in shape of the real-time image relative to the standard image.
  2. The automatic walking device according to claim 1, characterized in that the standard direction is the direction of the perpendicular bisector of the surface on which the target object lies.
  3. The automatic walking device according to claim 1, characterized in that the offset judgment unit judges the change in shape of the real-time image relative to the standard image by extracting and comparing shape features of the real-time image and the standard image of the target object.
  4. The automatic walking device according to claim 3, characterized in that, when the target object is a polygonal pattern, the shape features include the edge-length ratio of a specific edge on one side of the outline of the polygonal image to a specific edge on the other side.
  5. The automatic walking device according to claim 4, characterized in that the judgment includes: when the edge-length ratio of the specific edge on one side of the real-time image's outline to the specific edge on the other side is less than the corresponding edge-length ratio of the standard image's outline, the automatic walking device deviates from the standard direction toward said other side; when the edge-length ratio for the real-time image is greater than that for the standard image, the automatic walking device deviates from the standard direction toward said one side; and when the two edge-length ratios are equal, the automatic walking device lies on the standard direction.
  6. The automatic walking device according to claim 4, characterized in that, when the polygon is a rectangle, the edge-length ratio of the specific edge on one side of the standard image's outline to the specific edge on the other side equals 1, and the judgment includes: when the edge-length ratio of the specific edges of the real-time image's outline is less than 1, the automatic walking device deviates from the standard direction toward said other side; when that ratio is greater than 1, the automatic walking device deviates from the standard direction toward said one side; and when that ratio equals 1, the automatic walking device lies on the standard direction.
  7. The automatic walking device according to claim 1, characterized in that, if the offset judgment unit judges the automatic walking device to be offset from the standard direction, the control module adjusts the walking direction of the automatic walking device until its walking direction is aligned with the standard direction.
  8. The automatic walking device according to claim 7, characterized in that, after the walking direction of the automatic walking device has been adjusted into alignment with the standard direction, the control module makes the automatic walking device walk along the standard direction until it reaches the target object.
  9. The automatic walking device according to claim 1, characterized in that the control module further comprises a distance calculation unit; before the offset judgment unit judges the offset from the standard direction, the distance calculation unit calculates the distance between the automatic walking device and the target object, and the control module makes the automatic walking device walk to within a preset distance range of the target object.
  10. The automatic walking device according to claim 1, characterized in that the control module further comprises a return direction confirmation unit; before the offset judgment unit judges the offset from the standard direction, the return direction confirmation unit judges and controls the heading of the automatic walking device so that it points toward the target object.
  11. The automatic walking device according to claim 10, characterized in that the return direction confirmation unit determines that the walking direction of the automatic walking device points toward the target object when the real-time image of the target object captured by the image capture device lies within a preset range of the imaging area of the image capture device.
  12. The automatic walking device according to claim 1, characterized in that the image recognition unit identifies whether the real-time image captured by the image capture device contains an image of the target object by comparing colour features of the real-time image and of the standard image of the target object.
  13. An automatic walking device, comprising:
    an image capture device, for capturing real-time images; and
    a control module, connected to the image capture device, for controlling the operation of the automatic walking device;
    characterized in that the control module comprises:
    an image recognition unit, for identifying whether a target object appears in the real-time image captured by the image capture device; and
    an offset judgment unit, which compares shape features of the real-time image of the target object captured by the image capture device with shape features of a standard image of the target object as capturable with the automatic walking device lying on a standard direction, and judges the offset of the automatic walking device from the standard direction according to the change in shape of the real-time image relative to the standard image.
  14. A docking system for an automatic walking device, comprising:
    a charging station, for supplying electric energy to the automatic walking device, provided with a target object for guiding the automatic walking device back for charging; and
    the automatic walking device, comprising:
    an image capture device, for capturing real-time images; and
    a control module, connected to the image capture device, for controlling the operation of the automatic walking device;
    characterized in that the control module comprises:
    an image recognition unit, for identifying whether the target object appears in the real-time image captured by the image capture device; a return direction confirmation unit, for ensuring that the automatic walking device is headed toward the target object; and
    an offset judgment unit, which compares shape features of the real-time image of the target object captured by the image capture device with shape features of a standard image of the target object as capturable with the automatic walking device lying on a standard direction, and judges the offset of the automatic walking device from the standard direction according to the change in shape of the real-time image relative to the standard image;
    wherein, after the image recognition unit has identified that the real-time image captured by the image capture device contains an image of the target object, the return direction confirmation unit steers the automatic walking device toward the target object; after the return direction confirmation unit has confirmed that the automatic walking device is headed toward the target object, the offset judgment unit begins judging the offset of the automatic walking device from the standard direction, and the automatic walking device is controlled according to the judgment result so as to dock with the charging station.
  15. The docking system for an automatic walking device according to claim 14, characterized in that the standard direction is the direction of the perpendicular bisector of the surface on which the target object lies.
  16. The docking system for an automatic walking device according to claim 14, characterized in that the image recognition unit identifies whether the real-time image captured by the image capture device contains an image of the target object by comparing colour features of the real-time image and of the standard image of the target object.
  17. The docking system for an automatic walking device according to claim 14, characterized in that the outline of the target object is rectangular and the target object has at least two different colours.
  18. The docking system for an automatic walking device according to claim 14, characterized in that the control module further comprises a distance calculation unit; before the offset judgment unit judges the offset from the standard direction, the distance calculation unit calculates the distance between the automatic walking device and the target object, and the automatic walking device is controlled to walk to within a preset distance range of the target object.
  19. The docking system for an automatic walking device according to claim 14, characterized in that, when the target object is a polygon, the shape features include the edge-length ratio of a specific edge on one side of the outline of the polygonal image to a specific edge on the other side.
  20. The docking system for an automatic walking device according to claim 19, characterized in that the judgment includes: when the edge-length ratio of the specific edge on one side of the real-time image's outline to the specific edge on the other side is less than the corresponding edge-length ratio of the standard image's outline, the automatic walking device deviates from the standard direction toward said other side; when the edge-length ratio for the real-time image is greater than that for the standard image, the automatic walking device deviates from the standard direction toward said one side; and when the two edge-length ratios are equal, the automatic walking device lies on the standard direction.
PCT/CN2015/083100 2014-07-02 2015-07-01 Automatic walking device WO2016000622A1 (zh)

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
CN201410312406.3 2014-07-02
CN201410311780.1 2014-07-02
CN201410312406.3A CN105334849A (zh) 2014-07-02 2014-07-02 Automatic walking device control method and automatic working system
CN201410311780.1A CN105334848A (zh) 2014-07-02 2014-07-02 Automatic walking device control method and automatic working system
CN201410386482.9A CN105334850A (zh) 2014-08-07 2014-08-07 Automatic mobile device
CN201410386482.9 2014-08-07
CN201510003318.XA CN105825160B (zh) 2015-01-05 2015-01-05 Image-recognition-based positioning device and positioning method thereof
CN201510003318.X 2015-01-05

Publications (1)

Publication Number Publication Date
WO2016000622A1 true WO2016000622A1 (zh) 2016-01-07

Family

ID=55018462

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2015/083100 WO2016000622A1 (zh) 2014-07-02 2015-07-01 自动行走设备

Country Status (1)

Country Link
WO (1) WO2016000622A1 (zh)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107553497A * 2017-10-20 2018-01-09 苏州瑞得恩光能科技有限公司 Edge positioning device for a solar panel cleaning robot and positioning method thereof
WO2021042486A1 * 2019-09-06 2021-03-11 苏州科瓴精密机械科技有限公司 Automatic working system, automatic walking device and control method therefor, and computer-readable storage medium

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN1470368A * 2002-07-26 2004-01-28 Robot cleaner, robot cleaning system and control method thereof
CN1518946A * 2003-02-06 2004-08-11 Automatic vacuum cleaner system with an external recharging apparatus and method for docking the automatic vacuum cleaner with the external recharging apparatus
CN1660007A * 2001-04-18 2005-08-31 三星光州电子株式会社 External charging apparatus for charging a robot cleaner
CN1876336A * 2005-06-07 2006-12-13 System and method for automatically returning a self-propelled robot to a charging station
JP2007152472A * 2005-12-02 2007-06-21 Victor Co Of Japan Ltd Charging system, charging station, and robot guidance system
CN102545275A * 2010-12-07 2012-07-04 上海新世纪机器人有限公司 Robot automatic charging device and automatic charging method thereof
CN102771246A * 2012-07-05 2012-11-14 芜湖鸿宇智能科技有限公司 Intelligent mower system and intelligent mowing method thereof
KR20130076277A * 2011-12-28 2013-07-08 현대엠엔소프트 주식회사 Route guidance apparatus and method for guiding traffic safety signs
CN103283404A * 2012-03-02 2013-09-11 苏州宝时得电动工具有限公司 Automatic walking device and control method thereof




Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 15815464

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 15815464

Country of ref document: EP

Kind code of ref document: A1