EP2160663A1 - Control system for a robot vehicle - Google Patents

Control system for a robot vehicle (Système de commande pour véhicule robot)

Info

Publication number
EP2160663A1
Authority
EP
European Patent Office
Prior art keywords
vehicle
logic unit
drive system
robot vehicle
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP08759663A
Other languages
German (de)
English (en)
Inventor
Ulrich-Lorenz Benzler
Klaus Marx
Soenke Carstens-Behrens
Wolfgang Niehsen
Thilo Koeder
Christoph Koch
Thomas Brosche
Joachim Platzer
Amos Albert
Sebastian Jackisch
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Robert Bosch GmbH
Publication of EP2160663A1 (fr)
Legal status: Withdrawn

Classifications

    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0246 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using a video camera in combination with image processing means
    • E FIXED CONSTRUCTIONS
    • E04 BUILDING
    • E04H BUILDINGS OR LIKE STRUCTURES FOR PARTICULAR PURPOSES; SWIMMING OR SPLASH BATHS OR POOLS; MASTS; FENCING; TENTS OR CANOPIES, IN GENERAL
    • E04H4/00 Swimming or splash baths or pools
    • E04H4/14 Parts, details or accessories not otherwise provided for
    • E04H4/16 Parts, details or accessories not otherwise provided for specially adapted for cleaning
    • E04H4/1654 Self-propelled cleaners
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0212 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory
    • G05D1/0225 Control of position or course in two dimensions specially adapted to land vehicles with means for defining a desired trajectory involving docking at a fixed facility, e.g. base station or loading bay
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0231 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
    • G05D1/0242 Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using non-visible light signals, e.g. IR or UV signals
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05D SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
    • G05D1/00 Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
    • G05D1/02 Control of position or course in two dimensions
    • G05D1/021 Control of position or course in two dimensions specially adapted to land vehicles
    • G05D1/0268 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means
    • G05D1/0272 Control of position or course in two dimensions specially adapted to land vehicles using internal positioning means comprising means for registering the travel distance, e.g. revolutions of wheels

Definitions

  • the invention relates to a drive system for a robotic vehicle.
  • An improved autonomous lawnmower drive system is known from GB 2 277 152 A1.
  • the known drive system includes a plurality of spaced apart landmarks defining a work surface (lawn).
  • the autonomous lawnmower actively communicates with the landmarks in order to determine its position and to calculate a route on the basis of these position data with the aid of a logic unit.
  • A disadvantage is that a large number of landmarks are necessary and that the robotic lawnmower, due to the provision of a logic unit for calculating the route, is complex and thus prone to failure.
  • From EP 1 704 766 A1 it is known to provide a lawnmower with infrared sensors for analyzing the immediate surroundings of the lawnmower and to control the lawnmower by means of an internal logic unit on the basis of the sensor data.
  • Global position detection is not possible with this known control system, which therefore cannot ensure that the lawn is mowed completely.
  • EP 1 041 220 A2, EP 1 302 611 A2, WO 2005/045162 A1, EP 1 022 411 A2, US 2004 007 4524 A1, WO 2004/019295 A1, EP 1 489 249 A2, EP 6 596 03 A1, ES 2 074 401 A1, ER 2 685 374 A1, JP 2005/257441 A, EP 1 500 83 A1 and KR 2004/101953 A disclose robot vehicles which, after detecting wall contact, change direction by a defined angle and then continue their movement in a straight line.
  • A disadvantage of these robot vehicles is that they are controlled by purely random navigation, so that the working area cannot be traversed optimally. Complete coverage is not guaranteed for arbitrary areas, or takes a correspondingly long time.
  • the invention has for its object to propose an alternative drive system for robotic vehicles, which allows the use of relatively simple robotic vehicles.
  • the invention is based on the idea of providing at least one camera arranged outside the robotic vehicle for detecting the working area and the robotic vehicle.
  • The camera is preferably arranged above the working area so that the largest possible portion of the working area can be detected with the camera, in particular a digital video camera. If the working area is contoured in such a way that it cannot be detected completely with a single camera, it is advantageous to provide at least one second external camera, that is to say a camera located outside the robot vehicle. Furthermore, it is conceivable to mount the at least one camera pivotably and to align it by means of the logic unit in such a way that it follows the movement of the robot vehicle.
  • The camera or cameras generate image data which are transmitted to a logic unit that is likewise located outside the robot vehicle.
  • the logic unit can be part of a camera or can be arranged as a separate component at a distance from a camera, wherein the transmission of the image data can take place, for example, via a data cable and / or a radio interface.
  • For example, a personal computer, a PDA or a mobile phone can serve as the logic unit.
  • The logic unit determines the position of the robot vehicle on the working area and, on the basis of the determined position, calculates driving instructions for moving the robot vehicle over the working area. It is within the scope of the invention for the image data generated by the at least one external camera to be conditioned either in the logic unit or upstream of the logic unit, in particular filtered or processed in some other way.
  • the travel instructions calculated by the logic unit are transmitted via a transmitting unit connected to the logic unit in a signal-conducting manner and received by a receiving unit arranged on the robot vehicle.
  • the receiving unit of the robot vehicle is signal-connected to a control unit of simple construction arranged on the robot vehicle, which controls the drive means of the robot vehicle in accordance with the received driving instructions.
  • the drive means are designed such that with them the robot vehicle can be driven and steered.
  • The drive system according to the invention has significant advantages over known drive systems. Since wide areas can be detected by means of the at least one camera, a single camera is usually sufficient; in any case, fewer cameras are needed than the number of landmarks used in the drive system known from the prior art.
  • The robot vehicle can be of simple design, since the logic unit, which preferably creates a digital map of the working area, is arranged outside the robot vehicle. This in turn means that the robot vehicle is less susceptible to interference and cheaper to produce. The drive system according to the invention as a whole is also less prone to failure, since the logic unit can be arranged in an area largely protected from external environmental influences, for example inside a house or below a canopy.
  • If a commercially available personal computer is used as the logic unit, only a corresponding program has to be installed on it which processes the image data generated by the at least one camera, recognizes the position of the robot vehicle on the basis of these data, and calculates corresponding driving instructions, which are then sent via the transmitting unit, for example a wireless LAN transmitting unit, to the receiving unit of the robot vehicle.
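The PC-side program described in the preceding paragraph can be pictured roughly as the following loop. This is a minimal sketch only, assuming an OpenCV-readable camera stream and a UDP link over the wireless LAN; the address, port and the helper functions detect_robot_pose() and plan_step() are hypothetical placeholders, not part of the patent.

```python
import json
import socket

import cv2  # OpenCV, assumed available for reading the camera stream

ROBOT_ADDR = ("192.168.0.50", 5005)  # assumed address/port of the robot's receiving unit


def detect_robot_pose(frame):
    """Hypothetical placeholder: return (x, y, heading) of the vehicle, or None."""
    return None


def plan_step(pose):
    """Hypothetical placeholder: derive a driving instruction from the pose."""
    return {"speed": 0.3, "steer": 0.0}


def main():
    cam = cv2.VideoCapture(0)  # external camera mounted above the working area
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    while True:
        ok, frame = cam.read()
        if not ok:
            break
        pose = detect_robot_pose(frame)        # position of the vehicle on the working area
        if pose is None:
            continue                           # vehicle not found in this frame
        instruction = plan_step(pose)          # driving instruction based on the position
        sock.sendto(json.dumps(instruction).encode(), ROBOT_ADDR)  # send via the WLAN link


if __name__ == "__main__":
    main()
```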
  • an internal logic unit can also be provided, that is to say a logic unit which is part of the robot vehicle, or is arranged in or on it.
  • the logic unit is arranged in a moisture-proof housing.
  • In this case, the image data captured by the digital camera must be sent via the external transmitting unit to the internal receiving unit on the robot vehicle, which is signal-connected to the internal logic unit; the internal logic unit in turn evaluates the image data and determines appropriate driving instructions for the control unit, which then controls the drive means accordingly.
  • the logic unit and the control unit are combined in one component.
  • the logic unit and the control unit are signal-conducting connected to each other.
  • the camera is a color digital camera.
  • the external logic unit is designed in such a way that it recognizes the inner and / or outer boundary of the working area on the basis of the image data of the at least one camera.
  • the external logic unit can calculate the boundaries based on contrast differences between adjacent pixels.
  • The external logic unit is designed in such a way that the determined boundaries of the working area are included in the calculation of the driving instructions, in particular such that the robot vehicle does not cross the boundaries, i.e. does not leave the working area.
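A possible realisation of the contrast-based boundary detection and of the condition that planned positions remain inside the working area is sketched below. It assumes the working area can be separated from its surroundings by a simple grey-level threshold; the safety margin is an illustrative assumption.

```python
import cv2
import numpy as np


def working_area_contour(frame_bgr):
    """Derive the outer boundary of the working area from contrast differences."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    blurred = cv2.GaussianBlur(gray, (5, 5), 0)
    # contrast-based segmentation: Otsu threshold separates the area from its surroundings
    _, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    return max(contours, key=cv2.contourArea)      # largest contour = outer boundary


def inside_working_area(contour, point_xy, margin_px=10):
    """True if point_xy (tuple of floats) lies inside the boundary with a safety margin."""
    # positive distance = inside; the margin keeps the vehicle from crossing the boundary
    return cv2.pointPolygonTest(contour, point_xy, True) >= margin_px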
  • The logic unit additionally or alternatively recognizes static and/or moving, i.e. temporary, obstacles within the working area and takes these into account when calculating the driving instructions for the robot vehicle, in particular in such a way that the robot vehicle does not collide with the obstacles, i.e. changes its direction of travel or stops.
  • The external logic unit recognizes the orientation of the robot vehicle from the image data and takes this information into account in the calculation of the driving instructions, for example such that the robot vehicle first performs a rotation before it is driven straight ahead.
  • The external logic unit can take further data into account when calculating the driving instructions. It is advantageous if the logic unit takes into account, for example, weather data queried in particular via the Internet or from a weather station belonging to the control system.
  • For example, the external logic unit may be designed such that the robot vehicle is driven to a parking position, for example into a parking garage, in the event of rainfall and/or excessive wind.
  • In addition or as an alternative, the logic unit can take account of time data and/or date data, for example from the Internet or from a clock belonging to the control system, for example such that the robot vehicle only travels over the working area at certain times and/or only on certain days, in particular on weekdays.
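Such weather and calendar constraints could be combined into a single operating-permission check, as sketched below; the weather fields and thresholds are purely illustrative assumptions, not values from the patent.

```python
from datetime import datetime

ALLOWED_WEEKDAYS = {0, 1, 2, 3, 4}   # Monday..Friday (assumed schedule)
ALLOWED_HOURS = range(9, 18)         # operate between 09:00 and 18:00 only (assumed)


def may_operate(now: datetime, weather: dict) -> bool:
    """Decide whether the vehicle may currently traverse the working area."""
    if now.weekday() not in ALLOWED_WEEKDAYS or now.hour not in ALLOWED_HOURS:
        return False
    if weather.get("rain_mm_per_h", 0.0) > 0.0:
        return False                 # rainfall: send the vehicle to its parking position
    if weather.get("wind_m_per_s", 0.0) > 10.0:
        return False                 # excessive wind
    return True
```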
  • Preferably, the logic unit recognizes differently conditioned sections of the working area on the basis of the image data, for example a mown and an unmown section, and this information is used in the calculation of the driving instructions, in particular such that the robot vehicle moves only, or preferably, on one of the sections, in particular the unmown lawn section.
  • the boundaries of the work area can be set manually, in particular such that limits automatically recognized by the logic unit are revised.
  • the logic unit is preferably equipped with a corresponding input unit and / or with a corresponding visualization unit for displaying the work surface or the boundaries of the work surface.
  • Obstacles and/or outer and inner boundaries may be set or removed manually, and it may be possible to specify traversal patterns, i.e. traversal strategies, or to revise traversal strategies proposed by the logic unit.
  • The logic unit is designed in such a way that the driving instructions are calculated such that the working area is traversed according to a specific traversal pattern, i.e. a specific traversal strategy.
  • In this way, a time-optimized and thus energy-optimized traversal of the working area can be realized and/or complete coverage of the working area can be achieved, for example along mutually parallel and/or overlapping tracks.
  • The latter embodiment is particularly advantageous when the robot vehicle is an autonomous lawnmower. It is conceivable that the logic unit proposes different traversal patterns and an operator can select an individually preferred traversal strategy via the input unit.
  • Preferably, defined areas within the working area (inner boundaries) can be marked via the input unit as excluded, i.e. as not to be traveled. It is also conceivable for the logic unit to be provided with new traversal strategies, which can be read in or obtained, for example, via the Internet or from a data carrier, in particular against payment of a fee.
  • a bidirectional communication connection exists between the logic unit and the robot vehicle.
  • This embodiment allows the robot vehicle to send status information to the logic unit, which takes it into account when calculating the driving instructions.
  • For example, after detecting a low accumulator charge state, the logic unit can drive the robot vehicle in such a way that it docks with a charging station.
  • It is also conceivable for the robot vehicle to determine odometry data and/or ground-condition data with appropriate sensors, for example by means of IR sensors, and to transmit these data by means of a transmitting unit to a receiving unit connected to the logic unit.
  • The lawnmower may also communicate over the communication link that it is temporarily taking over the vehicle navigation itself, e.g. when near-field sensors on the vehicle detect an obstacle.
  • For the communication, safety mechanisms known per se can be used, for example communication protocols with checksums, handshaking, etc.
  • In the normal case, the data transfer takes place cyclically. If no data transmission occurs within a defined time window, or if no valid data are transmitted within that window, the system enters a safe state (the robot vehicle stops, for example).
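The checksum protection and the cyclic-transmission watchdog described above could look roughly like the following robot-side sketch; the packet layout (CRC32 prefix plus JSON payload), the port and the timeout are assumptions for illustration only.

```python
import json
import socket
import struct
import time
import zlib

WATCHDOG_S = 0.5  # assumed time window within which valid data must arrive


def stop_vehicle():
    # placeholder for commanding the drive means into the safe state
    print("safe state: stopping the robot vehicle")


def run(port=5005):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.bind(("", port))
    sock.settimeout(0.1)
    last_valid = time.monotonic()
    while True:
        try:
            packet, _ = sock.recvfrom(1024)
            if len(packet) > 4:
                crc, payload = struct.unpack(">I", packet[:4])[0], packet[4:]
                if zlib.crc32(payload) == crc:          # checksum guards against corruption
                    instruction = json.loads(payload)   # valid driving instruction received
                    last_valid = time.monotonic()
                    # ... hand the instruction over to the control unit here ...
        except socket.timeout:
            pass
        if time.monotonic() - last_valid > WATCHDOG_S:
            stop_vehicle()                              # no valid data within the time window
```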
  • the logic unit recognizes the position and / or orientation of the robot vehicle exclusively on the basis of the distinctive shape and / or color of the robot vehicle and possibly tracks the movement.
  • Alternatively or additionally, markings can be provided on the robot vehicle, for example LEDs in a suitable arrangement, in order to facilitate identification of the robot vehicle and thus determination of its position and/or orientation.
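One conceivable way to determine position and orientation from such LED markings is sketched below, assuming one LED colour at the front and another at the rear of the vehicle; the HSV ranges are illustrative values that would have to be tuned for the real LEDs.

```python
import math

import cv2
import numpy as np

RED_LO, RED_HI = (0, 120, 200), (10, 255, 255)       # assumed front-LED colour range (HSV)
GREEN_LO, GREEN_HI = (50, 120, 200), (70, 255, 255)  # assumed rear-LED colour range (HSV)


def _centroid(mask):
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])


def robot_pose(frame_bgr):
    """Return (position, heading) of the vehicle in image coordinates, or None."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    front = _centroid(cv2.inRange(hsv, RED_LO, RED_HI))
    rear = _centroid(cv2.inRange(hsv, GREEN_LO, GREEN_HI))
    if front is None or rear is None:
        return None                                   # markings not visible in this frame
    position = (front + rear) / 2.0                   # vehicle centre between the two LEDs
    heading = math.atan2(front[1] - rear[1], front[0] - rear[0])
    return position, heading
```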
  • the robot vehicle can be executed in a variety of configurations.
  • For example, the robot vehicle may be designed as a snowplow vehicle, a leaf-collecting vehicle, a grass-catching vehicle, a scarifying vehicle, a weed-control vehicle, etc.
  • An embodiment is preferred in which the robot vehicle is designed as a lawnmower with a mowing unit.
  • Preferably, the logic unit not only calculates driving instructions for the robot vehicle on the basis of the image data, but additionally generates, on the basis of the image data, a start instruction and/or a stop instruction for a tool of the robot vehicle, the start instruction or the stop instruction being transmitted by means of the transmitting unit to the receiving unit of the robot vehicle and implemented accordingly by the control unit.
  • For example, the mowing unit is operated only when the robot vehicle is moving on an unmown section of the working area. Likewise, the mowing unit can be switched off when an obstacle, in particular a moving obstacle, is detected by the logic unit in the vicinity of the robot vehicle.
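The resulting start/stop logic for the mowing tool can be summarised in a few lines; the two input flags stand in for the section classification and obstacle detection performed by the logic unit and are assumptions of this sketch.

```python
def tool_instruction(on_unmown_section: bool, obstacle_nearby: bool) -> str:
    """Return 'START' or 'STOP' for the mowing tool of the robot vehicle."""
    if obstacle_nearby:
        return "STOP"   # an (in particular moving) obstacle near the vehicle: switch the mower off
    if on_unmown_section:
        return "START"  # operate the mower only on the unmown section
    return "STOP"       # already mown section: keep the mower switched off
```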
  • the internal or external logic unit automatically calculates a trajectory for the robot vehicle on the basis of the image data acquired by the external camera, in particular continuously.
  • This trajectory is preferably calculated or designed so that the entire working area, or a portion of the working area predetermined by an operator, is traversed at least approximately completely, preferably without any surface section being run over several times.
  • The latter restriction, i.e. the traversal optimization, does not necessarily apply to the last track or the last section of the trajectory, in particular when the diameter(s) of the working area cannot be divided integrally by the track width(s) of the tracks or ring tracks to be traveled.
  • The trajectory calculation preferably takes place by means of the image-processing operation erosion, in particular with a gradually enlarged or reduced erosion filter mask (e.g. a circular mask for round shapes or a rectangular mask for angular shapes).
  • In this way, ring-shaped lanes oriented on the outer boundary and/or the inner boundary are calculated, the diameters of the lanes to be driven becoming either progressively larger or progressively smaller as the erosion filter mask is enlarged or reduced, depending on whether work is started on the inside or on the outside.
  • the calculated lanes are not (exactly) parallel, but their topology (shape) changes in accordance with the neighboring lane.
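The erosion-based construction of the ring tracks can be illustrated as follows, assuming the working area is available as a binary mask; the structuring-element size, which stands in for the spacing between neighbouring ring tracks, is an assumed value that would have to be matched to the vehicle's working width.

```python
import cv2
import numpy as np


def ring_tracks(work_area_mask, track_width_px=20, circular=True):
    """Generate nested ring-track contours by repeatedly eroding the working-area mask."""
    shape = cv2.MORPH_ELLIPSE if circular else cv2.MORPH_RECT   # round vs. angular contours
    kernel = cv2.getStructuringElement(shape, (track_width_px, track_width_px))
    tracks = []
    mask = work_area_mask.copy()
    while cv2.countNonZero(mask) > 0:
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        tracks.extend(contours)        # each contour is one ring track to be driven
        mask = cv2.erode(mask, kernel)  # next, smaller ring, still oriented on the boundary
    return tracks
```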
  • On the basis of distance information determined from the image data of the camera, the logic unit calculates the driving instructions, with which the control unit drives the drive means, in such a way that the robot vehicle traverses the working area in several rounds, i.e. ring tracks, the contours of the rounds being oriented on the inner or outer boundary contour.
  • The contours of the ring tracks preferably approximate the contour of the outer or inner boundary on an enlarged or reduced scale. While one round is being traveled, i.e. while the robot vehicle is traveling on a ring track, the control unit controls the drive means, in accordance with the driving instructions calculated by the logic unit, such that the robot vehicle maintains an approximately constant, round-specific distance (subject to topological changes caused by the erosion) from the inner or outer boundary.
  • After completion of each round, i.e. after a ring track has been traveled by the robot vehicle, preferably completely, the robot vehicle changes to an adjacent, larger or smaller ring track, the contour of which is adapted to the contour of the outer or inner boundary as a result of maintaining the approximately constant distance; this contour corresponds to the boundary contour on a changed scale and with topological changes resulting from the use of the image-processing operation erosion.
  • On this adjacent round, a different, again approximately constant distance from the boundary contour is then maintained.
  • It is advantageous if the width of a ring track corresponds at least approximately to the width of the robot vehicle transverse to the direction of travel, or to the width of a working element of the robot vehicle, for example the width of a cutting blade or a cleaning device, so that the entire working area can be completely "processed", preferably without any surface section being run over several times.
  • The robot vehicle may also be designed as a pool robot vehicle, that is to say in particular as a filtering and/or cleaning vehicle.
  • Such pool robotic vehicles drive in particular at the bottom of a pool, which then forms the work area.
  • The digital camera arranged above the pool or swimming pool, designed in particular as a color camera, can distinguish the robot vehicle from its surroundings, in particular from the blue and generally reflective water surface.
  • For this purpose, the robot vehicle can be equipped with a float that floats on the water surface; it is possible, for example, to carry the float on a rod, in particular a pivotably mounted rod, in particular a telescopic rod.
  • Preferably, the float is displaceable relative to the rod along the rod's longitudinal extent.
  • A further float fixedly connected to the rod is preferably provided below the float, providing sufficient buoyancy to align the rod vertically.
  • the logic unit can determine the exact position of the float and thus of the robot vehicle based on the image data supplied by the camera. Additionally or alternatively, shape matching based on edge detection and / or color segmentation may be performed.
  • the communication between an external logic unit and the control unit preferably takes place via radio, wherein a corresponding receiver can be provided on a guide rod for the float.
  • In this case, the camera communicates via cable or radio with a logic unit designed as an internal logic unit.
  • An embodiment of the float is preferred in which its width, i.e. its extent transverse to the direction of travel, corresponds at least approximately to the width of the robot vehicle or the width of a working element, such as a cleaning device. In particular, no calibration is required in such an embodiment.
  • FIG. 1 shows a schematic representation of a drive system for a robotic vehicle
  • FIG. 2 shows a trajectory calculated automatically by the logic unit
  • FIG. 3 shows a schematic representation of a drive system for a robotic vehicle designed as a pool robot vehicle.
  • FIG. 1 schematically shows a drive system 1 for a robotic vehicle 2 designed as a lawnmower.
  • the robot vehicle 2 comprises drive means, not shown, in particular a drive motor and a steering device for steering the robot vehicle 2 or two drive units, which together form a differential drive.
  • the drive motor is formed in the embodiment shown as an electric motor which is operated by means of a rechargeable battery, also not shown.
  • The robot vehicle 2 is located on a working area 3 (lawn area) with an outer boundary 4. Within the working area 3, inside an inner boundary 5, there is a static obstacle 6, in the present case a flower bed.
  • A charging station 7 for charging the accumulator of the robot vehicle 2 is also provided.
  • the entire work area 3 is optically detected by a camera 8 designed as a digital video camera, which is located outside and above the work area 3.
  • The camera 8 may, for example, be mounted on a house gable. If necessary, several cameras 8 can be provided.
  • the camera 8 is connected via a data cable 9 to a logic unit 10 embodied as a personal computer. Image data is transmitted to the logic unit 10 by the camera 8 via the data cable 9.
  • a radio link can also be provided.
  • the logic unit 10 may alternatively also be integrated in the camera or in the robot vehicle 2.
  • the logic unit 10 comprises a visualization unit 11 (screen) for visualizing the image data, ie the robot vehicle 2 and the work area 3, in particular the outer boundary 4, the inner boundary 5 and the static obstacle 6.
  • The logic unit 10 is connected to an input unit 12, by means of which predefined traversal strategies can be selected and traversal strategies for the robot vehicle 2 can be designed or adapted. Furthermore, areas to be omitted within the working area 3 can be defined via the input unit 12, and outer and inner boundaries 4, 5 can be defined or changed.
  • the logic unit 10 calculates driving instructions for the robotic vehicle 2 on the basis of the image data and any further data or parameters input, for example, via the input unit 12 or via a data carrier or the Internet.
  • The driving instructions are preferably calculated in such a way that the robot vehicle 2 traverses the working area 3 according to a specific traversal strategy, in the present case a meandering strategy with mutually parallel, partially overlapping tracks 13.
  • The driving instructions are calculated so that the outer boundary 4 and the inner boundary 5 are not crossed; the robot vehicle 2 thus remains within the working area 3.
  • Temporarily occurring obstacles are detected and driven around by means of corresponding driving instructions.
  • a driving instruction can also consist in stopping the robot vehicle 2 (temporarily).
  • The logic unit 10, or a computer program installed on it, is designed such that differently conditioned sections (mown/unmown) 3a, 3b are recognized and the driving instructions are calculated such that the unmown section 3b is preferentially traveled.
  • the logic unit 10 in this embodiment is connected via a further data cable 14 to a transmitting unit 15, by means of which the driving instructions are sent to a receiving unit 16 on the robotic vehicle 2.
  • This receiving unit 16 is connected via a further data cable 17 to a control unit 18, the control unit 18 controlling the drive means (not shown) of the robot vehicle 2 on the basis of the driving instructions received from the receiving unit 16 such that the robot vehicle 2 follows the calculated tracks 13 and, upon detection of an obstacle 6 in particular, avoids it or reacts in some other way.
  • Markings 19, in this embodiment LEDs, are provided on the robot vehicle 2 to facilitate its identification in the image data.
  • the robotic vehicle 2 comprises, in addition to the receiving unit 16, a transmitting unit 20, wherein the receiving unit 16 and the transmitting unit 20 can also be designed as a combined receiving and transmitting unit.
  • the robot vehicle 2 transmits status information of the robot vehicle 2 to an external receiving unit 21, which is connected via a data cable 22 to the logic unit 10.
  • the external receiving unit 21 and the external transmitting unit 15 can also be designed as a combined transmitting and receiving unit.
  • The status information transmitted via the data cable 22 to the logic unit 10 is taken into account by the logic unit 10 in the calculation of the driving instructions for the robot vehicle 2, for example such that, upon detection of a low accumulator charge state, the robot vehicle 2 drives directly to the charging station 7 and docks with it.
  • In addition, the logic unit can generate a start command and/or a stop command for the mowing unit (not shown) of the robot vehicle 2, these instructions being sent via the external transmitting unit 15 to the receiving unit 16 and implemented accordingly by the control unit 18. For example, a stop command for the mowing unit is issued when a temporary obstacle is detected, and also when the robot vehicle 2 travels over the already mown section 3a of the working area 3.
  • FIG. 2 shows a possible trajectory for the robotic vehicle 2 calculated automatically by the logic unit 10 on the basis of distance information for the outer boundary 4 of the working area 3 determined from the image data of the camera 8.
  • The control unit 18 controls the drive means of the robot vehicle 2 as a function of these driving instructions, that is to say as a function of the continuously determined distance information relative to the outer boundary 4.
  • The trajectory shown was determined by the logic unit 10 with the aid of the image-processing operation erosion, using a stepwise enlarged erosion filter mask (e.g. a circular mask for round contours and a rectangular mask for angular contours).
  • In addition, a connecting line 24 (radial line) is shown. After each round 23 has been completed, the robot vehicle 2 reaches this (imaginary) connecting line 24. The reaching of the connecting line 24 can be ascertained by means of the camera 8 arranged above, which continuously detects the position of the float. Upon reaching the connecting line 24, the robot vehicle 2 changes to an adjacent, radially further inward, approximately parallel round 23 or ring track.
  • FIG. 3 shows an automatic drive system 1 for a robot vehicle 2 designed as a pool robot.
  • the robot vehicle 2 comprises drive means not shown, in particular a drive motor and a steering device for steering the robot vehicle 2, or, as in the present exemplary embodiment, two drive units which together form a differential drive.
  • the drive motor is formed in the embodiment shown as an electric motor which is operated by means of a rechargeable battery, also not shown.
  • the robot vehicle 2 is located on a work area 3, which is formed by a pool floor.
  • The outer boundary 4 of the working area 3 is formed by the circumferential pool walls.
  • For optically detecting the robot vehicle 2, or more precisely a float 26 floating on a water surface 25, which is carried on a guide rod 27 hinged to the robot vehicle 2 and is moved with the robot vehicle 2 as it travels over the working area 3, a camera 8 designed as a color digital camera is provided, which is arranged above the water surface 25. Since the guide rod 27 is pivotable about a joint 29 relative to the robot vehicle, a further float 28 is provided below the float 26 (which is adjustable relative to the guide rod 27); this further float 28 is fixedly connected to the guide rod 27 and aligns it vertically.
  • The camera 8 is signal-connected to a transmitting unit 15, via which the image data are transmitted to a receiving unit 21 mounted on the guide rod 27, which in turn is signal-connected to a logic unit 10 within the robot vehicle 2.
  • the logic unit 10 is signal-conducting connected to a control unit 18 which acts in a controlling manner on the drive means, not shown.
  • analogous to the exemplary embodiment according to FIG. 1, further transmitting and receiving units for reciprocating communication can be provided.
  • The float 26, indicated only by an arrow, preferably has a width (extent transverse to the direction of travel of the robot vehicle 2) which corresponds to the width of the robot vehicle 2.
  • In this way, the robot vehicle can be positioned correctly even in the outer areas further away from the camera, despite the perspective distortion of the image.
  • The perspective distortion is irrelevant, since the outer edges of the marking or float in the image can always be compared with the adjacent tracks and the robot vehicle can thus be kept at the corresponding distance.
  • The individual rounds or ring tracks then have a substantially rectangular contour, that is to say a contour which is adapted to the rectangular contour of the outer boundary 4 shown in FIG. 3.

Landscapes

  • Engineering & Computer Science (AREA)
  • Architecture (AREA)
  • Physics & Mathematics (AREA)
  • Structural Engineering (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Civil Engineering (AREA)
  • Electromagnetism (AREA)
  • Aviation & Aerospace Engineering (AREA)
  • Radar, Positioning & Navigation (AREA)
  • Remote Sensing (AREA)
  • General Physics & Mathematics (AREA)
  • Automation & Control Theory (AREA)
  • Control Of Position, Course, Altitude, Or Attitude Of Moving Bodies (AREA)
  • Guiding Agricultural Machines (AREA)

Abstract

The invention relates to a control system (1) for a robot vehicle (2), the system comprising at least one external camera (8) configured to generate image data of a working area (3), at least one robot vehicle (2), an external logic unit (10) configured to determine the position of the robot vehicle(s) (2) and to calculate driving instructions for the robot vehicle(s) (2) from the image data generated by the camera (8), an external transmitting unit (15, 20) configured to transmit the driving instructions, a receiving unit (16, 21) configured to receive the driving instructions, and a control unit (18) configured to control drive means of at least one robot vehicle (2) on the basis of the driving instructions.
EP08759663A 2007-06-21 2008-05-16 Système de commande pour véhicule robot Withdrawn EP2160663A1 (fr)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
DE102007028519 2007-06-21
DE102007053311A DE102007053311A1 (de) 2007-06-21 2007-11-08 Ansteuersystem für ein Roboterfahrzeug
PCT/EP2008/056016 WO2008155178A1 (fr) 2007-06-21 2008-05-16 Système de commande pour véhicule robot

Publications (1)

Publication Number Publication Date
EP2160663A1 true EP2160663A1 (fr) 2010-03-10

Family

ID=40030890

Family Applications (1)

Application Number Title Priority Date Filing Date
EP08759663A Withdrawn EP2160663A1 (fr) 2007-06-21 2008-05-16 Système de commande pour véhicule robot

Country Status (4)

Country Link
US (1) US20100299016A1 (fr)
EP (1) EP2160663A1 (fr)
DE (1) DE102007053311A1 (fr)
WO (1) WO2008155178A1 (fr)

Families Citing this family (37)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8515595B2 (en) * 2009-02-05 2013-08-20 Moon Publicity Corporation Shadow shaping to image planetary or lunar surfaces
DE102009027396A1 (de) * 2009-07-01 2011-01-13 Robert Bosch Gmbh Autonome mobile Plattform zur Flächenbearbeitung und Verfahren zur Flächenbearbeitung
US8423225B2 (en) * 2009-11-11 2013-04-16 Intellibot Robotics Llc Methods and systems for movement of robotic device using video signal
US8679260B2 (en) * 2009-11-11 2014-03-25 Intellibot Robotics Llc Methods and systems for movement of an automatic cleaning device using video signal
DE102010008807A1 (de) * 2010-02-22 2011-08-25 Engelskirchen, Jürgen, Dipl.-Ing., 22395 Verfahren zur selbsttätigen Bahnsteuerung eines steuerbaren Objektes
DE102011113099A1 (de) 2011-09-09 2013-03-14 Volkswagen Aktiengesellschaft Verahren zur Bestimmung von Objekten in einer Umgebung eines Fahrzeugs
US9594380B2 (en) * 2012-03-06 2017-03-14 Travis Dorschel Path recording and navigation
US9388595B2 (en) * 2012-07-10 2016-07-12 Aqua Products, Inc. Pool cleaning system and method to automatically clean surfaces of a pool using images from a camera
EP3373097B1 (fr) * 2012-08-14 2024-06-05 Husqvarna AB Tondeuse robotique comportant un système de détection d'objet
TWM451103U (zh) * 2012-10-30 2013-04-21 Agait Technology Corp 行走裝置
WO2014101840A1 (fr) 2012-12-28 2014-07-03 苏州宝时得电动工具有限公司 Système de tondeuse automatique
DE102013107492A1 (de) 2013-07-15 2015-01-15 Koubachi AG System zur Überwachung und Steuerung von Aktivitäten zumindest eines Gartengeräts innerhalb zumindest eines Aktivitätsgebiets
CN104704979B (zh) * 2013-12-17 2016-12-07 苏州宝时得电动工具有限公司 一种自动割草装置
DE102014212399A1 (de) * 2014-06-27 2015-12-31 Robert Bosch Gmbh Arbeitsbereichsmarkiervorrichtung
SE538776C2 (en) * 2014-12-23 2016-11-15 Husqvarna Ab Improved operation of a robotic work tool by determining weather conditions and adapting the operation
JP6014192B1 (ja) * 2015-03-27 2016-10-25 本田技研工業株式会社 無人作業車の制御装置
DE102015209190A1 (de) * 2015-05-20 2016-11-24 Volkswagen Aktiengesellschaft Verfahren zur nutzerdefinierten Bereitstellung eines Fahrzeugs
DE102015220840B4 (de) 2015-10-26 2018-11-15 Siemens Schweiz Ag Steuerung von Reinigungsrobotern
BR102016024151B1 (pt) 2016-01-06 2021-10-13 Cnh Industrial America Llc Meio legível por computador não transitório tangível, sistema e método para controlar pelo menos um veículo agrícola autônomo
US9904283B2 (en) * 2016-03-08 2018-02-27 Fuji Xerox Co., Ltd. Systems and methods employing coded light to dock aerial drones, self-driving cars and surface robots
US20170364924A1 (en) * 2016-06-15 2017-12-21 James Duane Bennett Mobile units for furnishing, repairing and refurbishing residences
US10942990B2 (en) * 2016-06-15 2021-03-09 James Duane Bennett Safety monitoring system with in-water and above water monitoring devices
US10167650B2 (en) 2016-08-10 2019-01-01 Aquatron Robotic Technology Ltd. Concurrent operation of multiple robotic pool cleaners
CN106910198B (zh) * 2017-02-21 2020-06-09 昂海松 一种草坪割草机无电线围栏的边界确定方法
CN107390686A (zh) * 2017-07-17 2017-11-24 深圳拓邦股份有限公司 一种割草机器人控制方法及自动控制割草系统
EP3692229A1 (fr) * 2017-10-04 2020-08-12 Zodiac Pool Systems LLC Procédé et système de nettoyage d'une piscine avec utilisation d'un nettoyeur de piscine automatique et d'un dispositif de cartographie de la piscine
DE102017126495B4 (de) 2017-11-10 2022-05-05 Zauberzeug Gmbh Kalibrierung eines stationären Kamerasystems zur Positionserfassung eines mobilen Roboters
DE102017221134A1 (de) * 2017-11-27 2019-05-29 Robert Bosch Gmbh Verfahren und Vorrichtung zum Betreiben eines mobilen Systems
DE102018104568A1 (de) * 2018-02-28 2019-08-29 Wacker Neuson Produktion GmbH & Co. KG System und Verfahren zur automatisierten Bodenverdichtung
US11519732B2 (en) 2018-08-20 2022-12-06 Zodiac Pool Systems Llc Mapping and tracking methods and systems principally for use in connection with swimming pools and spas
DE102018133165B4 (de) * 2018-12-20 2021-03-18 Peter Beeken Verfahren und System zur Bestimmung einer oder mehrerer Spielfiguren
JP7142597B2 (ja) * 2019-04-01 2022-09-27 ヤンマーパワーテクノロジー株式会社 走行領域形状登録システム
CN109901594A (zh) * 2019-04-11 2019-06-18 清华大学深圳研究生院 一种除草机器人的定位方法及系统
WO2020251477A1 (fr) * 2019-06-14 2020-12-17 National University Of Singapore Système automatisé pour la pollinisation de cultures
AT523051B1 (de) * 2019-12-06 2021-05-15 Tsp Gmbh Wendevorrichtung zum Wenden von Trocknungsgut
US11789350B2 (en) * 2020-05-11 2023-10-17 Anthony Goolab Celestial body image projection system
WO2024035727A1 (fr) * 2022-08-08 2024-02-15 Zodiac Pool Systems Llc Piscines et spas avec vision de bassin

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004090232A1 (fr) * 2003-04-14 2004-10-21 Wacker Construction Equipment Ag Systeme et procede de compactage de sol automatise
WO2006080997A2 (fr) * 2005-01-25 2006-08-03 Deere & Company Planificateur de parcours et procede de planification d'un plan de parcours presentant un element de spirale

Family Cites Families (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US556937A (en) 1896-03-24 powers
NL8402445A (nl) 1984-01-20 1985-08-16 Philips Nv Werkwijze voor het coderen van n-bits informatiewoorden naar m-bits codewoorden, inrichting voor het uitvoeren van die werkwijze, werkwijze voor het decoderen van m-bits codewoorden naar n-bits informatiewoorden en inrichting voor het uitvoeren van die werkwijze.
AU653958B2 (en) 1990-09-24 1994-10-20 Andre Colens Continuous, self-contained mowing system
JPH04300798A (ja) * 1991-03-28 1992-10-23 Mitsubishi Heavy Ind Ltd 水中壁面作業ロボット
AU6383494A (en) 1993-04-03 1994-10-24 Cat Systems Limited Localising system
ES2074401B1 (es) 1993-10-06 1996-04-01 Garcia Felix Alonso Filtro de depuradora de agua potable.
DE4344273C2 (de) 1993-12-23 1995-12-14 Bayerische Motoren Werke Ag Einfüllstutzen für einen Kraftstoffbehälter eines Kraftfahrzeuges
IT1267730B1 (it) * 1994-06-14 1997-02-07 Zeltron Spa Sistema di telecomando programmabile per un veicolo
IL113913A (en) 1995-05-30 2000-02-29 Friendly Machines Ltd Navigation method and system
US5974347A (en) 1997-03-14 1999-10-26 Nelson; Russell G. Automated lawn mower
IL124413A (en) * 1998-05-11 2001-05-20 Friendly Robotics Ltd System and method for area coverage with an autonomous robot
FR2781243A1 (fr) 1998-07-20 2000-01-21 Jean Pierre Pappalardo Dispositif de nettoyage automatique d'un bassin a deplacements controles
US6412133B1 (en) 1999-01-25 2002-07-02 Aqua Products, Inc. Water jet reversing propulsion and directional controls for automated swimming pool cleaners
US6971136B2 (en) 1999-01-25 2005-12-06 Aqua Products, Inc. Cleaner with high pressure cleaning jets
US6299699B1 (en) 1999-04-01 2001-10-09 Aqua Products Inc. Pool cleaner directional control method and apparatus
ES2600519T3 (es) * 2001-06-12 2017-02-09 Irobot Corporation Procedimiento y sistema de cobertura plurimodal para un robot autónomo
ATE283949T1 (de) 2001-10-15 2004-12-15 Aqua Products Inc Schwimmbeckenreinigungsverfahren und -gerät
AU2003268163A1 (en) 2002-08-23 2004-03-11 Aqua Products Inc. Pool cleaner with on-board water analysis, data recording and transmission device
IL156535A (en) 2003-06-19 2006-12-10 Maytronics Ltd Pool cleaning apparatus
DE602004016551D1 (de) 2003-11-04 2008-10-23 Aqua Products Inc Richtungssteuerung für doppelbürsten-schwimmbeckenreinigungsroboter
JP2005257441A (ja) 2004-03-11 2005-09-22 Ebara Kogyo Senjo Kk 原子炉発電施設のプールの清掃方法及び装置
KR100576315B1 (ko) 2004-08-19 2006-05-03 주식회사 에스피레저 수류에 따른 장애물 감지구조를 갖는 수영장 청소로봇
DE102005013365A1 (de) 2005-03-23 2006-09-28 Wolf-Garten Ag Messvorrichtung und Verfahren zur Bodenoberflächenanalyse für Rasenpflege-Roboter
CN102740279B (zh) 2011-04-15 2014-12-17 中兴通讯股份有限公司 一种无线网络接入终端及其运行方法

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2004090232A1 (fr) * 2003-04-14 2004-10-21 Wacker Construction Equipment Ag Systeme et procede de compactage de sol automatise
WO2006080997A2 (fr) * 2005-01-25 2006-08-03 Deere & Company Planificateur de parcours et procede de planification d'un plan de parcours presentant un element de spirale

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of WO2008155178A1 *

Also Published As

Publication number Publication date
US20100299016A1 (en) 2010-11-25
WO2008155178A1 (fr) 2008-12-24
DE102007053311A1 (de) 2008-12-24

Similar Documents

Publication Publication Date Title
WO2008155178A1 (fr) Système de commande pour véhicule robot
EP2758841B1 (fr) Appareil de travail autonome
EP3400162B1 (fr) Procédé de stationnement répété d'un véhicule dans une zone de stationnement en fonction d'une reconnaissance d'objets de différentes catégories d'objets, système d'aide au stationnement pour un véhicule et véhicule
EP3181422B1 (fr) Procédé et système de commande automatique d'un véhicule suiveur comprenant un véhicule scout
WO2019101651A1 (fr) Procédé et dispositif de fonctionnement d'un système mobile
EP3326040B1 (fr) Procédé et système pour la commande automatique d'au moins un véhicule suiveur avec un véhicule éclaireur
EP2746138B1 (fr) Système d'assistance au conducteur et procédé d'autorisation de parcage autonome ou piloté dans un garage
DE112020004931T5 (de) Systeme und verfahren zur bestimmung der verkehrssicherheit
DE102015225238B4 (de) Verfahren und System zur automatischen Steuerung eines Folgefahrzeugs mit einem Scout-Fahrzeug
DE112020000925T5 (de) Systeme und verfahren zur fahrzeugnavigation
WO2016083038A1 (fr) Procédé et dispositif de conduite assistée d'un véhicule
DE102008011947A1 (de) Roboterfahrzeug sowie Ansteuerverfahren für ein Roboterfahrzeug
DE102014221751A1 (de) Verfahren und Vorrichtung zum Führen eines Fahrzeugs auf einem Parkplatz
WO2015197353A2 (fr) Procédé de création d'un modèle d'environnement d'un véhicule
DE112020002869T5 (de) Navigationssysteme und verfahren zum bestimmen von objektabmessungen
WO2009138140A2 (fr) Procédé de commande d'un véhicule robotisé et véhicule robotisé
EP3167699B1 (fr) Engin de travail autonome
DE112021004128T5 (de) Systeme und verfahren für kartenbasierte modellierung der realen welt
DE112022000380T5 (de) Systeme und verfahren zur einheitlichen geschwindigkeitskartierung und navigation
DE112020006427T5 (de) Systeme und verfahren zum detektieren von ampeln
DE102017212908A1 (de) Verfahren zur Verbesserung der Quer- und/oder Längsführung eines Fahrzeugs
DE102014221763A1 (de) Verfahren zur automatischen Steuerung von Objekten innerhalb eines räumlich abgegrenzten Bereichs, der für die Herstellung oder Wartung oder das Parken eines Fahrzeugs vorgesehen ist
EP3688543B1 (fr) Procédé de navigation d'un robot et robot pour la mise en oeuvre du procédé
EP2788830A1 (fr) Procédé et dispositif de commande pour guider une machine agricole
DE102019129831A1 (de) Automatisches kupplungssystem mit auswahl des betreffenden anhängers aus mehreren identifizierten anhängern

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20100121

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA MK RS

DAX Request for extension of the european patent (deleted)
17Q First examination report despatched

Effective date: 20100915

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20110126