DE102007015552B3 - Self-propelled automotive robot for e.g. treatment of floor space, has lenses and mirrors spaced apart from each other in horizontal direction to guide environment image to image halves of camera module - Google Patents


Info

Publication number
DE102007015552B3
DE102007015552B3
Authority
DE
Germany
Prior art keywords
image
self
camera module
halves
robot
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Expired - Fee Related
Application number
DE200710015552
Other languages
German (de)
Inventor
Martin Dr. Kornberger
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Miele und Cie KG
Original Assignee
Miele und Cie KG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Miele und Cie KG filed Critical Miele und Cie KG
Priority to DE200710015552 priority Critical patent/DE102007015552B3/en
Application granted granted Critical
Publication of DE102007015552B3 publication Critical patent/DE102007015552B3/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/009Carrying-vehicles; Arrangements of trollies or wheels; Means for avoiding mechanical obstacles
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2805Parameters or conditions being sensed
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L9/00Details or accessories of suction cleaners, e.g. mechanical means for controlling the suction or for effecting pulsating action; Storing devices specially adapted to suction cleaners or parts thereof; Carrying-vehicles specially adapted for suction cleaners
    • A47L9/28Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means
    • A47L9/2836Installation of the electric equipment, e.g. adaptation or attachment to the suction cleaner; Controlling suction cleaners by electric means characterised by the parts which are controlled
    • A47L9/2852Elements for displacement of the vacuum cleaner or the accessories therefor, e.g. wheels, casters or nozzles
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01CMEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C11/00Photogrammetry or videogrammetry, e.g. stereogrammetry; Photographic surveying
    • AHUMAN NECESSITIES
    • A47FURNITURE; DOMESTIC ARTICLES OR APPLIANCES; COFFEE MILLS; SPICE MILLS; SUCTION CLEANERS IN GENERAL
    • A47LDOMESTIC WASHING OR CLEANING; SUCTION CLEANERS IN GENERAL
    • A47L2201/00Robotic cleaning machines, i.e. with automatic control of the travelling movement or the cleaning operation
    • A47L2201/04Automatic control of the travelling movement; Automatic obstacle detection

Abstract

The automotive robot (1) has a camera module, i.e. a charge-coupled-device (CCD) chip, for capturing an image. Lenses (6) and mirrors are spaced apart from each other in the horizontal direction to guide one environment image onto each image half of the camera module. The image-detection area is limited to a section which, in the direction perpendicular to the driving plane, corresponds approximately to the height of the robot. The image halves divide the camera module in the horizontal direction. An independent claim is also included for a method for capturing and evaluating a three-dimensional image of the driving environment of an automotive robot.

Description

The invention relates to a self-propelled robot for the treatment of floor surfaces, with a device for capturing and evaluating a three-dimensional image of the driving environment. The invention also relates to a method for capturing and evaluating a three-dimensional image of the driving environment of a self-propelled robot.

The use of self-propelled robots for cleaning floor surfaces or for mowing lawns is known. Simple systems work on a random principle: on collision with an obstacle, they merely change their direction of travel. Complete coverage of the surface therefore occurs at random, and repeated passes over the same areas are accepted. The advantage of simple and inexpensive sensors is bought at the cost of a greater time requirement.

Complex systems feature goal-directed navigation, which includes, among other things, collision avoidance. Meaningful path planning in an unknown environment requires simultaneous localization and mapping (SLAM). Localization is usually supported by the recognition of distinctive points in the environment (landmarks). While executing the planned path, the robot must also avoid obstacles and react to current events. Various methods and strategies using different sensor types are already known for this, e.g. odometry and optical and acoustic distance sensors. Detailed information about the robot's environment and the distances of obstacles is provided by 3D-oriented systems such as laser scanners or stereoscopic camera systems. A stereoscopic camera system resembles the human sense of sight: from the differences between the right and left images, depth information can be obtained by triangulation.
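As an illustration (not part of the patent text), the triangulation mentioned above can be sketched for a rectified stereo pair with the classic pinhole relation Z = f·b/d, where f is the focal length in pixels, b the baseline between the two optical centres, and d the disparity. The numbers in the example are arbitrary.

```python
def depth_from_disparity(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Classic pinhole stereo triangulation: Z = f * b / d.

    focal_px     -- focal length expressed in pixels
    baseline_m   -- horizontal distance between the two optical centres
    disparity_px -- horizontal shift of the same point between left and right image
    """
    if disparity_px <= 0:
        raise ValueError("disparity must be positive for a finite depth")
    return focal_px * baseline_m / disparity_px

# Example: 500 px focal length, 10 cm baseline, 25 px disparity -> 2 m depth
print(depth_from_disparity(500.0, 0.10, 25.0))  # 2.0
```

The inverse dependence on disparity also shows why a wider baseline (here 6–12 cm) helps: it enlarges the disparity for a given depth and thus the measurable depth resolution.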

From DE 196 14 916 A1, a driving robot is known which uses two digital cameras to capture and evaluate a three-dimensional image of the driving environment. Such a solution is very expensive and requires high computational effort with correspondingly powerful processor and memory; it is therefore not suitable for a consumer product.

From EP 0 358 628 A2 and US 5 101 351 A, robots are known in which the detection range of the image is limited to a section below the height of the robot.

The invention therefore addresses the problem of disclosing a self-propelled robot of the type mentioned above, and a method for capturing and evaluating a three-dimensional image of the driving environment of a self-propelled robot, in which the sensor system is simple and inexpensive and thus usable in consumer products.

According to the invention, this problem is solved by a device with the features of claim 1 and by a method with the features of claim 7. Advantageous embodiments and further developments of the invention result from the subclaims.

An embodiment of the invention is shown purely schematically in the drawings and is described in more detail below. The figures show:

Fig. 1 the side view of a self-propelled robot;

Fig. 2 the top view of the robot according to Fig. 1;

Fig. 3 a lens of the robot;

Fig. 4 the image capture of the robot;

Fig. 5 a block diagram of the robot's image processing;

Fig. 6a–c the captured image data within the individual stages of the image processing.

The self-propelled robot, for example a robot vacuum cleaner 1 as shown in Figs. 1 and 2, has, in a known manner, a base plate 2 on which a housing 3 is mounted and which is equipped with a chassis 4. The chassis and the suction device 5 of such an appliance are well known and therefore not described in detail. To perceive its environment, the robot vacuum cleaner 1 has two horizontally spaced lenses 6, each of which captures an image of the environment (see Fig. 6a). The lenses 6 are spaced 6 to 12 cm apart, so that triangulation works well.

The world of a flat robot vacuum cleaner 1 is effectively only quasi-two-dimensional, since a height of 20 cm should not be exceeded in favour of the lowest possible clearance for driving under furniture. It is therefore sufficient to restrict the detection range of the optical system in the direction perpendicular to the driving plane, hereinafter called the vertical direction, to a section corresponding to this height. The lateral extent of the field of view can advantageously be increased to as much as 180°, compared with the human eye. This optical data reduction in the vertical direction, combined with the widened field of view in the driving plane, is achieved by the design of the lenses 6, one of which is shown in Fig. 3. The lens 6 has the shape of a curved cylindrical lens: in the horizontal direction it has a curved focal line 7, while in the vertical direction the focal line 8 is a straight line.

The entire image-capture system of the robot vacuum cleaner is shown in Fig. 4. It consists of two optical subsystems, namely the two lenses 6 described above, two associated mirrors 9 and 10, and a single CCD chip 11 as camera module. The mirrors are aligned so that they project the two height-reduced environment images captured by the lenses one below the other onto the CCD chip 11 (see Fig. 6b). For this purpose, one of the mirrors, in Fig. 4 the mirror 9, is rotated, as indicated by the arrow 12. Instead of the mirrors 9 and 10, prisms or light guides (not shown in the drawings) can also be used.
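A minimal sketch (not from the patent; the top/bottom ordering of the two views is an assumption taken from the "one below the other" projection described above) of how a single sensor frame holding both stacked environment images can be split back into its two halves for further processing:

```python
import numpy as np

def split_stacked_frame(frame: np.ndarray) -> tuple[np.ndarray, np.ndarray]:
    """Split one sensor frame into the two stacked environment images.

    The mirrors project the left and right views one below the other,
    so the top half and bottom half of the chip each hold one view.
    Which half belongs to which lens depends on the mirror geometry
    (an assumption in this sketch).
    """
    h = frame.shape[0] // 2
    return frame[:h], frame[h:h * 2]

# Toy 4x6 "sensor" frame, split into two 2x6 half-images.
frame = np.arange(24).reshape(4, 6)
top, bottom = split_stacked_frame(frame)
assert top.shape == (2, 6) and bottom.shape == (2, 6)
```

Because both views share one chip, a single exposure captures both half-images at the same instant, which avoids the synchronization effort of a two-camera stereo rig.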

No special demands are made on the image resolution in the vertical direction, since it suffices that an obstacle 13 is detected in the field of view. The height of the obstacle does not matter; the robot vacuum cleaner 1 must avoid it in any case. The data of the captured image are therefore reduced further for evaluation. Fig. 5 shows the entire image processing as a block diagram. First, as described above, the two environment images are captured with simultaneous optical data reduction. The two images are projected onto the CCD chip and converted into data. The data volume of the two images is then reduced by suitable statistical means, for example by averaging the grey values of the pixels in each vertical column, to a single pixel in the vertical direction. This yields two line-shaped images, as shown in Fig. 6c. From these two lines, the distance to obstacles 13 or distinctive points 14 is determined by triangulation and used for further navigation.
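The vertical data reduction and the matching of the two resulting line images can be sketched as follows. This is an illustrative simplification, not the patent's implementation: it averages each pixel column to one grey value and then estimates a single global shift between the two lines by minimising the sum of absolute differences, whereas a real system would match individual distinctive points.

```python
import numpy as np

def reduce_to_line(half_image: np.ndarray) -> np.ndarray:
    """Average the grey values of every vertical pixel column to one value,
    turning a 2-D half-image into a 1-D line image (vertical data reduction)."""
    return half_image.mean(axis=0)

def line_disparity(left_line: np.ndarray, right_line: np.ndarray, max_shift: int) -> int:
    """Estimate one global disparity between the two line images by picking
    the horizontal shift with the smallest mean absolute difference.
    A real system would do this per landmark; one shift is a simplification."""
    best_shift, best_cost = 0, float("inf")
    n = len(left_line)
    for d in range(max_shift + 1):
        cost = np.abs(left_line[d:] - right_line[:n - d]).mean()
        if cost < best_cost:
            best_shift, best_cost = d, cost
    return best_shift

# Toy example: a single bright point, shifted by 3 pixels between the views.
left = np.zeros(40); left[20] = 255.0
right = np.zeros(40); right[17] = 255.0
assert line_disparity(reduce_to_line(left[None, :]), reduce_to_line(right[None, :]), 10) == 3
```

The recovered shift is the disparity that, together with the lens spacing and focal length, yields the obstacle distance by triangulation.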

Claims (8)

1. Self-propelled robot (1) for the treatment of floor surfaces, with a device for capturing and evaluating a three-dimensional image of the driving environment, characterized in that a single camera module (CCD chip 11) is used for image capture, and that two horizontally spaced optical subsystems (6, 9, 10) each direct an environment image onto one image half of the camera module (CCD chip 11).

2. Self-propelled robot according to claim 1, characterized in that the image-capture range is limited to a section which, in the direction perpendicular to the driving plane, corresponds at least approximately to the height of the robot.

3. Self-propelled robot according to claim 1 or 2, characterized in that the two image halves divide the camera module (CCD chip 11) in the horizontal direction.

4. Self-propelled robot according to one of claims 1 to 3, characterized in that the image evaluation comprises a reduction of the data of the two image halves generated by the camera module (CCD chip 11) in the direction perpendicular to the driving plane.

5. Self-propelled robot according to claim 4, characterized in that the data of one image half are reduced to one pixel in the direction perpendicular to the driving plane.

6. Self-propelled robot according to claim 4 or 5, characterized by means for triangulating the reduced data of the two image halves of the camera module (CCD chip 11).

7. Method for capturing and evaluating a three-dimensional image of the driving environment of a self-propelled robot (1), characterized in that
• two horizontally spaced optical devices (6, 9, 10) each direct an environment image onto one image half of a camera module (CCD chip 11),
• the data of the two image halves are further reduced by an evaluation circuit in the direction perpendicular to the driving plane,
• and the reduced data of the two image halves are triangulated for distance detection.

8. Method according to claim 7, characterized in that the image-capture range is limited to a section which, in the direction perpendicular to the driving plane, corresponds at least approximately to the height of the robot.
DE200710015552 2007-03-29 2007-03-29 Self-propelled automotive robot for e.g. treatment of floor space, has lenses and mirrors spaced apart from each other in horizontal direction to guide environment image to image halves of camera module Expired - Fee Related DE102007015552B3 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE200710015552 DE102007015552B3 (en) 2007-03-29 2007-03-29 Self-propelled automotive robot for e.g. treatment of floor space, has lenses and mirrors spaced apart from each other in horizontal direction to guide environment image to image halves of camera module

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
DE200710015552 DE102007015552B3 (en) 2007-03-29 2007-03-29 Self-propelled automotive robot for e.g. treatment of floor space, has lenses and mirrors spaced apart from each other in horizontal direction to guide environment image to image halves of camera module

Publications (1)

Publication Number Publication Date
DE102007015552B3 (en) 2008-08-07

Family

ID=39587574

Family Applications (1)

Application Number Title Priority Date Filing Date
DE200710015552 Expired - Fee Related DE102007015552B3 (en) 2007-03-29 2007-03-29 Self-propelled automotive robot for e.g. treatment of floor space, has lenses and mirrors spaced apart from each other in horizontal direction to guide environment image to image halves of camera module

Country Status (1)

Country Link
DE (1) DE102007015552B3 (en)


Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0358628A2 (en) * 1988-09-06 1990-03-14 Transitions Research Corporation Visual navigation and obstacle avoidance structured light system
US5101351A (en) * 1989-04-12 1992-03-31 Nissan Motor Company, Limited Autonomous vehicle using fuzzy control
DE19614916A1 (en) * 1996-04-16 1997-11-06 Detlef Raupach Floor cleaning robot vehicle for use in rooms


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102010020537A1 (en) 2010-05-14 2011-11-17 H&S Robotic Solutions GbR (vertretungsberechtigter Gesellschafter: Bernd-Helge Schäfer, 67661 Kaiserslautern) Passive water surface detector for use in autonomous system of self-propelled lawn mower moved over area of golf course, has sensor elements connected to data evaluation device and generating image with different polarizations from scene
EP2764812A1 (en) * 2013-02-12 2014-08-13 Hako GmbH Cleaning robot
CN103976694A (en) * 2013-02-12 2014-08-13 哈高有限责任公司 Cleaning robot
US9468352B2 (en) 2013-02-12 2016-10-18 Hako Gmbh Cleaning robot
EP3139806B1 (en) 2014-05-08 2019-07-17 Alfred Kärcher SE & Co. KG Self driving and self steering floor cleaning device and method for cleaning a floor surface
EP3037860A3 (en) * 2014-12-24 2016-11-02 Samsung Electronics Co., Ltd. Lens assembly, obstacle detecting unit using the same, and moving robot having the same
US9864914B2 (en) 2014-12-24 2018-01-09 Samsung Electronics Co., Ltd. Lens assembly, obstacle detecting unit using the same, and moving robot having the same
EP3082006A3 (en) * 2015-04-16 2016-12-28 Samsung Electronics Co., Ltd. Cleaning robot and method of controlling the same
US10248125B2 (en) 2015-04-16 2019-04-02 Samsung Electronics Co., Ltd. Cleaning robot and method of controlling the same


Legal Events

Date Code Title Description
8364 No opposition during term of opposition
8320 Willingness to grant licences declared (paragraph 23)
R119 Application deemed withdrawn, or ip right lapsed, due to non-payment of renewal fee