EP3507666A1 - System for sensing position and method for sensing position - Google Patents
System for sensing position and method for sensing position
- Publication number
- EP3507666A1 (application number EP17740622.0A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- optically readable
- readable code
- coordinate system
- dimensional optically
- camera
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
- G05D1/0234—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means using optical markers or beacons
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/70—Determining position or orientation of objects or cameras
- G06T7/73—Determining position or orientation of objects or cameras using feature-based methods
- G06T7/74—Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V30/00—Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
- G06V30/10—Character recognition
- G06V30/22—Character recognition characterised by the type of writing
- G06V30/224—Character recognition characterised by the type of writing of printed characters having additional code marks or containing code marks
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
Definitions
- The invention relates to a system for position detection and a method for position detection.
- The invention is therefore based on the object of developing a system for position detection and a method for position detection, wherein the system and the method should be simplified.
- The object is achieved by the system for position detection according to the features specified in claim 1 and by the method for position detection according to the features specified in claim 8.
- The system comprises the object, a camera connected to the object, and a two-dimensional optically readable code, in particular a QR code.
- The two-dimensional optically readable code is spaced apart from the object, and the camera is arranged to detect it; the spatial position of the two-dimensional optically readable code, in particular in a world coordinate system, is known.
- The advantage here is that information is written into the two-dimensional optically readable code, such as its spatial position in the world coordinate system and/or its size, i.e. the length and width of the two-dimensional optically readable code.
- In addition, the two-dimensional optically readable code has positioning symbols that can be used to determine the orientation of the optically readable code in space.
- The two-dimensional optically readable code thus functions both as a storage means and as a reference point for determining the spatial position. Both pieces of information can be captured in a simple manner by the optical camera in a single step.
- The two-dimensional optically readable code carries the following information: spatial position information of the two-dimensional optically readable code in the world coordinate system, and/or
- size information of the two-dimensional optically readable code, in particular indicating the length and width of the two-dimensional optically readable code, and/or identity information.
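As a concrete illustration, the information listed above could be serialized into the code as a small JSON payload. This is a hypothetical layout; the field names and values are invented and the patent does not prescribe any particular encoding:

```python
import json

# Hypothetical payload for one code: an identity, a world pose
# (x, y, z in metres, yaw in radians) and the printed edge lengths.
payload = {
    "id": "code-0042",
    "pose_world": {"x": 12.5, "y": 3.0, "z": 1.2, "yaw": 1.5708},
    "size": {"width_m": 0.20, "height_m": 0.20},
}

encoded = json.dumps(payload)   # the string that would be printed as the QR code
decoded = json.loads(encoded)   # what the evaluation means recovers after scanning
```

Whatever format is chosen, the payload must stay small enough for the printed code itself to remain compact and quickly decodable.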
- The system has a storage means, in particular wherein the object has the storage means, wherein the storage means is arranged to store an assignment of the two-dimensional optically readable code to the respective size information and/or the respective spatial position information.
- The advantage here is that the two-dimensional optically readable code can be made compact, since the identity information of the two-dimensional optically readable code can be linked in the storage means with the spatial position information and/or the size information of the two-dimensional code.
- The system has an evaluation means, in particular wherein the object has the evaluation means, wherein the evaluation means is arranged to determine, from an image of the two-dimensional optically readable code captured by the camera and from the size information, the spatial position of the two-dimensional optically readable code in a first coordinate system whose origin lies in the camera.
- The object is a driverless transport vehicle, the movement of which is controllable by means of the system for position detection, in particular wherein the evaluation means, acting as the control means of the driverless transport vehicle, has all the means for control and navigation in the world coordinate system.
- The driverless transport vehicle thus navigates autonomously by means of the two-dimensional optically readable code, or a plurality of such codes, in the world coordinate system, in particular within a warehouse and/or a factory hall.
- The camera is rigidly connected to the object, wherein the spatial position of the camera in a second coordinate system, in whose origin the object lies, is constant.
- The advantage here is that the spatial position of the two-dimensional optically readable code in the first coordinate system can be converted in a simple manner into its spatial position in the second coordinate system.
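In the planar case this conversion is a single composition of rigid transforms. The following sketch uses invented example numbers and assumes each pose is given as an (x, y, theta) triple:

```python
import math

def compose(a, b):
    """Compose two planar poses (x, y, theta): the pose of b expressed
    in the frame in which a is given (a followed by b)."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

# Fixed, known mounting of the camera in the object (vehicle) frame.
camera_in_object = (0.5, 0.0, 0.0)        # 0.5 m ahead of the vehicle origin
# Pose of the code as measured in the camera frame (first coordinate system).
code_in_camera = (2.0, 1.0, math.pi / 2)

code_in_object = compose(camera_in_object, code_in_camera)
# -> (2.5, 1.0, pi/2): the code is 2.5 m ahead and 1 m to the left
```

Because the camera is rigidly mounted, `camera_in_object` is a constant that can be calibrated once and stored.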
- Alternatively, the camera is pivotally connected to the object, wherein the system comprises an angle sensor for detecting a pivot angle of the camera relative to the object, and wherein the spatial position of the camera in a second coordinate system, in whose origin the object lies, is dependent on the pivot angle.
- The advantage here is that a camera pivotally mounted on the object can detect the two-dimensional optically readable code over a larger area than a camera rigidly connected to the object.
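With a pivoting mount, the sensed pivot angle enters the conversion into the object frame. A minimal planar sketch, assuming the camera pivots about a fixed mounting point on the vehicle (all numbers invented):

```python
import math

def camera_point_to_object(px, py, mount, pan):
    """Map a point measured in the pivoting camera's frame into the
    object frame: rotate by the sensed pan angle, then shift by the
    fixed mount offset."""
    mx, my = mount
    return (mx + px * math.cos(pan) - py * math.sin(pan),
            my + px * math.sin(pan) + py * math.cos(pan))

# Code centre seen 2 m straight ahead of a camera panned 90 degrees left,
# with the mount 0.5 m ahead of the vehicle origin:
pt = camera_point_to_object(2.0, 0.0, (0.5, 0.0), math.pi / 2)
# -> approximately (0.5, 2.0) in the vehicle frame
```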
- The method comprises the following method steps:
- In a first method step, a two-dimensional optically readable code, in particular a QR code, is detected by a camera.
- In a second method step, which takes place temporally after the first method step, size information of the two-dimensional optically readable code is evaluated from an image of the code captured by the camera, and from the size information and the image the spatial position of the code in the first coordinate system, whose origin lies in the camera, is determined.
- In a third method step, which takes place temporally after the second method step, the spatial position of the code in the second coordinate system, in whose origin the object lies, is determined from the spatial position of the code in the first coordinate system and the spatial position of the camera in the second coordinate system.
- In a fourth method step, which takes place temporally after the third method step, the relative orientation of the object to the code in the second coordinate system is determined from the spatial position of the code in the second coordinate system.
- In a fifth method step, which takes place temporally after the fourth method step or simultaneously with the second and/or third and/or fourth method step, the image is evaluated and the spatial position of the code in the world coordinate system is obtained.
- In the fifth method step, identity information of the two-dimensional optically readable code is evaluated from the image.
- The advantage here is that the two-dimensional optically readable code can be made compact.
- In the second method step, the distance of the two-dimensional optically readable code to the camera is determined from the relative size of the code in the image and the size information.
- The advantage here is that the distance of the two-dimensional optically readable code to the camera can be determined from the code itself; no further reference point is needed.
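Under a simple pinhole camera model, this distance follows directly from the stored real size and the apparent size in the image: Z = f * S / s. A sketch with assumed values (focal length in pixels, real code edge length, apparent edge length in pixels are all invented):

```python
def distance_from_size(focal_px, real_size_m, apparent_size_px):
    """Pinhole model: range Z = f * S / s, where f is the focal length
    in pixels, S the real edge length and s the apparent edge length."""
    return focal_px * real_size_m / apparent_size_px

# Assumed: 800 px focal length, 0.20 m printed code edge, 64 px in the image.
z = distance_from_size(800.0, 0.20, 64.0)   # -> 2.5 m
```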
- The advantage here is that the two-dimensional optically readable code has positioning symbols and a predetermined shape, in particular a rectangular, in particular square, shape, from which the perspective distortion can be determined in a simple manner from the image.
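As a simplified illustration of how the known shape constrains orientation: if a square code is rotated purely about the vertical axis, its apparent width is foreshortened by the cosine of the rotation angle while its height is not. This toy estimate is sign-ambiguous, ignores lens distortion and tilt, and is a sketch rather than the patent's actual method:

```python
import math

def yaw_from_foreshortening(apparent_w_px, apparent_h_px):
    """For a square code viewed under pure yaw rotation, width/height
    is approximately cos(yaw); invert to get a rough magnitude."""
    ratio = min(apparent_w_px / apparent_h_px, 1.0)  # clamp noise above 1
    return math.acos(ratio)

yaw = yaw_from_foreshortening(55.0, 64.0)   # roughly 0.54 rad (about 31 degrees)
```

A full implementation would instead fit a homography to the four corners located via the positioning symbols.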
- The spatial position of the two-dimensional optically readable code in the first coordinate system is determined from the distance of the code to the camera and the orientation of the code relative to the camera.
- The advantage here is that, from the spatial position of the two-dimensional optically readable code in the first coordinate system, the spatial position of the object relative to the code can be determined.
- In the second method step, the spatial position of the two-dimensional optically readable code in the first coordinate system is determined from the distance of the code to the camera and the orientation of the code relative to the camera.
- FIG. 1 schematically shows a system according to the invention for position detection.
- FIG. 2 shows a method according to the invention for position detection in a schematic representation.
- The position detection system shown in FIG. 1 has a two-dimensional optically readable code 3, in particular a QR code, and an object 1, in particular a vehicle, which is movable in space.
- The object 1, whose spatial position is detected by means of the system, has a camera 2, an evaluation means and a storage means.
- The spatial position of an object is the combination of the position and the orientation of the object in space.
- The position in the Cartesian coordinate system is given by three coordinates (x, y, z).
- The orientation is specified by the angular offset of a coordinate system attached to the object relative to the reference coordinate system.
- The spatial position is thus specified by means of six coordinates.
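These six coordinates can be collected in a small data structure; a sketch in Python, where the field names are illustrative and not taken from the patent:

```python
import math
from dataclasses import dataclass

@dataclass
class Pose:
    """Spatial position: three position coordinates plus three angular
    offsets (the orientation), i.e. six coordinates in total."""
    x: float
    y: float
    z: float
    roll: float
    pitch: float
    yaw: float

# Illustrative pose: 1 m along x, 2 m along y, facing the opposite direction.
p = Pose(x=1.0, y=2.0, z=0.0, roll=0.0, pitch=0.0, yaw=math.pi)
```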
- The camera 2 is rigidly connected to the object 1.
- A lens of the camera 2 is arranged on the camera 2 facing away from the object 1.
- The spatial position of the camera 2 relative to the object, that is to say in a second coordinate system 5 whose origin lies in the object 1, is known and stored in the storage means.
- The orientation of the lens in the second coordinate system 5 is also known and stored in the storage means.
- The two-dimensional optically readable code 3 is spaced apart from the object 1.
- The two-dimensional optically readable code 3 has the following information: spatial position information of the two-dimensional optically readable code 3 in a world coordinate system 6, and/or size information, and/or identity information.
- The world coordinate system 6 designates the origin coordinate system with which the associated relative coordinate systems, i.e. the first coordinate system 4 and the second coordinate system 5, are referenced.
- The world coordinate system 6 gives the spatial location within an industrial plant such as a warehouse or a factory hall.
- The identity information and the spatial location information of the two-dimensional optically readable code 3 are stored in the storage means.
- The evaluation means reads out from the storage means the spatial position information of the two-dimensional optically readable code 3 in the world coordinate system 6 that is associated with the identity information, as well as the size information.
- The codes 3 are arranged stationary, i.e. immovable, in the world coordinate system 6.
- The camera 2 has a first coordinate system 4 whose origin lies in the camera 2. From a photograph of an object taken by the camera 2 and the actual size of that object, the spatial position of the object in the first coordinate system 4 can be determined by means of the evaluation means.
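The assignment held in the storage means can be pictured as a simple mapping from identity information to the stored world pose and size. The structure and all values below are invented for illustration:

```python
# Hypothetical contents of the storage means:
# identity -> planar world pose (x, y, yaw) and printed edge length.
code_registry = {
    "code-0042": {"pose_world": (12.5, 3.0, 1.5708), "size_m": 0.20},
    "code-0043": {"pose_world": (20.0, 3.0, 1.5708), "size_m": 0.20},
}

def lookup(identity):
    """What the evaluation means does after decoding the identity:
    fetch the stored world pose and size for that code."""
    entry = code_registry[identity]
    return entry["pose_world"], entry["size_m"]

pose, size = lookup("code-0042")
```

Keeping pose and size in the registry rather than in the code itself is what allows the printed code to stay compact.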
- Alternatively, the camera 2 is pivotally connected to the object 1.
- The pivot angle of the camera 2 is detected by means of an angle sensor.
- To determine the spatial position of the camera 2 in the world coordinate system 6, the position of the camera 2 and the pivot angle are used.
- The object 1 is designed as a driverless transport vehicle.
- The driverless transport vehicle is controlled by the evaluation means, wherein the evaluation means determines the spatial position of the driverless transport vehicle in the world coordinate system 6 by means of the two-dimensional optically readable codes 3 and thus controls the direction of travel of the driverless transport vehicle.
- The method according to the invention for position detection of an object, shown in FIG. 2, has the following temporally successive method steps:
- In a first method step A, the two-dimensional optically readable code 3 is detected by the camera 2.
- In a second method step B, the size information is evaluated from an image of the two-dimensional optically readable code 3 recorded by the camera 2. From the size information and the image of the code 3, the spatial position of the code 3 in the first coordinate system 4 is determined.
- In this step, the distance of the two-dimensional optically readable code 3 to the camera 2 is determined from the relative size of the code 3 in the image and the size information. From the perspective distortion of the code 3 in the image, the orientation of the code 3 relative to the camera 2 is determined.
- The spatial position of the two-dimensional optically readable code 3 in the first coordinate system 4 is then determined from the distance to the camera 2 and the orientation relative to the camera 2.
- In a third method step C, the spatial position of the two-dimensional optically readable code 3 in the second coordinate system 5 is determined from the spatial position of the code 3 in the first coordinate system 4 and the spatial position of the camera 2 in the second coordinate system 5.
- In a fourth method step D, the relative orientation of the object 1 to the two-dimensional optically readable code 3 in the second coordinate system 5 is determined from the spatial position of the code 3 in the second coordinate system 5.
- In a fifth method step E, which takes place after the fourth method step D or simultaneously with the second method step B and/or the third method step C and/or the fourth method step D, the identity information of the two-dimensional optically readable code 3 is evaluated and the spatial position information assigned to it is read out of the storage means.
- In a sixth method step F, which takes place after the fourth method step D and the fifth method step E, the spatial position of the object 1 in the world coordinate system 6 is determined as method result G from the spatial position of the two-dimensional optically readable code 3 in the world coordinate system 6 and the relative orientation of the object 1 to the code 3 in the second coordinate system 5.
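In the planar case, steps B through F amount to composing and inverting rigid transforms: the measured code-in-camera pose is moved into the object frame, inverted, and combined with the stored code-in-world pose. A sketch with invented numbers, using (x, y, theta) poses:

```python
import math

def compose(a, b):
    """Compose two planar poses (x, y, theta)."""
    ax, ay, at = a
    bx, by, bt = b
    return (ax + bx * math.cos(at) - by * math.sin(at),
            ay + bx * math.sin(at) + by * math.cos(at),
            at + bt)

def invert(p):
    """Inverse of a planar pose: rotate the negated translation back."""
    x, y, t = p
    return (-x * math.cos(t) - y * math.sin(t),
             x * math.sin(t) - y * math.cos(t),
            -t)

# Steps B/C: code pose measured by the camera, moved into the object frame.
camera_in_object = (0.5, 0.0, 0.0)          # known mounting (second system)
code_in_camera   = (2.0, 0.0, 0.0)          # from image + size information
code_in_object   = compose(camera_in_object, code_in_camera)

# Steps E/F: stored world pose of the code, combined with the relative pose.
code_in_world   = (10.0, 5.0, math.pi / 2)  # read from the storage means
object_in_world = compose(code_in_world, invert(code_in_object))
# -> approximately (10.0, 2.5, pi/2) as method result G
```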
- The identity information of the two-dimensional optically readable code 3 is evaluated temporally after the first method step and before the second method step from the image of the code 3 captured by the camera 2. Thereafter, the size information associated with the identity information of the code 3 is read from the storage means.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102016010486 | 2016-08-31 | ||
PCT/EP2017/025205 WO2018041408A1 (en) | 2016-08-31 | 2017-07-12 | System for sensing position and method for sensing position |
Publications (1)
Publication Number | Publication Date |
---|---|
EP3507666A1 true EP3507666A1 (en) | 2019-07-10 |
Family
ID=59366366
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP17740622.0A Pending EP3507666A1 (en) | 2016-08-31 | 2017-07-12 | System for sensing position and method for sensing position |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP3507666A1 (en) |
DE (1) | DE102017006616A1 (en) |
WO (1) | WO2018041408A1 (en) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10558872B2 (en) | 2018-03-23 | 2020-02-11 | Veoneer Us Inc. | Localization by vision |
DE102019211984A1 (en) * | 2019-08-09 | 2021-02-11 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Device, method for controlling the same and device network or swarm |
CN114358038B (en) * | 2022-03-10 | 2022-06-03 | 华南理工大学 | Two-dimensional code coordinate calibration method and device based on vehicle high-precision positioning |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2938338B2 (en) | 1994-03-14 | 1999-08-23 | 株式会社デンソー | Two-dimensional code |
US5525883A (en) * | 1994-07-08 | 1996-06-11 | Sara Avitzour | Mobile robot location determination employing error-correcting distributed landmarks |
EP1828862A2 (en) | 2004-12-14 | 2007-09-05 | Sky-Trax Incorporated | Method and apparatus for determining position and rotational orientation of an object |
US8381982B2 (en) * | 2005-12-03 | 2013-02-26 | Sky-Trax, Inc. | Method and apparatus for managing and controlling manned and automated utility vehicles |
DE102012208132A1 (en) | 2012-05-15 | 2013-11-21 | Bayerische Motoren Werke Aktiengesellschaft | Method for vehicle localization |
US9207677B2 (en) * | 2014-01-02 | 2015-12-08 | Automotive Research & Testing Center | Vehicle positioning method and its system |
-
2017
- 2017-07-12 EP EP17740622.0A patent/EP3507666A1/en active Pending
- 2017-07-12 DE DE102017006616.7A patent/DE102017006616A1/en active Pending
- 2017-07-12 WO PCT/EP2017/025205 patent/WO2018041408A1/en unknown
Also Published As
Publication number | Publication date |
---|---|
DE102017006616A1 (en) | 2018-03-01 |
WO2018041408A1 (en) | 2018-03-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP3105547B1 (en) | Method for determining the absolute position of a mobile unit, and mobile unit | |
EP3324362B1 (en) | Method and device for commissioning a multi-axis system | |
EP3507666A1 (en) | System for sensing position and method for sensing position | |
DE102017213601A1 (en) | Method of creating an object map for a factory environment | |
EP3974936B1 (en) | Configuration of a visualisation device for a machine area | |
DE102016211227A1 (en) | Method and vehicle control system for generating images of an environment model and corresponding vehicle | |
DE102018009114A1 (en) | Method for determining the position of a mobile part movable on a travel surface and installation with mobile part for carrying out the method | |
DE102010012187A1 (en) | An installation, method for determining the position of a vehicle within an installation, and method for creating an improved target trajectory for a vehicle within a facility | |
EP3323565B1 (en) | Method and device for commissioning a multiple axis system | |
DE102020201785A1 (en) | Marker to define the movement trajectory of a vehicle | |
DE102019203484A1 (en) | Method, device and system for the navigation of autonomous vehicles | |
EP4128017A1 (en) | Mobile system and method for operating a mobile system | |
EP3977225B1 (en) | Method for creating an environment map for use in the autonomous navigation of a mobile robot | |
DE102019211459A1 (en) | Method and device for checking a calibration of environmental sensors | |
EP3825731B1 (en) | Optoelectronic safety sensor and method for guaranteed determination of own position | |
WO2018033274A1 (en) | Method and device for recognising obstacles using landmarks | |
EP4176424A1 (en) | Traffic light lane assignment from swarm data | |
DE102019220562A1 (en) | Method for detecting the relative position of an object or a person to a vehicle | |
EP3423911B1 (en) | Method for updating an occupancy map and autonomous vehicle | |
EP2703920B1 (en) | Method for teaching a machine controller | |
DE102014217954A1 (en) | Method and device for determining a desired inclination angle of a rail vehicle | |
DE102018217834A1 (en) | Method and system for operating a detection and display device | |
DE102018001581A1 (en) | Method for predicting the driving behavior of other road users | |
DE102018209366B4 (en) | Method for determining a position and / or orientation of a device | |
DE102019218929A1 (en) | Marker to define the movement trajectory of a vehicle |
Legal Events

Code | Title | Description
---|---|---
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: UNKNOWN
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase | ORIGINAL CODE: 0009012
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: REQUEST FOR EXAMINATION WAS MADE
17P | Request for examination filed | Effective date: 20190401
AK | Designated contracting states | Kind code of ref document: A1; Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR
AX | Request for extension of the european patent | Extension state: BA ME
DAV | Request for validation of the european patent (deleted) |
DAX | Request for extension of the european patent (deleted) |
STAA | Information on the status of an ep patent application or granted ep patent | STATUS: EXAMINATION IS IN PROGRESS
17Q | First examination report despatched | Effective date: 20200811