EP3891653A1 - Optical position determination and identification system - Google Patents

Optical position determination and identification system

Info

Publication number
EP3891653A1
Authority
EP
European Patent Office
Prior art keywords
objects
acquisition unit
image acquisition
texture
database
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP19816634.0A
Other languages
German (de)
English (en)
Inventor
Pawel Piotrowski
Tsen Miin Tran
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Lufthansa Technik AG
Original Assignee
Lufthansa Technik AG
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Lufthansa Technik AG filed Critical Lufthansa Technik AG
Publication of EP3891653A1
Current legal status: Pending

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/70 - Determining position or orientation of objects or cameras
    • G06T7/73 - Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/74 - Determining position or orientation of objects or cameras using feature-based methods involving reference images or patches
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/20 - Image preprocessing
    • G06V10/24 - Aligning, centring, orientation detection or correction of the image
    • G06V10/245 - Aligning, centring, orientation detection or correction of the image by locating a pattern; Special marks for positioning
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/10 - Terrestrial scenes
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/50 - Context or environment of the image
    • G06V20/59 - Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 - Scenes; Scene-specific elements
    • G06V20/80 - Recognising image objects characterised by unique random patterns
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30204 - Marker
    • G06T2207/30208 - Marker matrix
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30248 - Vehicle exterior or interior
    • G06T2207/30268 - Vehicle interior

Definitions

  • The invention relates to an optical position determination and identification system, in particular for use on board commercial aircraft.
  • Augmented reality applications for passengers on board are also conceivable, in which an image shown on a mobile terminal is supplemented with computer-generated additional information or virtual objects. In order for this additional information to be visually correctly fitted, it is necessary to determine the exact position and orientation of the mobile device.
  • Optical position determination on board an aircraft is currently not possible, at least not reliably.
  • Objects inside the aircraft cabin, for example a passenger seat, can in principle be recognized via so-called edge-based model tracking, in which object edges are determined in a camera image and compared with computer models of the objects in question, so that the position of the camera relative to the recognized object can be determined. Owing to the large number of identical objects within an aircraft cabin, such as passenger seats, however, neither an actual position determination within the cabin nor an unambiguous assignment of the objects in question is reliably possible.
  • The object of the present invention is to provide an optical position determination and identification system which no longer has the disadvantages known from the prior art, or has them only to a limited extent.
  • The invention relates to an optical position determination and identification system for environments with a large number of identical objects, comprising a mobile image acquisition unit and texture elements which can be attached to the objects. Each texture element has a pattern with pseudo-random deviations from a predetermined basic pattern, which cannot be recognized with the naked eye at a distance of 1 m, and the image acquisition unit is designed to uniquely identify the texture elements on the basis of their pseudo-random deviations.
  • The invention is based on the recognition that, for position determination and identification in environments with a large number of identical objects, a simple and unambiguous identification of the individual objects is advantageous, and indeed essential for an optical position determination based on the optically detected objects.
  • Whereas otherwise the objects would have to be provided, in easily visible areas (possibly several times), with well-known optically readable labels such as bar codes or QR codes for problem-free identification, which is regularly detrimental to the design and ambience of the aircraft cabin, the invention provides texture elements for identification purposes.
  • Each texture element has a pattern.
  • The pattern can preferably be a multiply repeating pattern, with which the texture elements can be made in any size.
  • They are therefore particularly well suited as a means of surface design for the objects and can regularly be integrated into the design of the aircraft cabin without being perceived by passengers as disturbing, as would be the case, for example, with immediately visible QR codes.
  • The patterns can be of almost any complexity, ranging from non-repeating patterns to photographic image elements. Even such texture elements are not perceived by passengers as disruptive in the way that, for example, QR codes would be.
  • The patterns of the texture elements each have a clearly identifiable pseudo-random deviation from a predetermined basic pattern which cannot be recognized with the naked eye at a distance of 1 m.
  • "With the naked eye" refers to viewing of the texture elements by the human eye without the use of any optical aids.
  • By relying on the pseudo-random deviations in the patterns of the texture elements not being recognizable with the naked eye at a distance of 1 m between texture element and viewer, the texture elements appear essentially identical to any casual observer. The deviations in the patterns are nevertheless sufficiently pronounced that optical image acquisition units, such as cameras, whose resolution is sufficiently high and in particular exceeds that of the human eye, are able to recognize the pseudo-random deviations.
  • A corresponding pseudo-random deviation from a basic pattern is a deliberate and clearly identifiable change to the basic pattern.
  • The deviations from the basic pattern must be sufficiently clear that they can be detected and ascertained by the image acquisition unit.
  • The deviations must also be determinable, for example, at the resolution available in the images captured by the image acquisition unit; of course, a maximum distance between texture element and image acquisition unit can be specified up to which detection of the deviations from the basic pattern is to remain possible.
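Whether a given deviation is determinable at the available resolution can be estimated with a rough pinhole-camera calculation. This is a sketch for illustration only; the camera parameters and the two-pixel threshold are assumptions, not values from the patent:

```python
import math

def deviation_resolvable(distance_m, sensor_px, fov_deg, deviation_mm, min_px=2.0):
    """Rough check whether a deviation of `deviation_mm` spans at least
    `min_px` pixels at a given distance, for a camera with `sensor_px`
    pixels across a horizontal field of view of `fov_deg` degrees.
    Pinhole approximation; all numbers are illustrative."""
    # width of the imaged area at that distance, in mm
    fov_width_mm = 2.0 * distance_m * 1000.0 * math.tan(math.radians(fov_deg) / 2.0)
    mm_per_px = fov_width_mm / sensor_px
    return deviation_mm / mm_per_px >= min_px

# a 4000-px-wide camera with a 60-degree field of view at 0.5 m:
ok = deviation_resolvable(0.5, 4000, 60.0, deviation_mm=0.4)
```

The same call with `distance_m=3.0` fails the check, which is one way a maximum working distance, as described above, could be derived.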
  • At the same time, the deviations from the basic pattern according to the invention can be so small that they do not immediately stand out when a passenger looks at the pattern of a texture element in the usual way with the naked eye, even when comparing neighboring texture elements, and can, if at all, only be recognized on very close inspection. Owing to the high complexity of the pattern, the subjective resolving power of human vision can moreover be reduced, which means that the deviations from the basic pattern may be made larger and thus easier for the image acquisition unit to identify.
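How such pseudo-random deviations could be generated reproducibly can be sketched as follows (illustrative only; the component coordinates, the 0.4 mm bound, and the seeding scheme are assumptions rather than part of the disclosure). Seeding a generator with a per-element ID makes the deviation deterministic, so the same texture element always carries the same pattern:

```python
import random

def basic_pattern():
    """Nominal positions (x, y, radius) in mm of the pattern components
    of one tile of the basic pattern (illustrative values)."""
    return [(10.0, 10.0, 3.0), (25.0, 12.0, 2.0), (18.0, 24.0, 2.5)]

def deviated_pattern(element_id, max_shift_mm=0.4):
    """Apply a pseudo-random (seeded, hence reproducible) deviation to each
    component.  A shift of a few tenths of a millimetre sits near the limit
    of naked-eye acuity at 1 m and goes unnoticed in a complex pattern, yet
    spans several pixels for a close-up, high-resolution camera."""
    rng = random.Random(element_id)          # same ID -> same deviation
    return [(x + rng.uniform(-max_shift_mm, max_shift_mm),
             y + rng.uniform(-max_shift_mm, max_shift_mm),
             r)
            for (x, y, r) in basic_pattern()]

# two texture elements derived from the same basic pattern:
seat_a = deviated_pattern(element_id=1201)
seat_b = deviated_pattern(element_id=1202)
```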
  • The basic pattern can consist of a random arrangement of geometric figures or graphics, such as company logos, in different colors and sizes.
  • It is possible for all texture elements to be based on the same basic pattern. However, it is also possible for some of the texture elements to be based on one basic pattern while others are based on a different basic pattern.
  • Texture elements provided for different types of objects can be based on different basic patterns. For example, texture elements for passenger seats can be designed differently from texture elements for wall panels.
  • The image acquisition unit is designed to analyze the patterns of the texture elements in an image captured by it and to uniquely identify the texture elements on the basis of their pseudo-random deviations.
  • The basic pattern on which the analyzed pattern of a texture element is based can be stored in the image acquisition unit, so that the deviations are determined by comparing the recorded pattern with the stored basic pattern.
  • The image acquisition unit can also be designed to derive a unique identification from the pattern itself. This is possible, for example, by determining characteristic averages from the recorded pattern, such as the average distance between the individual pattern components or the average size of the individual pattern components, and then determining the deviations from these average values, which then yield the unique identification.
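The identification via characteristic averages described above can be sketched as follows (a simplified illustration; the quantization step and the tuple-based code are assumptions). Each component's deviation from the pattern's own mean position and size is quantized into a code that can serve as a lookup key:

```python
def identify(components, quantum=0.1):
    """Derive an identifier from the pattern itself, without access to the
    stored basic pattern: compute characteristic averages (mean x/y position
    and mean component size) and quantize each component's deviation from
    those averages.  `components` is a list of (x, y, radius) tuples in mm;
    the 0.1 mm quantum is an illustrative choice."""
    mean_x = sum(c[0] for c in components) / len(components)
    mean_y = sum(c[1] for c in components) / len(components)
    mean_r = sum(c[2] for c in components) / len(components)
    code = []
    for x, y, r in components:
        code.append((round((x - mean_x) / quantum),
                     round((y - mean_y) / quantum),
                     round((r - mean_r) / quantum)))
    return tuple(code)   # hashable, so usable directly as a database key
```

Because only relative quantities enter the code, the identification is independent of where exactly the pattern tile was cropped from the image.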
  • In some cases, the mere identification of the pattern of a texture element by the image acquisition unit already ensures a sufficient position determination. For example, it can already be deduced from the identification described that the image acquisition unit must be in the (immediate) vicinity of the texture element and thus of the object equipped with it.
  • Preferably, a database connected to the image acquisition unit is provided which contains the assignment of the texture elements to the individual objects to which they are attached, the image acquisition unit preferably being designed to determine, via edge-based model tracking of the objects uniquely identified by means of the texture elements and the database, the relative position of the image acquisition unit with respect to the identified objects.
  • The exact relative position of the image acquisition unit with respect to the object can thus be determined by edge-based model tracking using a computer model corresponding to the object. This is particularly advantageous for augmented reality applications.
  • The image acquisition unit can also be designed to determine its absolute position in the coordinate system of the position data stored in the database.
  • Building on the above-described determination of the relative position of the image acquisition unit with respect to an identified object, the absolute position of the image acquisition unit can also be determined if the position of the object is known. As a rule, this will be based on the coordinate system in which the position data of the objects are stored in the database.
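Determining the absolute position from the known object position and the tracked relative position amounts to chaining two coordinate transforms, as in this simplified 2D sketch (the pose representation and all numbers are illustrative assumptions; a real system would use full 6-DoF transforms):

```python
import math

def compose(object_pose, relative_pose):
    """Chain the known absolute pose of the identified object (in the cabin
    coordinate system, as stored in the database) with the relative pose of
    the image acquisition unit obtained from edge-based model tracking.
    Poses are (x, y, heading_rad) in 2D for brevity."""
    ox, oy, oh = object_pose
    rx, ry, rh = relative_pose
    # rotate the relative offset into the cabin frame, then translate
    ax = ox + rx * math.cos(oh) - ry * math.sin(oh)
    ay = oy + rx * math.sin(oh) + ry * math.cos(oh)
    return (ax, ay, oh + rh)

# illustrative: a seat 14 m down the cabin, facing forward (heading 0),
# with the camera tracked 0.8 m behind and 0.3 m to its left:
seat_pose = (14.0, 1.2, 0.0)
camera_in_cabin = compose(seat_pose, (-0.8, 0.3, 0.0))
```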
  • The texture elements can be designed in almost any way.
  • For example, the texture elements can be adhesive films, covers and/or object covers, each of which bears a pattern as described. It is also possible for at least some of the texture elements to be formed in one piece with an object, preferably by printing and/or embroidery. In this case, the printed or embroidered surface of the object can be regarded as the texture element.
  • The database described above is stored centrally, and the image acquisition unit has access to the database via a wireless connection.
  • Providing a central database allows simultaneous access to the database by several image acquisition units.
  • In order to avoid constant data exchange via the wireless connection, it is also possible for the image acquisition unit to maintain a mirror of the centrally stored database and to update this mirror only when there are changes in the database.
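Such a locally mirrored database might be sketched as follows (the version-counter protocol and all names are assumptions for illustration): instead of transferring records on every lookup, the image acquisition unit compares a cheap version counter and refreshes its mirror only when the central database has actually changed:

```python
class CentralDatabase:
    """Central store of texture-element-to-object assignments (sketch)."""
    def __init__(self):
        self.version = 0
        self.data = {}

    def update(self, texture_id, obj):
        self.data[texture_id] = obj
        self.version += 1          # bump on every change

class MirroredDatabase:
    """Local mirror held by the image acquisition unit; refreshed only
    when the central version counter has advanced."""
    def __init__(self, central):
        self.central = central
        self.version = -1
        self.data = {}

    def lookup(self, texture_id):
        if self.central.version != self.version:   # cheap check, not a full transfer
            self.data = dict(self.central.data)    # refresh mirror on change
            self.version = self.central.version
        return self.data.get(texture_id)
```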
  • Conversely, if the image acquisition unit does not have sufficient computing capacity, it is possible for the image captured by the image acquisition unit to be sent wirelessly to the central database.
  • The identification of the texture elements, and possibly further processing steps, can then be carried out there before the result is transmitted back to the image acquisition unit.
  • The basic pattern preferably has a chaotic character.
  • Chaotic character means that, when looking at the basic pattern, no regularity, or at least no apparently functional regularity, can be discerned in the arrangement of the individual pattern components. With such a pattern, pseudo-random deviations are even less noticeable.
  • The system according to the invention is particularly suitable for position determination and/or object identification on board aircraft.
  • The invention also relates to an arrangement of a commercial aircraft with a multiplicity of objects arranged in the passenger cabin and an optical position determination and identification system according to one of the preceding claims, the objects each being provided with at least one texture element.
  • The objects can preferably be aircraft seats and/or wall panels.
  • If the position determination and identification system includes a database, this database is preferably stored on a server on board the commercial aircraft. This offers the advantage that the data in the database, for example for a position determination, are also available when the aircraft is in the air and no adequate data connection to a ground-based server exists.
  • The system can also be used to monitor the condition of the cabin of an aircraft by means of cameras.
  • For this purpose, a component can have, for example, two different texture elements, of which, depending on the operating state, only one is visible or at least one is not visible.
  • The state of the component in question can then be inferred from the texture elements identified. It is possible, for example, for an autonomous drone, which preferably navigates independently on the basis of the position determinable according to the invention, to inspect the state of the components of the aircraft cabin accordingly.
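The state inference from visible texture elements can be sketched as a simple rule (illustrative; the IDs, the state names, and the overhead-bin example are assumptions):

```python
def component_state(identified_ids, open_id, closed_id):
    """Infer the state of a component carrying two texture elements, of
    which only one is visible in each operating state (e.g. an overhead
    bin with one element inside the lid and one outside).
    `identified_ids` is the set of texture-element IDs found in the
    current camera image."""
    if open_id in identified_ids and closed_id not in identified_ids:
        return "open"
    if closed_id in identified_ids and open_id not in identified_ids:
        return "closed"
    return "unknown"   # neither or both visible: no reliable inference
```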
  • Figure 1 shows an embodiment of an arrangement according to the invention;
  • Figure 2 shows examples of the multiply repeating patterns of the texture elements from Figure 1.
  • An arrangement 1 according to the invention is shown in Figure 1. In the passenger cabin 2 of a commercial aircraft, a plurality of passenger seats 4, also referred to as objects 4, are arranged.
  • A head cover 6, which bears a pattern 7 that repeats many times, is provided on each of the passenger seats 4.
  • The head covers 6 represent texture elements 8 according to the invention, the patterns 7 of the individual texture elements 8 each exhibiting pseudo-random deviations from a basic pattern 9.
  • In Figure 2, an exemplary basic pattern 9 (Figure 2a) and two pseudo-random deviations therefrom (Figures 2b and 2c) are shown.
  • On the left-hand side, the basic pattern 9 is shown enlarged, while on the right-hand side this basic pattern 9 is repeated many times as a flat texture, essentially as it appears as pattern 7 on the texture elements 8.
  • FIG. 2b shows a first pseudo-random deviation 10 from the basic pattern 9 according to FIG. 2a, again both in an enlarged form (left) and as a flat texture.
  • In it, one pattern component, the circle marked with reference numeral 11, is slightly shifted upwards compared with the basic pattern 9.
  • In the second pseudo-random deviation shown in Figure 2c, in addition to the pattern component 11 being shifted upwards analogously to the first pseudo-random deviation 10, the pattern component provided with reference numeral 12, likewise a circle, is slightly shifted to the right.
  • The deviations 10 in the individual patterns 7 are hardly visible.
  • A viewer on board the commercial aircraft in the passenger cabin 2 will regularly not notice the differences in the patterns 7 of the individual texture elements 8.
  • The deviations in the individual patterns 7 are, however, sufficiently pronounced that they can be captured by an image acquisition unit 12 (cf. Figure 1) and used for unambiguous identification.
  • The image acquisition unit 12 shown in Figure 1 is a so-called tablet PC, on one side of which a large-area, touch-sensitive screen 13 is provided, while on the other side a digital camera module (not shown) is arranged.
  • The image captured by the digital camera module is processed by the image acquisition unit 12 before it is displayed on the screen 13 in a form supplemented by additional information.
  • The image acquisition unit 12 is designed to determine the respective deviation from the basic pattern 9, and thus to uniquely identify the texture element 8 itself, at least for those texture elements 8 that are sufficiently close to the digital camera module that the deviations contained in them are actually depicted, at the resolution of the digital camera module, in the captured image.
  • Known image processing methods can be used for this purpose; if necessary, these can fall back on the basic pattern 9 stored in a database 14 on a server located on board the aircraft.
  • the image acquisition unit 12 and the server with the database 14 are wirelessly connected to one another.
  • In the database 14, an assignment of the texture elements 8 to the respective objects 4, in this case the individual aircraft seats 4, is also stored; in addition to this direct assignment, information about the position of the individual objects 4 within the aircraft cabin 2 and computer models of the individual objects 4 are likewise stored in the database 14.
  • Via the direct assignment of the identified texture elements 8 to the individual objects 4, the latter can thus also be uniquely identified.
  • Via edge-based model tracking, the relative position of the image acquisition unit 12 with respect to the identified object 4 can be determined. Since the position of the identified object 4 within the aircraft cabin 2 is also known, the absolute position of the image acquisition unit 12 in the aircraft cabin 2 can be determined as well.
  • On this basis, the image taken by the digital camera module is supplemented with additional information before it is displayed on the screen 13, on which the additional information is thus shown.

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Multimedia (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)

Abstract

Optical position determination and identification system in environments comprising a plurality of identical objects (4), comprising a mobile image acquisition unit (12) and texture elements (8) that can be applied to the objects (4), each texture element (8) having a multiply repeating pattern (7) that exhibits pseudo-random deviations (10) from a predetermined basic pattern (9), the mobile image acquisition unit (12) being designed to uniquely identify texture elements (8) on the basis of their pseudo-random deviations (10). The optical position determination and identification system according to the invention is particularly suitable for use on board commercial aircraft.
EP19816634.0A 2018-12-05 2019-12-04 Optical position determination and identification system Pending EP3891653A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102018131000.5A DE102018131000A1 (de) 2018-12-05 2018-12-05 Optical position determination and identification system
PCT/EP2019/083653 WO2020115121A1 (fr) 2018-12-05 2019-12-04 Optical position determination and identification system

Publications (1)

Publication Number Publication Date
EP3891653A1 true EP3891653A1 (fr) 2021-10-13

Family

ID=68808360

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19816634.0A Pending EP3891653A1 (fr) 2018-12-05 2019-12-04 Système de détermination et d'identification de position optique

Country Status (4)

Country Link
US (1) US20220027619A1 (fr)
EP (1) EP3891653A1 (fr)
DE (1) DE102018131000A1 (fr)
WO (1) WO2020115121A1 (fr)

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6937349B2 (en) * 2003-05-02 2005-08-30 Mitutoyo Corporation Systems and methods for absolute positioning using repeated quasi-random pattern
US20050162396A1 (en) * 2004-01-28 2005-07-28 The Boeing Company Dynamic seat labeling and passenger identification system
WO2008060656A2 * 2006-04-03 2008-05-22 3M Innovative Properties Company Vehicle control user interface
DE102008033733B4 * 2008-07-17 2020-06-18 Airbus Operations Gmbh System and method for operating a plurality of service positions
DE102010035374A1 * 2010-08-25 2012-03-01 Airbus Operations Gmbh System and method for collecting defect data of components in a passenger cabin of a vehicle
US9448758B2 (en) * 2012-07-18 2016-09-20 The Boeing Company Projecting airplane location specific maintenance history using optical reference points
US9881349B1 (en) * 2014-10-24 2018-01-30 Gopro, Inc. Apparatus and methods for computerized object identification
US9524435B2 (en) * 2015-03-20 2016-12-20 Google Inc. Detecting the location of a mobile device based on semantic indicators
WO2017176748A1 * 2016-04-04 2017-10-12 B/E Aerospace, Inc. Aircraft passenger activity monitoring
WO2018129051A1 * 2017-01-04 2018-07-12 Advanced Functional Fabrics Of America Uniquely identifiable articles of fabric and social networks employing them

Also Published As

Publication number Publication date
WO2020115121A1 (fr) 2020-06-11
DE102018131000A1 (de) 2020-06-10
US20220027619A1 (en) 2022-01-27

Similar Documents

Publication Publication Date Title
DE102008016215A1 Information device operating unit
DE102007053812A1 Configuration module for a video surveillance system, surveillance system with the configuration module, method for configuring a video surveillance system, and computer program
EP2711869A2 Method and device for capturing fingerprints on the basis of fingerprint scanners in reliably high quality
EP2269130A1 Display of workpiece measurement results in dependence on the detection of a user's gestures
DE19962201A1 Method and arrangement for capturing and analyzing the reception behavior of persons
DE102013220107A1 Method for guiding and/or monitoring assembly and order-picking processes to be carried out at a workstation
DE102009035755A1 Method and device for monitoring a spatial region
DE10027365A1 Object-related control of information and advertising media
CH713061A1 System and method for contactless biometric authentication
EP3900330A1 Construction and measurement of a structure for calibrating a camera
EP1821064B1 Method and device for detecting a contour of a reflective surface
WO2014012714A1 Process diagram of a technical installation, in particular a railway track installation
EP2715279B1 Method and device for checking the chassis geometry of a motor vehicle
EP3586322A1 Method for detecting at least one token object
WO2020115121A1 Optical position determination and identification system
DE102018130569B4 System and method for displaying a cabin layout at original scale
DE102019114531A1 Device for detecting the orientation and position of markings in three-dimensional space
WO2020160861A1 Calibration of a sensor for a vehicle on the basis of object-side and image-side identification indices of a reference object
DE102012211734A1 Method and device for detecting the position of an object in a machine tool
EP3599571B1 Device for the biometric identification of a person by facial recognition, and biometric method
DE102020100153A1 Method and storage compartment device for detecting access to a storage compartment
EP4121897B1 Methods and systems for providing synthetic labelled training data sets and applications thereof
DE102021004071B3 Device and method for generating photometric stereo images and a color image
DE102007056835A1 Image processing module for estimating an object position of a monitored object, method for determining an object position of a monitored object, and computer program
DE4335121A1 Automatic surface reconstruction for optical 3D digitizers

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: UNKNOWN

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210608

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
P01 Opt-out of the competence of the unified patent court (upc) registered

Effective date: 20230516

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20231004