WO2015165618A1 - Objekterkennung - Google Patents
- Publication number
- WO2015165618A1 WO2015165618A1 PCT/EP2015/054389 EP2015054389W WO2015165618A1 WO 2015165618 A1 WO2015165618 A1 WO 2015165618A1 EP 2015054389 W EP2015054389 W EP 2015054389W WO 2015165618 A1 WO2015165618 A1 WO 2015165618A1
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- module
- detected
- signal
- detection means
- primary beam
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/10—Image acquisition
- G06V10/17—Image acquisition using hand-held instruments
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
- G06V40/28—Recognition of hand or arm movements, e.g. recognition of deaf sign language
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/002—Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/14—Measuring arrangements characterised by the use of optical techniques for measuring distance or clearance between spaced objects or spaced apertures
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S17/00—Systems using the reflection or reradiation of electromagnetic waves other than radio waves, e.g. lidar systems
- G01S17/02—Systems using the reflection of electromagnetic waves other than radio waves
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S3/00—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received
- G01S3/78—Direction-finders for determining the direction from which infrasonic, sonic, ultrasonic, or electromagnetic waves, or particle emission, not having a directional significance, are being received using electromagnetic waves other than radio waves
- G01S3/782—Systems for determining direction or deviation from predetermined direction
- G01S3/785—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system
- G01S3/786—Systems for determining direction or deviation from predetermined direction using adjustment of orientation of directivity characteristics of a detector or detector system to give a desired condition of signal derived from that detector or detector system the desired condition being maintained automatically
-
- G—PHYSICS
- G03—PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
- G03B—APPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
- G03B21/00—Projectors or projection-type viewers; Accessories therefor
- G03B21/14—Details
- G03B21/20—Lamp housings
- G03B21/2006—Lamp housings characterised by the light source
- G03B21/2033—LED or laser light sources
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V10/00—Arrangements for image or video recognition or understanding
- G06V10/40—Extraction of image or video features
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30196—Human being; Person
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/20—Movements or behaviour, e.g. gesture recognition
Definitions
- the invention is based on a method for contactless interaction with a module according to the preamble of claim 1. Further, the present invention is based on a laser projector and a module with an interface for contactless interaction with an object.
- Recognition of control commands is realized by detecting user gestures with comparatively high precision.
- a non-contact interaction of the object with the module preferably comprises a control of the module or an electrical device, if the module is integrated, for example, in the electrical device or is connected to the electrical device.
- the first sub-module is a red-green-blue (RGB) module, in particular a semiconductor laser component, wherein the first sub-module is configured to generate a laser beam (primary beam).
- the scanning movement preferably relates to such a movement of the primary beam through which an image visible to the user, for example a single image of a video sequence or a still image, is assembled by line-by-line projection of the image information into the projection area.
- a control command is preferably an input command for controlling the module and / or the laser projector.
- recognition of the control command takes place in particular by locating the object using the primary beam and detecting a secondary signal generated by reflection on the object.
- the geometric shape of the object is detected by object contour detection.
- object recognition is realized, the object recognition being effected, for example, by the same primary beam that is also used for the projection of the image information.
- the object outline detection comprises in particular a
- the control command is recognized by the module as a function of the detected geometric shape of the object and / or as a function of a detected further geometric shape of the object.
- a control command is recognized in particular when the geometric shape of the object is changed, for example when the hand or the finger is transferred from a curved position to an extended position.
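The shape-change criterion above can be pictured with a small sketch. The following Python fragment is an illustration only, not the patent's method: the function name, the outline point format, and the bounding-box elongation heuristic are all assumptions introduced here (e.g. a finger transferred from a curved to an extended position makes the detected outline markedly more elongated).

```python
def detect_command(prev_outline, new_outline, threshold=1.5):
    """Toy rule: report a control command when the detected outline's
    bounding-box elongation grows past a threshold, e.g. a finger
    transferred from a curved to an extended position.
    Outlines are sequences of (x, y) points. Hypothetical sketch."""
    def elongation(points):
        xs = [p[0] for p in points]
        ys = [p[1] for p in points]
        w = max(xs) - min(xs) + 1
        h = max(ys) - min(ys) + 1
        # ratio of the longer to the shorter bounding-box side
        return max(w, h) / min(w, h)
    return elongation(new_outline) / elongation(prev_outline) >= threshold
```

A roughly square (curved-hand) outline against a long thin (extended-finger) outline would trip the threshold, while an unchanged outline would not.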
- the second sub-module has a scanning mirror structure, wherein the scanning mirror structure is driven with a deflection movement such that the primary beam sweeps over the object line by line during the scanning movement.
- in this way, the module can be integrated, in the manner of a modular principle and in an adapted fashion, into an electrical device - in particular a portable laser projector.
- preferably, the scanning mirror structure is a microelectromechanical scanning mirror structure.
- the module has a third submodule, wherein in the third method step a secondary signal generated by reflection of the primary beam at the object is detected by the third submodule, the module generating a locating signal in response to the detected secondary signal such that the locating signal has information about the geometric shape of the object.
- the information relating to the geometric shape of the object can be derived from the locating signal.
- the locating signal comprises distance information relating to the distance between the module and a projection point generated by the primary beam on the surface of the object, wherein the distance information can be assigned to a deflection position of the scanning mirror structure.
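The assignment of distance information to a deflection position can be illustrated as a simple angle-plus-range conversion: each deflection position of the scanning mirror defines a beam direction, and the measured distance places a 3D point along it. The Python sketch below is illustrative only; the two mirror angles, the function name, and the beam-direction geometry are assumptions, not taken from the patent.

```python
import math

def locate_point(theta_x, theta_y, distance):
    """Convert a scanning-mirror deflection position (two deflection
    angles in radians, horizontal and vertical) and a measured distance
    into a 3D point relative to the module. The geometry is a
    hypothetical model for illustration."""
    # Unit direction of the deflected primary beam for this
    # deflection position of the scanning mirror structure.
    dx = math.sin(theta_x) * math.cos(theta_y)
    dy = math.sin(theta_y)
    dz = math.cos(theta_x) * math.cos(theta_y)
    # Scale the direction by the measured distance to the projection point.
    return (distance * dx, distance * dy, distance * dz)
```

Sweeping the mirror over its deflection positions and collecting these points would yield a coarse 3D surface profile of the located object.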
- the third submodule is spatially spaced from the second submodule, the locating signal being generated as a function of the detected secondary signal such that the locating signal comprises shadowing information relating to a subarea of the object, wherein in particular the subarea is shadowed relative to another subarea.
- it is thereby advantageously possible for the object contour detection to be realized by detecting a shadowed area, the shadowed area relating in particular to a partial area of the object which is not, or only partially, swept by the primary beam during its scanning movement.
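Contour detection from a shadowed area can be sketched as follows: treat the scan as a grid of positions, mark where a secondary signal was detected, and take the border between shadowed and detected cells as the outline. This is a minimal, hypothetical Python illustration; the grid representation and 4-neighbour rule are assumptions made here.

```python
def shadowed_outline(detected):
    """detected: 2D list of booleans, True where the detection means
    received a secondary signal for that scan position.
    Returns the set of shadowed cells that border a detected cell,
    i.e. an outline of the shadowed subregion. Illustrative sketch."""
    rows, cols = len(detected), len(detected[0])
    outline = set()
    for r in range(rows):
        for c in range(cols):
            if detected[r][c]:
                continue  # this scan position was reached by the primary beam
            # A shadowed cell belongs to the outline if any of its
            # 4-neighbours produced a detected secondary signal.
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols and detected[nr][nc]:
                    outline.add((r, c))
                    break
    return outline
```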
- the module comprises two spatially spaced-apart detection means, wherein in the third method step the secondary signal is detected stereoscopically by the two detection means.
- according to a further preferred development, a first secondary part signal of the secondary signal is detected by a first detection means of the two detection means, and a second secondary part signal of the secondary signal is detected by a second detection means of the two detection means.
- according to another preferred development, a first locating part signal is generated by the module as a function of the detected first secondary part signal and a second locating part signal as a function of the detected second secondary part signal, the locating signal being generated by superposition of the first and second locating part signals; in particular, the geometric shape of the object is detected by evaluating the locating signal.
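The superposition of the two locating part signals can be pictured as merging two per-scan-position readings. The combination rule below (gap-filling where only one detector saw the point, averaging where both did) is an assumption made for illustration; the patent does not specify how the superposition is computed.

```python
def superpose(part1, part2):
    """Combine two locating part signals into one locating signal.
    Each part signal is a list of per-scan-position distance readings,
    with None where the respective detection means received no
    secondary part signal. Hypothetical combination rule."""
    out = []
    for a, b in zip(part1, part2):
        if a is None and b is None:
            out.append(None)          # shadowed for both detection means
        elif a is None:
            out.append(b)             # visible only to the second detector
        elif b is None:
            out.append(a)             # visible only to the first detector
        else:
            out.append((a + b) / 2)   # average where both detected the point
    return out
```

Positions left as None in the combined signal correspond to the shadowed subregions discussed above.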
- recognition of control commands is realized by detecting user gestures with comparatively high precision.
- a comparatively fast and reliable object recognition is thereby realized by object contour detection, wherein preferably at least one image of the object is detected by each of at least two detection means - in particular optical sensors - which are arranged, for example, on both sides of the second sub-module (or of the scanning mirror structure).
- the at least two images of the object detected by the at least two detection means are superimposed in such a way that, in particular, a contour or an outline of the object and/or a shadowed area is detected.
- the second submodule has a scanning mirror structure for deflecting the primary beam into a deflection position of the scanning mirror structure, the scanning mirror structure being configured to change the deflection position such that the primary beam performs a line-like scanning movement.
- in this way, the module can be integrated, in the manner of a modular principle and in an adapted fashion, into an electrical device - in particular a portable laser projector.
- preferably, the scanning mirror structure is a microelectromechanical scanning mirror structure.
- the module has a first detection means for detecting a first partial signal of a secondary signal generated by reflection of the primary beam on the object and/or a second detection means for detecting a second partial signal of the secondary signal, wherein the first and/or second detection means is spaced from the second sub-module and/or wherein the first and second detection means are spaced apart from each other such that the object can be detected stereoscopically by the two detection means.
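The benefit of spacing the two detection means apart is the standard stereo relation: a point's depth follows from the disparity between the two views and the baseline between the detectors. The Python sketch below applies the textbook formula z = f·B/d; the parameter names and pinhole-camera assumptions are illustrative, not taken from the patent.

```python
def stereo_depth(baseline_m, focal_px, disparity_px):
    """Depth (in metres) of an object point from the disparity between
    the two spaced-apart detection means, using the standard stereo
    relation z = f * B / d for pinhole-style sensors.
    baseline_m: distance between the two detectors (metres),
    focal_px: focal length in pixels, disparity_px: pixel shift of the
    point between the two images. Hypothetical sensor model."""
    if disparity_px <= 0:
        # zero or negative disparity: the point cannot be resolved
        raise ValueError("point not stereoscopically resolved")
    return focal_px * baseline_m / disparity_px
```

With, say, a 10 cm baseline and a 500-pixel focal length, a 25-pixel disparity would place the point 2 m from the module.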
- an improved object recognition is realized by detecting a shadowed subregion, wherein in particular an outline or a contour of the object, or of a subregion of the object (for example of the shadowed subregion), is detected.
- the first and / or second detection means is integrated in the third submodule of the module.
- FIG. 1 shows a module according to an embodiment of the present invention
- FIG. 2 shows a laser projector according to an embodiment of the present invention
- FIGS. 3 and 4 show a module according to different embodiments of the present invention.
- FIG. 1 shows a module 2 according to an embodiment of the present invention.
- Module 2 provides an interface, in particular a user interface or human-machine interface (HMI), for non-contact interaction with an object 4.
- the object 4 is a selection object or control object guided by a user - for example a finger, a pencil, or another three-dimensional physical object.
- the interaction of the module 2 with the object 4 takes place by detecting a movement and/or position of the object 4, the object 4 in particular being located.
- the module 2 has a first submodule 21 for generating a primary beam 3.
- the first submodule 21 is in particular a light module 21, preferably a laser module 21, particularly preferably a red-green-blue (RGB) module 21.
- the primary beam 3 is a primary laser beam 3, the primary laser beam 3 comprising red light, green light, blue light, and/or infrared light.
- the module 2 has a second sub-module 22 for deflecting the primary beam 3, so that the primary beam 3 in particular performs a line-like scanning movement.
- the second sub-module 22 is configured such that, by deflection of the primary beam 3, image information is projected into a projection area 200 - in particular onto a projection surface 200 of a projection object 20.
- the scanning movement of the primary beam 3 takes place in such a way that an image visible to the user is projected with the primary beam 3 onto the projection object 20, for example a wall.
- the image information relates to an image composed line by line, for example a single image of a video sequence, a photographic image, a computer-generated image, and/or another image.
- the second sub-module 22 is a scanning module 22 or scanning mirror module 22, the scanning mirror module 22 particularly preferably comprising a microelectromechanical system (MEMS) for deflecting the primary beam 3.
- the primary beam 3 is acted on by the second sub-module 22 with a deflection movement such that the primary beam 3 performs the scanning movement (i.e. in particular a line-like or grid-like scanning movement) along the projection area 200 (i.e. in particular along the projection surface 200 of the projection object 20).
- the scanning mirror module 22 is configured to generate a (time-dependent) deflection position signal with respect to a deflection position of the scanning mirror module 22 during the scanning movement.
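The time-dependent deflection position signal corresponds to a known sequence of mirror positions during the grid-like scan. As a hypothetical illustration (the ordering and the boustrophedon sweep are assumptions made here, not the patent's specification), the sequence of deflection positions for a line-like scan can be generated like this:

```python
def raster_scan(cols, rows):
    """Generate the sequence of deflection positions for a line-like
    (grid-like) scanning movement: the fast axis sweeps each line and
    the slow axis steps between lines, alternating sweep direction so
    the mirror never has to fly back. Illustrative sketch."""
    positions = []
    for r in range(rows):
        # even lines left-to-right, odd lines right-to-left
        line = range(cols) if r % 2 == 0 else range(cols - 1, -1, -1)
        for c in line:
            positions.append((c, r))
    return positions
```

Time-stamping this sequence yields exactly the kind of deflection position signal that the detection signal can later be correlated against.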
- the module 2 preferably has a third submodule 23, in particular a detection module 23, for detecting an interaction of the object 4 with the primary beam 3.
- the secondary signal is generated by reflection of the primary beam 3 on the object 4 when the object 4 is positioned and / or moved relative to the module 2 such that the object 4 is detected by the primary beam 3 during the scanning movement of the primary beam 3.
- the object 4 is positioned in a location zone 30 associated with the primary beam 3.
- a detection signal is generated by the detection module 23, the detection signal in particular comprising information relating to the detected secondary signal 5.
- the module 2 preferably has a fourth submodule 24 for generating a locating signal, the locating signal in particular comprising information relating to a (temporal) correlation of the detection signal with the deflection position signal.
- locating means in particular a position determination and/or distance determination (using the primary beam 3).
- the module 2 preferably also has a fifth submodule 25 for controlling the first submodule 21 and / or the second submodule 22.
- the fifth submodule 25 is a control module 25 for generating a control signal for controlling the first submodule 21 and / or the second submodule 22, wherein the control signal is generated in particular as a function of the locating signal.
- FIG. 2 shows a laser projector 1 according to one embodiment of the present invention, in which a module 2 according to an embodiment of the present invention is integrated.
- the embodiment shown here is in particular substantially identical to the other embodiments according to the invention.
- the laser projector 1 is arranged on a base 10, for example a table 10, the module 2 being integrated in the laser projector 1.
- the primary beam 3 - i.e. in particular an RGB laser beam - is generated by the RGB module 21 and directed onto a scanning mirror structure 7 of the scanning module 22, the primary beam 3 being deflected by the scanning mirror structure 7 such that the primary beam 3 performs a scanning movement.
- the scanning movement of the primary beam 3 takes place in such a way that image information is projected onto a projection surface 200 on a projection object 20 - for example a wall or another screen.
- FIG. 3 shows a module 2 according to an embodiment of the present invention, the embodiment shown here being in particular substantially identical to the embodiment shown in FIG. 1.
- the module 2 comprises a detection means 231 for detecting a secondary signal 5 generated by reflection of the primary beam 3 at the object 4.
- for example, the secondary signal 5 is generated by reflection of the primary beam 3 at a projection area 4' on the object 4, the projection area 4' being arranged in the further subregion 402 of the object 4. Furthermore, detection of the shadowed subregion 401 is realized in that the detection means 231 is spaced apart from the second sub-module 22, by which the primary beam 3 is deflected.
- FIG. 4 shows a module 2 according to an embodiment of the present invention, the embodiment shown here being in particular substantially identical to the embodiment shown in FIG. 3.
- the module 2 comprises two spatially spaced-apart detection means 231, 232.
- the two detection means 231, 232 are arranged on both sides with respect to the second sub-module 22, the two detection means 231, 232 each being at an equal distance 230 from the second sub-module 22.
- the secondary signal 5, which is generated by reflection of the primary beam 3 at a projection area 4' (or pixel 4') generated on the object 4 during the scanning movement, is detected stereoscopically by the two detection means 231, 232.
- the secondary signal 5 comprises two secondary part signals 51, 52.
- a first secondary part signal 51 of the secondary signal 5 is detected by a first detection means 231 of the two detection means 231, 232, and a second secondary part signal 52 of the secondary signal 5 is detected by a second detection means 232 of the two detection means 231, 232.
- preferably, at least one image of the object 4 is detected in each case by the two detection means 231, 232 - which are, for example, two optical sensors.
- the at least two images of the object 4 detected by the at least two detection means 231, 232 are superimposed, wherein in particular a contour or an outline of the object 4 and/or a shadowed area 401 is detected (see reference numeral 200').
Priority Applications (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/304,967 US20170185157A1 (en) | 2014-04-28 | 2015-03-03 | Object recognition device |
KR1020167033029A KR20160148643A (ko) | 2014-04-28 | 2015-03-03 | 객체 인식 방법 |
CN201580021780.0A CN106233307A (zh) | 2014-04-28 | 2015-03-03 | 对象识别 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102014207932.2 | 2014-04-28 | ||
DE102014207932.2A DE102014207932A1 (de) | 2014-04-28 | 2014-04-28 | Objekterkennung |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2015165618A1 true WO2015165618A1 (de) | 2015-11-05 |
Family
ID=52596511
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/EP2015/054389 WO2015165618A1 (de) | 2014-04-28 | 2015-03-03 | Objekterkennung |
Country Status (5)
Country | Link |
---|---|
US (1) | US20170185157A1 (de) |
KR (1) | KR20160148643A (de) |
CN (1) | CN106233307A (de) |
DE (1) | DE102014207932A1 (de) |
WO (1) | WO2015165618A1 (de) |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
TWI573985B (zh) * | 2016-01-19 | 2017-03-11 | 台達電子工業股份有限公司 | 感測裝置安裝輔助裝置及其輔助調整感測範圍之方法 |
EP3551995B1 (de) * | 2016-12-09 | 2024-05-01 | TRUMPF Photonic Components GmbH | Lasersensormodul zur partikeldichtedetektion |
CN114820670A (zh) * | 2022-03-23 | 2022-07-29 | 合肥嘉石科普服务有限公司 | 一种激光投影互动方法、系统及装置 |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101441513B (zh) * | 2008-11-26 | 2010-08-11 | 北京科技大学 | 一种利用视觉进行非接触式人机交互的系统 |
JP2013521576A (ja) * | 2010-02-28 | 2013-06-10 | オスターハウト グループ インコーポレイテッド | 対話式ヘッド取付け型アイピース上での地域広告コンテンツ |
CN102722254B (zh) * | 2012-06-20 | 2015-06-17 | 清华大学深圳研究生院 | 一种定位交互方法及系统 |
CN103365489A (zh) * | 2013-06-25 | 2013-10-23 | 南京信息工程大学 | 一种交互雾屏投影系统 |
-
2014
- 2014-04-28 DE DE102014207932.2A patent/DE102014207932A1/de not_active Withdrawn
-
2015
- 2015-03-03 US US15/304,967 patent/US20170185157A1/en not_active Abandoned
- 2015-03-03 WO PCT/EP2015/054389 patent/WO2015165618A1/de active Application Filing
- 2015-03-03 CN CN201580021780.0A patent/CN106233307A/zh active Pending
- 2015-03-03 KR KR1020167033029A patent/KR20160148643A/ko unknown
Non-Patent Citations (4)
Title |
---|
ANDREW D WILSON: "PlayAnywhere: A Compact Interactive Tabletop Projection-Vision System", UIST 05. PROCEEDINGS OF THE 18TH. ANNUAL ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY,, vol. 7, no. 2, 23 October 2005 (2005-10-23), pages 83 - 92, XP002541983, ISBN: 978-1-59593-023-1 * |
ANONYMOUS: "Handheld projector - Wikipedia, the free encyclopedia", 19 April 2014 (2014-04-19), XP055192657, Retrieved from the Internet <URL:http://en.wikipedia.org/w/index.php?title=Handheld_projector&oldid=604830436> [retrieved on 20150601] * |
DAVID MOLYNEAUX ET AL: "Interactive Environment-Aware Handheld Projectors for Pervasive Computing Spaces", 18 June 2012, PERVASIVE COMPUTING, SPRINGER BERLIN HEIDELBERG, BERLIN, HEIDELBERG, PAGE(S) 197 - 215, ISBN: 978-3-642-31204-5, XP047009401 * |
LISA G COWAN ET AL: "ShadowPuppets", HUMAN FACTORS IN COMPUTING SYSTEMS, ACM, 2 PENN PLAZA, SUITE 701 NEW YORK NY 10121-0701 USA, 7 May 2011 (2011-05-07), pages 2707 - 2716, XP058041566, ISBN: 978-1-4503-0228-9, DOI: 10.1145/1978942.1979340 * |
Also Published As
Publication number | Publication date |
---|---|
US20170185157A1 (en) | 2017-06-29 |
KR20160148643A (ko) | 2016-12-26 |
CN106233307A (zh) | 2016-12-14 |
DE102014207932A1 (de) | 2015-10-29 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2924477B1 (de) | Optoelektronische vorrichtung und verfahren zur erfassung von objektinformationen | |
EP2772676B1 (de) | 3D-Kamera und Verfahren zur dreidimensionalen Überwachung eines Überwachungsbereichs | |
EP2048557B1 (de) | Optoelektronischer Sensor und mobile Vorrichtung sowie Verfahren zur Konfiguration | |
EP3049823B1 (de) | Verfahren zur steuerung eines mikrospiegelscanners und mikrospiegelscanner | |
DE102010017857A1 (de) | 3D-Sicherheitsvorrichtung und Verfahren zur Absicherung und Bedienung mindestens einer Maschine | |
DE102014207896A1 (de) | 3D Grob-Laserscanner | |
DE102015115526A1 (de) | Verfahren zur Zielerfassung von Zielobjekten, insbesondere zur Zielerfassung von Bedienelementen in einem Fahrzeug | |
EP2275990A1 (de) | 3D-Sensor | |
EP2019281B1 (de) | Verfahren zum Betrieb eines 3D-Sensors | |
WO2015165618A1 (de) | Objekterkennung | |
DE102012104218A1 (de) | Umgebungserkennungsvorrichtung und Umgebungserkennungsverfahren | |
WO2015135737A1 (de) | Verfahren und vorrichtung zum bereitstellen einer graphischen nutzerschnittstelle in einem fahrzeug | |
EP1352301B1 (de) | Verfahren zum steuern von geräten | |
EP3034984A1 (de) | Verfahren und vorrichtung zur lokalen stabilisierung eines strahlungsflecks auf einem entfernten zielobjekt | |
DE202015100273U1 (de) | Eingabevorrichtung | |
WO2015165613A1 (de) | Interaktives menü | |
WO2015165619A1 (de) | Modul und verfahren zum betrieb eines moduls | |
DE102014224599A1 (de) | Verfahren zum Betreiben einer Eingabevorrichtung, Eingabevorrichtung | |
EP2270424B1 (de) | Optoelektronischer Sensor und Verfahren zur Überwachung | |
DE102017212062A1 (de) | Beleuchtungsvorrichtung für den Innenraum eines Kraftfahrzeugs | |
EP3097511A1 (de) | Verfahren zur erkennung einer bewegungsbahn mindestens eines bewegten objektes innerhalb eines erfassungsbereiches, verfahren zur gestikerkennung unter einsatz eines derartigen erkennungsverfahrens sowie vorrichtung zur durchführung eines derartigen erkennungsverfahrens | |
DE102013221850A1 (de) | Verfahren und Vorrichtung zur Laserbeschriftung eines Bauteils | |
DE102019002885A1 (de) | Bedienvorrichtung | |
WO2015165609A1 (de) | Programmierbare bedienflächen | |
DE102012021885A1 (de) | Linsenanordnung für ein Umgebungserfassungssystem mit dieser ein "Stereo-Bild" erzeugt werden kann, sowie ein dazugehöriges Verfahren und ein damit ausgestattetes Fahrerassistenzsystem und/oder Kraftfahrzeug (Kfz) |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 15707158 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 15304967 Country of ref document: US |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 20167033029 Country of ref document: KR Kind code of ref document: A |
|
122 | Ep: pct application non-entry in european phase |
Ref document number: 15707158 Country of ref document: EP Kind code of ref document: A1 |