KR101746014B1 - Touchless 3d position sensor - Google Patents
Touchless 3D position sensor
- Publication number
- KR101746014B1 (application number KR1020150167715A)
- Authority
- KR
- South Korea
- Prior art keywords
- light
- light receiving
- output terminal
- converter
- photodiode array
- Prior art date
Links
Images
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/0304—Detection arrangements using opto-electronic means
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Position Input By Displaying (AREA)
Abstract
A non-contact 3D position sensing sensor according to an embodiment of the present invention detects light to determine the position and tilting angle of an object. The sensor includes a light emitting unit that emits light and a light receiving unit that detects the light reflected from the object. The light receiving unit includes a light receiving sensor that detects the optical signal entering the light receiving unit as reflected light, and a controller connected to the light receiving sensor to determine the 3D position or tilting angle of the object.
Description
Field of the Invention [0002] The present invention relates to a non-contact 3D position sensing sensor, and more particularly, to an apparatus that senses the 3D position or tilting angle of an object using an optical signal reflected by the object.
A display device such as a TV or computer monitor generally receives and processes information through button or mouse input connected directly to a computing device that is built into or connected to the display device.
In recent years, various main or auxiliary input methods have been adopted to overcome the limitations of these conventional input methods. The touch screen is one useful new input method. A touch screen uses a resistive film or ultrasonic waves on the screen side of a display device: when a user touches the screen with a finger or other input means, the resulting change in screen resistance or the generated surface acoustic wave is detected, the on-screen coordinates of the touched portion are determined, information is input at those coordinates, and the corresponding command is executed.
When such a touch screen input method is used, desired information can be input easily.
However, as display devices using this input method grow larger, the distance between the display device and the operator also increases. Compared with the conventional situation in which the operator could easily reach out and touch the screen, the screen now often lies beyond the operator's arm length, and the operator must move in order to touch its surface.
In addition, contact-type coordinate input methods such as the touch screen cause screen contamination through contact and repeatedly apply loads to the screen, degrading the durability of the display panel.
To solve these problems, non-contact coordinate input systems have been developed. Unlike the contact-type systems described above, a non-contact coordinate input system uses different input means and sensors, because the screen and the input means (finger, pen, etc.) do not make physical contact.
That is, a non-contact coordinate input system generally uses light: it irradiates light at the desired position on the screen and inputs the coordinates of the irradiated position, performing the same function as a touch screen. A laser beam is mainly used for this purpose, but the light is not limited to it; here the term "light" covers electromagnetic waves in all bands, including infrared and ultraviolet.
An example of a non-contact coordinate input system is the direct pointing system using light disclosed in Korean Patent Laid-Open Publication No. 2001-0026856. It allows a desired menu to be selected by pointing directly at the screen, without operating buttons on the remote controller. The system comprises a pointer that emits light such as a laser beam toward the indicated direction, a sensing unit that detects the light on the screen, a position calculator that computes the on-screen position from the sensing signal, a controller that displays the cursor at the calculated position, a CPU that executes the operation corresponding to the menu at the cursor position, and a cursor generating unit that generates and displays the cursor under the control of the CPU.
With this direct pointing system, the user can easily select a menu displayed on the screen simply by pointing light at it, without manipulating the remote controller.
Another example of such a non-contact type coordinate input system is the input / output device described in Japanese Patent Laid-Open No. 11-119910.
This input/output device allows a position on a display screen to be input in a non-contact manner: it detects an arbitrary position on the screen and performs input/output processing corresponding to the detected position. More specifically, light is irradiated onto the screen, and the irradiation position is detected by a plurality of matrix-arranged light switching elements provided integrally with the display device, with position detection performed according to the output state of the photoelectric conversion elements.
The above-mentioned input / output device is also a useful means for inputting information on a screen in a non-contact manner.
In both the invention of Korean Patent Laid-Open Publication No. 2001-0026856 and the invention of Japanese Laid-Open Patent Publication No. 11-119910, sensors capable of detecting light are arranged at lattice points corresponding to each coordinate, and coordinates are input from the information generated when the light emitted from a pointer reaches the respective sensors.
However, this type of coordinate input method becomes problematic when the display device is enlarged or the input coordinates must be divided more finely. Since a sensor such as an optical sensor must be placed at every coordinate, the number of sensors equals the number of input coordinates. When the sensors are arranged at a fixed interval, the required number of sensors grows in proportion to the square of the increase in screen size (width or height), so populating an enlarged input device with sensors is not practical.
Therefore, when the display device is enlarged, the number of sensors becomes a heavy burden. Likewise, when the coordinate resolution increases, the number of sensors grows in proportion to the square of the resolution.
In addition, in the method of arranging a sensor at each coordinate, the sensor area must be set apart from the pixel area because the sensors are opaque. To secure sufficient pixel area the sensors must therefore be very small, and the image quality may deteriorate.
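The quadratic growth in sensor count described above can be illustrated with a short calculation. The screen dimensions and sensor pitch below are hypothetical examples, not values from the specification:

```python
# Number of per-coordinate sensors needed for a lattice of sensors
# at a fixed pitch, for increasing screen sizes (hypothetical values).

def sensor_count(width_mm: int, height_mm: int, pitch_mm: int) -> int:
    """One sensor per lattice point at the given pitch."""
    return (width_mm // pitch_mm) * (height_mm // pitch_mm)

base = sensor_count(400, 300, 5)     # small screen
doubled = sensor_count(800, 600, 5)  # both dimensions doubled

# Doubling the screen size quadruples the sensor count:
assert doubled == 4 * base
```

The same quadratic relation holds when the pitch is halved to double the coordinate resolution on a fixed screen size.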
It is an object of the present invention to provide a
A
Here, the light receiving sensor includes first to fourth photodiode arrays, each arranged along the longitudinal direction of one side of a quadrangle. The output terminals of the first photodiode array and of the second photodiode array, which faces the first photodiode array, may be connected to a first differential amplifier; the output terminals of the third photodiode array and of the fourth photodiode array, which faces the third photodiode array, may be connected to a second differential amplifier.
Here, the output terminal of the first differential amplifier may be connected to the first converter, and the output terminal of the second differential amplifier may be connected to the second converter so as to be respectively converted into digital signals.
Here, the output terminal of the first converter and the output terminal of the second converter may be connected to the controller.
Each of the sub output terminals connected in parallel to the output terminals of the first through fourth photodiode arrays may be connected to the controller via a sub amplifier and a sub converter.
Here, the controller may determine the position or tilting angle in the X direction using the signal input from the output terminal of the first converter, determine the position or tilting angle in the Y direction using the signal input from the output terminal of the second converter, and determine the position of the object in the Z direction using the signals input from the output terminals of the sub-converters.
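The signal path described here can be sketched as follows. The array naming (left/right arrays opposing along X, top/bottom along Y), the use of the summed sub-channel signals as a Z estimate, and the sample values are illustrative assumptions, not the patent's exact circuit:

```python
from dataclasses import dataclass

@dataclass
class PhotodiodeReadout:
    """Voltages from the four photodiode arrays: left/right oppose
    each other along X, top/bottom oppose each other along Y."""
    v_left: float
    v_right: float
    v_top: float
    v_bottom: float

def estimate_signals(r: PhotodiodeReadout) -> tuple[float, float, float]:
    # First differential amplifier: opposing X arrays -> X signal.
    v_x = r.v_left - r.v_right
    # Second differential amplifier: opposing Y arrays -> Y signal.
    v_y = r.v_top - r.v_bottom
    # Sub-channels: the total received light varies with the object's
    # height, so the sum is used here as a Z signal (assumption).
    v_z = r.v_left + r.v_right + r.v_top + r.v_bottom
    return v_x, v_y, v_z

# An object shifted toward the right array yields a negative X signal
# and an unchanged (zero) Y signal:
x, y, z = estimate_signals(PhotodiodeReadout(0.2, 0.6, 0.4, 0.4))
```

The differential structure cancels common-mode light (e.g., ambient illumination hitting all arrays equally), which is a standard motivation for this kind of opposing-pair readout.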
The
In addition, the operation of the user can be accurately recognized.
Also, it can be realized in chip form and can be manufactured in small size.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 is a schematic diagram showing the coordinate definition of a non-contact 3D position sensing sensor and an object for recognition according to an embodiment of the present invention.
FIG. 2 is a plan view and a cross-sectional view of a light receiving unit according to an embodiment of the present invention.
FIG. 3 is a circuit diagram of a light receiving unit according to an embodiment of the present invention.
FIG. 4 is a conceptual diagram for determining the position or tilting angle of an object through mapping in a controller according to an embodiment of the present invention.
FIG. 5 is a graph showing the change in voltage value according to the detected light.
The specific structural or functional descriptions of the embodiments disclosed herein are provided for illustrative purposes only and are not intended to limit the scope of the inventive concept, which may be embodied in many different forms and is not limited to the embodiments set forth herein.
Since the embodiments according to the concept of the present invention may be changed in various ways and take various forms, particular embodiments are illustrated in the drawings and described in detail herein. This is not intended to limit the invention to the particular forms disclosed; on the contrary, the invention covers all modifications, equivalents, and alternatives falling within its spirit and scope.
The terms first, second, etc. may be used to describe various elements, but the elements are not limited by these terms. The terms serve only to distinguish one element from another; for example, without departing from the scope of rights according to the concept of the present invention, a first element may be referred to as a second element, and similarly a second element may be referred to as a first element.
When an element is referred to as being "connected" or "coupled" to another element, it may be directly connected or coupled to the other element, or intervening elements may be present. In contrast, when an element is referred to as being "directly connected" or "directly coupled" to another element, there are no intervening elements. Other expressions describing the relationship between elements, such as "between" versus "directly between" or "adjacent to" versus "directly adjacent to", should be interpreted in the same way.
The terminology used herein describes particular embodiments only and is not intended to limit the invention. Singular expressions include plural expressions unless the context clearly indicates otherwise. In this specification, terms such as "comprises" or "having" specify the presence of the stated features, numbers, steps, operations, elements, parts, or combinations thereof, but do not preclude the presence or addition of one or more other features, numbers, steps, operations, elements, parts, or combinations thereof.
Unless defined otherwise, all terms used herein, including technical and scientific terms, have the same meanings as commonly understood by one of ordinary skill in the art to which this invention belongs. Terms such as those defined in commonly used dictionaries are to be interpreted as having meanings consistent with their meanings in the context of the relevant art and, unless explicitly defined herein, are not to be interpreted in an idealized or overly formal sense.
FIG. 1 is a schematic diagram showing the coordinate definition of a non-contact 3D position sensing sensor and an object for recognition according to an embodiment of the present invention.
As shown in FIG. 1, the
In the three-dimensional Cartesian coordinate system shown in FIG. 1, the angle θx that the plane on which the object lies forms with the X axis of the XY plane, and the angle θy that it forms with the Y axis, are the tilting angles (information) used in the embodiments of the present invention.
A
The
The light emitted from the
The
2 is a plan view and a cross-sectional view of a light receiving unit according to an embodiment of the present invention.
3 is a circuit diagram of a light-receiving unit according to an embodiment of the present invention.
2 and 3, the
The shapes of the
The
Also, in FIG. 3, when an object is placed in a blurred area where the fields of view (FOV) overlap, it is difficult to determine in which direction the object lies. However, when the overlapping area is not large relative to the size of the object for recognition, the position or tilt information of the object can still be determined from the change in the amount of light incident on the photodiode arrays at both ends.
3, the
3, the output terminal of the first
Further, all the other voltage signals connected in parallel at the output terminals of the first to
The
4 is a conceptual diagram for determining the position or tilting angle of an object through mapping in a controller according to an embodiment of the present invention.
The
A conventional gesture sensor outputs the gesture corresponding to the position of the object above the sensor or its tilting angle. In contrast, the non-contact 3D position sensor according to an embodiment of the present invention includes a controller (3D mapper) that maps the 3D position (X, Y, Z) and the X- and Y-direction tilting angles (θx, θy) of the object to 3D position detection information (X', Y', Z') or to tilting angle and height information (θx', θy', Z') as shown in FIG. 4(a), and can output 2D position information (FIG. 4(c)) or 1D position information and tilting angle (FIG. 4(d)) according to the user's intention.
Equation (1) below is an example of 3D mapping by the controller.
[Equation 1]
In Equation 1, (a) shows a case where the electrical signals (Vx, Vy, Vz) are linearly mapped to 3D position information, and (b) shows an example in which an
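Equation 1 itself is not reproduced in this text. The following is only an illustrative sketch of the linear-mapping case (a), with hypothetical gain and offset constants in place of the patent's actual coefficients:

```python
# Illustrative linear 3D mapping of electrical signals (Vx, Vy, Vz)
# to position information (X', Y', Z'). The gains and offsets are
# hypothetical calibration constants, not values from the patent.

GAIN = (10.0, 10.0, 5.0)   # assumed mm-per-volt scale for each axis
OFFSET = (0.0, 0.0, 2.0)   # assumed mm offset for each axis

def map_3d(v_x: float, v_y: float, v_z: float) -> tuple[float, float, float]:
    """Linear map: position' = gain * signal + offset, per axis."""
    signals = (v_x, v_y, v_z)
    return tuple(g * v + o for g, v, o in zip(GAIN, signals, OFFSET))

pos = map_3d(0.5, -0.2, 1.0)  # -> (5.0, -2.0, 7.0)
```

A nonlinear variant (e.g., a lookup table or polynomial per axis) could be substituted for the per-axis linear map without changing the surrounding readout chain.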
The operation of the non-contact 3D position sensor according to an embodiment of the present invention is described in detail below with reference to FIGS. 1 to 3. The left side of the
Considering FIG. 1, when the object is tilted in the Y-axis direction (up or down), light is detected in PD_T, which is the
Further, the detecting light in,
Furthermore, when the object moves linearly and is shifted in the +X direction, the signal is detected in PD_R.
FIG. 5 is a graph showing the change in voltage value according to the detected light.
FIG. 5 shows the changes in voltage level obtained by connecting the respective terminals of an oscilloscope to the output terminal of the first
Referring to FIG. 5, it can be seen that the voltage values of the
The (X) terminal signals of PD_L and PD_R pass through the first
The
The sub-amplifier 233 amplifies the signals input to the first through
The light receiving unit as described above can be implemented as a single chip, and the
100
210
221
223
231 First
233
242
Claims (6)
A light emitting portion for emitting light; And
And a light receiving unit for detecting reflected light reflected from the object by the light emitted from the light emitting unit,
The light receiving unit includes a light receiving sensor for detecting an optical signal flowing into the light receiving unit by reflected light;
And a controller connected to the light receiving sensor to determine a 3D position or a tilting angle of the object,
Wherein the light receiving sensor includes first through fourth photodiode arrays, each arranged along the longitudinal direction of one side of a quadrangle; the output terminals of the first photodiode array and of the second photodiode array, which is positioned facing the first photodiode array, are connected to the first differential amplifier; and the output terminals of the third photodiode array and of the fourth photodiode array, which is positioned facing the third photodiode array, are connected to the second differential amplifier.
Wherein the output terminal of the first differential amplifier is connected to the first converter and the output terminal of the second differential amplifier is connected to the second converter and converted into a digital signal.
Wherein the output terminal of the first converter and the output terminal of the second converter are connected to the controller.
And each sub output terminal connected in parallel to each of the output terminals of the first through fourth photodiode arrays is connected to the controller via the sub amplifier and the sub converter.
The controller determines a position or tilting angle in the X direction by using a signal input from the output terminal of the first converter and determines a position or tilting angle in the Y direction using a signal input at the output terminal of the second converter And determines the position of the object in the Z direction by using a signal input from an output terminal of each of the sub-converters.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020150167715A KR101746014B1 (en) | 2015-11-27 | 2015-11-27 | Touchless 3d position sensor |
Publications (2)
Publication Number | Publication Date |
---|---|
KR20170062199A KR20170062199A (en) | 2017-06-07 |
KR101746014B1 true KR101746014B1 (en) | 2017-06-12 |
Family
ID=59219556
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
KR1020150167715A KR101746014B1 (en) | 2015-11-27 | 2015-11-27 | Touchless 3d position sensor |
Country Status (1)
Country | Link |
---|---|
KR (1) | KR101746014B1 (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR102609695B1 (en) * | 2022-12-02 | 2023-12-05 | (주)대영엔씨디 | Dry road and building floor cutting device equipped with a dust inhaler capable of measuring cutting depth and distance and method for cutting road and building floors using the same |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
KR101456983B1 (en) * | 2013-11-20 | 2014-11-26 | 주식회사 루멘스 | Angle sensing type command input apparatus, contactless joystick and Angle sensing command input method |
- 2015-11-27: KR application KR1020150167715A filed (patent KR101746014B1, active, IP Right Grant)
Also Published As
Publication number | Publication date |
---|---|
KR20170062199A (en) | 2017-06-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10324566B2 (en) | Enhanced interaction touch system | |
JP5166713B2 (en) | Position detection system using laser speckle | |
US9448645B2 (en) | Digitizer using multiple stylus sensing techniques | |
CN102959494B (en) | An optical navigation module with capacitive sensor | |
US20070097097A1 (en) | Laser type coordinate sensing system for touch module | |
KR20160135179A (en) | Frequency conversion in a touch sensor | |
US20060055672A1 (en) | Input control for apparatuses | |
EP1687698A2 (en) | Light emitting stylus and user input device using same | |
CN102741782A (en) | Methods and systems for position detection | |
US20140035812A1 (en) | Gesture sensing device | |
US20160209929A1 (en) | Method and system for three-dimensional motion-tracking | |
TWI479391B (en) | Optical touch control device and method for determining coordinate thereof | |
US8780084B2 (en) | Apparatus for detecting a touching position on a flat panel display and a method thereof | |
Tsuji et al. | A layered 3D touch screen using capacitance measurement | |
KR101746014B1 (en) | Touchless 3d position sensor | |
KR20100066671A (en) | Touch display apparatus | |
TW201535165A (en) | Pen-type optical indexing apparatus and method for controlling the same | |
WO2009114821A9 (en) | Apparatus and method of finger-motion based navigation using optical sensing | |
US20150293612A1 (en) | Pen-type optical indexing apparatus and method for controlling the same | |
US20230297193A1 (en) | Detector system | |
TWI400641B (en) | Optical touch apparatus | |
US20110304586A1 (en) | Infrared type handwriting input apparatus and scanning method | |
KR101376907B1 (en) | Input device | |
KR100339923B1 (en) | Coordinate inputting apparatus using a light and its method | |
US8896553B1 (en) | Hybrid sensor module |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
E701 | Decision to grant or registration of patent right | ||
GRNT | Written decision to grant |