US11068108B2 - Input device - Google Patents
Input device
- Publication number
- US11068108B2 (application US16/813,762)
- Authority
- US
- United States
- Prior art keywords
- sensing layer
- input device
- single object
- specific non-contact operation
- unit
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Expired - Fee Related
Classifications
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/033—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/03—Arrangements for converting the position or the displacement of a member into a coded form
- G06F3/041—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/042—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
- G06F3/0421—Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by interrupting or reflecting a light beam, e.g. optical touch-screen
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/04812—Interaction techniques based on cursor appearance or behaviour, e.g. being affected by the presence of displayed objects
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0481—Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
- G06F3/0482—Interaction with lists of selectable items, e.g. menus
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0484—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
- G06F3/04845—Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range for image manipulation, e.g. dragging, rotation, expansion or change of colour
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/048—Interaction techniques based on graphical user interfaces [GUI]
- G06F3/0487—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883—Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04101—2.5D-digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface and also measures the distance of the input means within a short range in the Z direction, possibly with a separate measurement setup
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04106—Multi-sensing digitiser, i.e. digitiser using at least two different sensing technologies simultaneously or alternatively, e.g. for detecting pen and finger, for saving power or for improving position detection
-
- G—PHYSICS
- G06—COMPUTING OR CALCULATING; COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F2203/00—Indexing scheme relating to G06F3/00 - G06F3/048
- G06F2203/041—Indexing scheme relating to G06F3/041 - G06F3/045
- G06F2203/04108—Touchless 2D- digitiser, i.e. digitiser detecting the X/Y position of the input means, finger or stylus, also when it does not touch, but is proximate to the digitiser's interaction surface without distance measurement in the Z direction
Definitions
- the disclosure relates to an input device for receiving a specific non-contact operation performed on an operation screen by an object.
- a conventional input device forms an aerial image (real image) indicating an operation screen in an aerial display area, and detects movement of a finger of a user in a detection area facing the aerial image, thereby determining that a specific non-contact operation on the operation screen has been performed.
- the disclosure provides an input device that can precisely determine the presence or absence of a specific non-contact operation on an operation screen.
- an input device for receiving a specific non-contact operation performed on an operation screen by an object, including: a display control unit that displays the operation screen on a display surface; a first detection unit that detects a passing state of the object in a first sensing layer in air formed to face the display surface; a second detection unit that detects a passing state of the object in a second sensing layer in air formed between the display surface and the first sensing layer, and a determination unit that determines presence or absence of the specific non-contact operation performed by the object based on respective detection results of the first detection unit and the second detection unit.
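- as a rough illustration of the architecture just described, the sketch below models the two detection results and the determination rule in Python; it is a minimal sketch, and every class, field, and function name is hypothetical rather than taken from the patent.

```python
from dataclasses import dataclass, field

@dataclass
class DetectionResult:
    """Passing state of the object in one aerial sensing layer (illustrative model)."""
    passage_places: int  # number of distinct cross-sections of the object in the layer
    positions: list[tuple[float, float]] = field(default_factory=list)  # 2D coordinates

def determine_operation(first: DetectionResult, second: DetectionResult) -> str | None:
    """Judge the specific non-contact operation from both layers' detection results."""
    if first.passage_places == 1 and second.passage_places == 1:
        return "single touch gesture"
    if first.passage_places == 2 and second.passage_places == 2:
        return "multi-touch gesture"
    return None  # no specific non-contact operation recognized
```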
- FIG. 1 is a plan view illustrating an input device according to Embodiment 1.
- FIG. 2 is a side view illustrating the input device according to Embodiment 1.
- FIG. 3 is a block diagram illustrating a configuration of the input device according to Embodiment 1.
- FIG. 4 is a flowchart illustrating an operation flow of the input device according to Embodiment 1.
- FIG. 5A is a diagram for description of a state in which an object passes through a first sensing layer at one place in the input device according to Embodiment 1.
- FIG. 5B is a diagram for description of a state in which the object passes through a second sensing layer at one place in the input device according to Embodiment 1.
- FIG. 6 is a diagram for description of a state in which the object passes through each of the first sensing layer and the second sensing layer at two places in the input device according to Embodiment 1.
- FIG. 7 is a diagram for description of a state in which the object passes through each of the first sensing layer and the second sensing layer at three places in the input device according to Embodiment 1.
- FIG. 8 is a plan view illustrating an input device according to Embodiment 2.
- FIG. 9 is a block diagram illustrating a configuration of the input device according to Embodiment 2.
- FIG. 10 is a flowchart illustrating an operation flow of the input device according to Embodiment 2.
- FIG. 1 is a plan view illustrating the input device 2 according to Embodiment 1.
- FIG. 2 is a side view illustrating the input device 2 according to Embodiment 1.
- FIG. 3 is a block diagram illustrating a configuration of the input device 2 according to Embodiment 1.
- the input device 2 includes a display unit 4 , a first detection unit 6 , and a second detection unit 8 .
- the input device 2 is applied as a user interface for operating a device (not illustrated) used in, for example, the food processing field or the medical field in a non-contact manner by an object 18 (for example, a finger of a user).
- the display unit 4 is, for example, a liquid crystal display panel.
- the display unit 4 has a display surface 12 for displaying an operation screen 10 .
- the operation screen 10 is, for example, a menu screen or the like for operating the device.
- an icon 14, which is an example of a predetermined display indicating an operation menu or the like of the device, is displayed on the operation screen 10.
- the first detection unit 6 detects a passing state of the object 18 in a first sensing layer 16 in air formed to face the display surface 12 of the display unit 4 . Specifically, the first detection unit 6 detects the number of passage places of the object 18 in the first sensing layer 16 .
- the first sensing layer 16 corresponds to a virtual plane (XY plane) formed at an aerial position substantially parallel to the display surface 12 of the display unit 4 .
- the passage place of the object 18 in the first sensing layer 16 refers to a cross-sectional region of the object 18 in the first sensing layer 16 . For example, when one finger of the user passes through the first sensing layer 16 , there is one passage place of the object 18 in the first sensing layer 16 . Further, for example, when two fingers of the user pass through the first sensing layer 16 , there are two passage places of the object 18 in the first sensing layer 16 .
- the first detection unit 6 includes, for example, a scan sensor, and is disposed to face a corner of the display unit 4 as illustrated in FIG. 1 . As illustrated in FIG. 2 , the first detection unit 6 includes a first light emitting unit 20 and a first light receiving unit 22 .
- the first light emitting unit 20 scans an infrared laser in the first sensing layer 16 in a two-dimensional manner.
- the first light receiving unit 22 receives and detects light reflected by the object 18 passing through the first sensing layer 16 .
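- the number of passage places can be recovered from such a scan by counting contiguous runs of reflection samples; a minimal sketch, assuming the scan sensor reports one reflection flag per scan angle (the sampling model and all names are assumptions, not from the patent):

```python
def count_passage_places(reflected: list[bool]) -> int:
    """Count contiguous clusters of scan angles that returned a reflection.

    Each cluster approximates one cross-section of the object in the layer,
    i.e. one passage place (two separated fingers give two clusters).
    """
    places = 0
    previous = False
    for hit in reflected:
        if hit and not previous:
            places += 1  # rising edge: a new cluster begins
        previous = hit
    return places

# Example: two fingers produce two separated clusters -> 2 passage places.
assert count_passage_places([False, True, True, False, False, True, False]) == 2
```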
- the second detection unit 8 detects a passing state of the object 18 in a second sensing layer 24 in air formed between the display surface 12 of the display unit 4 and the first sensing layer 16. Specifically, the second detection unit 8 detects the number of passage places of the object 18 in the second sensing layer 24. As illustrated in FIG. 2, a distance D between the second sensing layer 24 and the display surface 12 of the display unit 4 is set to a distance (for example, about 1 cm to several centimeters) at which the object 18 does not directly come into contact with the display surface 12 of the display unit 4 when the object 18 passes through the second sensing layer 24.
- the second sensing layer 24 is a virtual plane (XY plane) formed at an aerial position substantially parallel to the display surface 12 of the display unit 4 .
- the passage place of the object 18 in the second sensing layer 24 refers to a cross-sectional region of the object 18 in the second sensing layer 24. For example, when one finger of the user passes through the second sensing layer 24, there is one passage place of the object 18 in the second sensing layer 24. Further, for example, when two fingers of the user pass through the second sensing layer 24, there are two passage places of the object 18 in the second sensing layer 24.
- the second detection unit 8 includes, for example, a scan sensor, and is disposed to face the corner of the display unit 4 as illustrated in FIG. 1 . As illustrated in FIG. 2 , the second detection unit 8 includes a second light emitting unit 26 and a second light receiving unit 28 .
- the second light emitting unit 26 scans an infrared laser in the second sensing layer 24 in a two-dimensional manner.
- the second light receiving unit 28 receives and detects light reflected by the object 18 passing through the second sensing layer 24 .
- the input device 2 further includes a calculation processing unit 30 .
- the calculation processing unit 30 includes a detection processing unit 32 , a distance measurement calculation unit 34 , a determination unit 36 , an operation processing unit 38 , and a display control unit 40 .
- the detection processing unit 32 computes the number of passage places of the object 18 in the first sensing layer 16 based on a detection signal from the first detection unit 6 . In addition, the detection processing unit 32 computes the number of passage places of the object 18 in the second sensing layer 24 based on a detection signal from the second detection unit 8 .
- the distance measurement calculation unit 34 computes a position (two-dimensional coordinates) of the object 18 in the first sensing layer 16 based on a detection signal from the first detection unit 6 . In addition, the distance measurement calculation unit 34 computes a position (two-dimensional coordinates) of the object 18 in the second sensing layer 24 based on a detection signal from the second detection unit 8 .
- the determination unit 36 determines the presence or absence of a specific non-contact operation performed on the operation screen 10 by the object 18 based on respective computation results of the detection processing unit 32 and the distance measurement calculation unit 34.
- the specific non-contact operation is, for example, a non-contact single touch gesture or multi-touch gesture performed by the finger of the user on the operation screen 10.
- the single touch gesture is a gesture performed by one finger (for example, the index finger) of the user, and is, for example, a gesture such as a tap.
- the multi-touch gesture is a gesture performed by two fingers (for example, the index finger and the thumb) of the user, and is, for example, a gesture such as pinch-in, pinch-out, rotation, etc. A determination process by the determination unit 36 will be described later in detail.
- the operation processing unit 38 executes processing corresponding to the specific non-contact operation based on a computation result of the distance measurement calculation unit 34 and a determination result of the determination unit 36 . For example, when the user performs a single touch gesture on the icon 14 on the operation screen 10 in a non-contact manner, the operation processing unit 38 executes a process of selecting the icon 14 , etc. In addition, for example, when the user performs a multi-touch gesture on the operation screen 10 in a non-contact manner, the operation processing unit 38 executes a process of enlarging or reducing the display magnification of the operation screen 10 , etc.
- the display control unit 40 controls display content on the display surface 12 of the display unit 4 . Specifically, the display control unit 40 executes a process of displaying the operation screen 10 on the display surface 12 of the display unit 4 . In addition, based on respective computation results of the detection processing unit 32 and the distance measurement calculation unit 34 , the display control unit 40 executes a process of displaying a cursor 42 (see (b) of FIG. 5A described later) on the operation screen 10 when the object 18 passes through the first sensing layer 16 . In this instance, the cursor 42 is displayed at a position on the operation screen 10 corresponding to a position of the object 18 in the first sensing layer 16 .
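- the cursor-positioning step can be illustrated with a small coordinate mapping; a hedged sketch, assuming the first sensing layer and the display surface are parallel, share an origin, and differ only in scale (the units and all names are assumptions):

```python
def layer_to_screen(x_mm: float, y_mm: float,
                    layer_w_mm: float, layer_h_mm: float,
                    screen_w_px: int, screen_h_px: int) -> tuple[int, int]:
    """Map the object's position in the sensing layer to cursor pixel coordinates."""
    px = round(x_mm / layer_w_mm * (screen_w_px - 1))
    py = round(y_mm / layer_h_mm * (screen_h_px - 1))
    # Clamp so the cursor always stays on the operation screen.
    return (min(max(px, 0), screen_w_px - 1),
            min(max(py, 0), screen_h_px - 1))
```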
- FIG. 4 is a flowchart illustrating an operation flow of the input device 2 according to Embodiment 1.
- FIG. 5A is a diagram for description of a state in which the object 18 passes through the first sensing layer 16 at one place in the input device 2 according to Embodiment 1.
- FIG. 5B is a diagram for description of a state in which the object 18 passes through the second sensing layer 24 at one place in the input device 2 according to Embodiment 1.
- FIG. 6 is a diagram for description of a state in which the object 18 passes through each of the first sensing layer 16 and the second sensing layer 24 at two places in the input device 2 according to Embodiment 1.
- FIG. 7 is a diagram for description of a state in which the object 18 passes through each of the first sensing layer 16 and the second sensing layer 24 at three places in the input device 2 according to Embodiment 1.
- the first detection unit 6 detects the number of passage places of the object 18 (one finger) in the first sensing layer 16 (S 101 ).
- the detection processing unit 32 computes the number of passage places of the object 18 in the first sensing layer 16 as “one place” based on a detection signal from the first detection unit 6 (“one place” in S 102 ).
- the display control unit 40 displays the cursor 42 on the operation screen 10 when the object 18 passes through the first sensing layer 16 based on respective computation results of the detection processing unit 32 and the distance measurement calculation unit 34 (S 103 ).
- the cursor 42 moves on the operation screen 10 following movement of the finger of the user.
- a sound may be output from a speaker (not illustrated) of the display unit 4 at the same time as the cursor 42 is displayed on the operation screen 10 .
- the second detection unit 8 detects the number of passage places of the object 18 (one finger) in the second sensing layer 24 (S 104 ).
- the detection processing unit 32 computes the number of passage places of the object 18 in the second sensing layer 24 as “one place” based on a detection signal from the second detection unit 8 .
- the determination unit 36 determines that a single touch gesture is performed (S 105 ).
- a dedicated button or icon other than the icon 14 may be displayed on the operation screen 10 .
- the first detection unit 6 detects the number of passage places of the object 18 (two fingers) in the first sensing layer 16 (S 101).
- the detection processing unit 32 computes the number of passage places of the object 18 in the first sensing layer 16 as “two places” based on a detection signal from the first detection unit 6 (“two places” in S 102).
- based on respective computation results of the detection processing unit 32 and the distance measurement calculation unit 34, the display control unit 40 displays the cursor 42 (see (b) of FIG. 5A) on the operation screen 10 when the object 18 passes through the first sensing layer 16 (S 106).
- the second detection unit 8 detects the number of passage places of the object 18 (finger) in the second sensing layer 24 (S 107 ). Moreover, when two fingers of the user pass through the first sensing layer 16 , the number of passage places of the object 18 in the second sensing layer 24 is two or one.
- the detection processing unit 32 computes the number of passage places of the object 18 in the second sensing layer 24 as “two places” based on a detection signal from the second detection unit 8 (“two places” in S 108 ).
- the determination unit 36 determines that a multi-touch gesture has been performed (S 109 ).
- the detection processing unit 32 computes the number of passage places of the object 18 in the second sensing layer 24 as “one place” based on a detection signal from the second detection unit 8 (“one place” in S 108 ).
- the determination unit 36 determines that a single touch gesture is performed after a predetermined waiting time elapses from the time the object 18 passes through the second sensing layer 24 at one place (S 105 ).
- the first detection unit 6 detects the number of passage places of the object 18 (three fingers) in the first sensing layer 16 (S 101 ).
- the detection processing unit 32 computes the number of passage places of the object 18 in the first sensing layer 16 as “three places” based on a detection signal from the first detection unit 6 (“three places” in S 102 ).
- based on respective computation results of the detection processing unit 32 and the distance measurement calculation unit 34, the display control unit 40 displays the cursor 42 (see (b) of FIG. 5A) on the operation screen 10 when the object 18 passes through the first sensing layer 16 (S 110).
- the second detection unit 8 detects the number of passage places of the object 18 (finger) in the second sensing layer 24 (S 111 ). Moreover, when three fingers of the user pass through the first sensing layer 16 , the number of passage places of the object 18 in the second sensing layer 24 is three or fewer.
- the detection processing unit 32 computes the number of passage places of the object 18 in the second sensing layer 24 as “three places” based on a detection signal from the second detection unit 8 .
- the determination unit 36 determines that neither a single touch gesture nor a multi-touch gesture is performed (S 112). In this instance, the same determination is made when the object 18 passes through the second sensing layer 24 at two places or at one place.
- in this case as well, the process proceeds from step S 102 to S 110 in the same manner as described above to execute the respective processes of S 110 to S 112, and it is determined that neither a single touch gesture nor a multi-touch gesture is performed.
- the determination unit 36 determines the presence or absence of a specific non-contact operation performed by the object 18 based on the number of passage places of the object 18 in each of the first sensing layer 16 and the second sensing layer 24 .
- when the passing states in the two layers match the specific non-contact operation (for example, one passage place in each layer for a single touch gesture, or two passage places in each layer for a multi-touch gesture), the determination unit 36 determines that the specific non-contact operation on the operation screen 10 is performed.
- otherwise (for example, when the object 18 passes through the first sensing layer 16 at three or more places), the determination unit 36 determines that the specific non-contact operation on the operation screen 10 is not performed.
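- the decision rules of steps S 101 to S 112 can be condensed into one short function; a sketch following the description above, in which the predetermined waiting time is modeled by a boolean flag and all names are illustrative:

```python
def determine_gesture(places_layer1: int, places_layer2: int,
                      waiting_time_elapsed: bool = True) -> str | None:
    """Hypothetical rendering of the S 101 to S 112 decision flow.

    waiting_time_elapsed models the predetermined waiting time: with two
    passage places in the first layer but only one in the second, a single
    touch gesture is committed only after that time has elapsed.
    """
    if places_layer1 == 1 and places_layer2 == 1:
        return "single touch gesture"      # S 105
    if places_layer1 == 2:
        if places_layer2 == 2:
            return "multi-touch gesture"   # S 109
        if places_layer2 == 1 and waiting_time_elapsed:
            return "single touch gesture"  # S 105, after the waiting time
    return None  # neither gesture (S 112), e.g. three or more passage places
```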
- with the input device 2, it is possible to precisely determine the presence or absence of a specific non-contact operation on the operation screen 10, and it is possible to avoid an erroneous operation and the like of a device having the input device 2 as a user interface.
- FIG. 8 is a plan view illustrating the input device 2 A according to Embodiment 2.
- FIG. 9 is a block diagram illustrating a configuration of the input device 2 A according to Embodiment 2.
- the same components as those in Embodiment 1 are denoted by the same reference numerals, and the description thereof is omitted.
- in the input device 2 A, the respective configurations of a first detection unit 6 A and a second detection unit 8 A are different from those in Embodiment 1.
- the first detection unit 6 A includes an optical array sensor, and has a plurality of first light emitting units 44 , a plurality of first light receiving units 46 , a plurality of second light emitting units 48 , and a plurality of second light receiving units 50 .
- the plurality of first light emitting units 44 is disposed at intervals along a first side 52 a of the display unit 4 .
- the plurality of first light receiving units 46 is disposed at intervals along a second side 52 b facing the first side 52 a of the display unit 4 . That is, each of the plurality of first light emitting units 44 is disposed corresponding to each of the plurality of first light receiving units 46 .
- Each of the plurality of first light emitting units 44 emits an infrared ray linearly (indicated by a one-dot chain line in FIG. 8 ) toward the plurality of first light receiving units 46.
- Each of the plurality of first light receiving units 46 receives the infrared ray from the plurality of first light emitting units 44 .
- the plurality of second light emitting units 48 is disposed at intervals along a third side 52 c of the display unit 4 .
- the plurality of second light receiving units 50 is disposed at intervals along a fourth side 52 d facing the third side 52 c of the display unit 4 . That is, each of the plurality of second light emitting units 48 is disposed corresponding to each of the plurality of second light receiving units 50 .
- Each of the plurality of second light emitting units 48 emits an infrared ray linearly (indicated by a one-dot chain line in FIG. 8 ) toward the plurality of second light receiving units 50.
- Each of the plurality of second light receiving units 50 receives the infrared ray from the plurality of second light emitting units 48 .
- a first sensing layer 16 A is formed in a region surrounded by the plurality of first light emitting units 44 , the plurality of first light receiving units 46 , the plurality of second light emitting units 48 , and the plurality of second light receiving units 50 .
- when the object 18 passes through a predetermined position of the first sensing layer 16 A, the light from the first light emitting unit 44 and the second light emitting unit 48 corresponding to the predetermined position is blocked by the object 18, and thus the light is not received by the first light receiving unit 46 and the second light receiving unit 50 corresponding to the predetermined position.
- in this manner, the first detection unit 6 A detects the size (area) of the object 18 at the predetermined position of the first sensing layer 16 A.
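- one plausible way to turn the blocked-beam pattern into the size (area) of the object is to multiply the blocked extents along the two orthogonal beam directions; a sketch assuming evenly pitched emitters (the pitch model and all names are assumptions):

```python
def estimate_object_area(blocked_x: list[bool], blocked_y: list[bool],
                         pitch_mm: float) -> float:
    """Approximate the object's cross-sectional area in the sensing layer.

    blocked_x / blocked_y: which beams of the two orthogonal emitter rows
    are interrupted; pitch_mm: spacing between adjacent beams. The area is
    approximated as the product of the blocked extents in X and Y.
    """
    width_mm = sum(blocked_x) * pitch_mm
    height_mm = sum(blocked_y) * pitch_mm
    return width_mm * height_mm

# Example: 2 beams blocked in X and 3 in Y at a 5 mm pitch -> 150 mm^2.
assert estimate_object_area([True, True, False], [True, True, True], 5.0) == 150.0
```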
- the second detection unit 8 A detects the presence or absence of the object 18 in the second sensing layer 24 . Similar to Embodiment 1, the second detection unit 8 A includes a scan sensor. Moreover, the second detection unit 8 A may include an optical array sensor similarly to the first detection unit 6 A.
- the detection processing unit 32 A computes the size of the object 18 in the first sensing layer 16 A based on a detection signal from the first detection unit 6 A. In addition, the detection processing unit 32 A determines the presence or absence of the object 18 in the second sensing layer 24 based on a detection signal from the second detection unit 8 A.
- the determination unit 36 A determines the presence or absence of a specific non-contact operation performed on the operation screen 10 by the object 18 based on respective computation results of the detection processing unit 32 A and the distance measurement calculation unit 34 .
- the specific non-contact operation is a non-contact single touch gesture on the operation screen 10 , for example. A determination process by the determination unit 36 A will be described later in detail.
- FIG. 10 is a flowchart illustrating an operation flow of the input device 2 A according to Embodiment 2.
- the first detection unit 6 A detects the size of the object 18 in the first sensing layer 16 A (S 201 ).
- the detection processing unit 32 A computes the size of the object 18 in the first sensing layer 16 A based on a detection signal from the first detection unit 6 A.
- when one finger of the user passes through the first sensing layer 16 A, the detection processing unit 32 A determines that the size of the object 18 in the first sensing layer 16 A is equal to or smaller than a threshold (YES in S 202).
- the threshold is an area corresponding to the average size of one finger.
- the display control unit 40 displays the cursor 42 (see (b) of FIG. 5A ) on the operation screen 10 when the object 18 passes through the first sensing layer 16 A (S 203 ).
- the second detection unit 8 A detects the presence of the object 18 (one finger) in the second sensing layer 24 (S 204 ).
- the detection processing unit 32 A determines that the object 18 is present in the second sensing layer 24 based on a detection signal from the second detection unit 8 A.
- the determination unit 36 A determines that a single touch gesture is performed when the object 18 passes through the second sensing layer 24 in a state in which the cursor 42 is superimposed on the icon 14 (S 205).
- on the other hand, when a wrist of the user passes through the first sensing layer 16 A, the detection processing unit 32 A determines in step S 202 that the size of the object 18 in the first sensing layer 16 A exceeds the threshold (NO in S 202). In this case, the display control unit 40 does not display the cursor 42 on the operation screen 10.
- the second detection unit 8 A detects the presence of the object 18 (such as the wrist or the finger) in the second sensing layer 24 (S 206 ).
- the detection processing unit 32 A determines that the object 18 is present in the second sensing layer 24 based on a detection signal from the second detection unit 8 A.
- the determination unit 36 A determines that the single touch gesture is not performed (S 207 ).
- the determination unit 36 A determines the presence or absence of a specific non-contact operation performed by the object 18 based on the size of the object 18 in the first sensing layer 16 A and the presence or absence of the object 18 in the second sensing layer 24.
- when the size of the object 18 in the first sensing layer 16 A is smaller than or equal to the threshold and the object 18 is detected in the second sensing layer 24, the determination unit 36 A determines that a specific non-contact operation on the operation screen 10 is performed.
- when the size of the object 18 in the first sensing layer 16 A exceeds the threshold and the object 18 is detected in the second sensing layer 24, the determination unit 36 A determines that a specific non-contact operation on the operation screen 10 is not performed.
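- these two rules can be summarized as follows; a minimal sketch, in which the threshold value is chosen arbitrarily for illustration and does not come from the patent:

```python
FINGER_AREA_THRESHOLD_MM2 = 200.0  # assumed: roughly the area of one fingertip

def single_touch_performed(object_area_mm2: float,
                           object_in_second_layer: bool) -> bool:
    """Sketch of the Embodiment 2 determination (steps S 201 to S 207).

    A finger-sized object that reaches the second sensing layer is accepted;
    a larger object such as a wrist is rejected even if it reaches the layer.
    """
    if not object_in_second_layer:
        return False
    return object_area_mm2 <= FINGER_AREA_THRESHOLD_MM2
```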
- in Embodiment 2, as in Embodiment 1, it is possible to precisely determine the presence or absence of a specific non-contact operation on the operation screen 10, and it is possible to avoid an erroneous operation and the like of a device having the input device 2 A as a user interface.
- Embodiment 1 and Embodiment 2 of the disclosure have been described above, but the disclosure is not limited to these embodiments. For example, the above respective embodiments may be combined.
- the operation screen 10 is displayed on the display surface 12 of the display unit 4 including the liquid crystal display panel.
- the operation screen may be, for example, an aerial image (real image) formed on a display surface which is an aerial display region, or may be a projection image projected on a display surface on a screen by a projector.
- in the above embodiments, the first detection unit 6 ( 6 A) and the second detection unit 8 ( 8 A) are separately configured.
- these detection units may be integrally configured. That is, each function of the first detection unit 6 ( 6 A) and the second detection unit 8 ( 8 A) may be realized by one detection unit.
- in the above embodiments, the first sensing layer 16 ( 16 A) and the second sensing layer 24 are formed in the air.
- the disclosure is not limited thereto, and one or a plurality of third sensing layers may be further formed in the air between the second sensing layer 24 and the display surface 12 of the display unit 4 .
- in this case, the input device 2 ( 2 A) includes one or a plurality of third detection units that detects a passing state of the object 18 in one or a plurality of third sensing layers, and the determination unit 36 determines the presence or absence of a specific non-contact operation performed by the object 18 based on a detection result of each of the first detection unit 6 ( 6 A), the second detection unit 8 ( 8 A), and the one or plurality of third detection units.
- the determination unit 36 may determine that a specific non-contact operation by the object 18 is performed when the object 18 passes through the third sensing layer formed at a position closest to the display surface 12 of the display unit 4 .
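- the multi-layer variation can be sketched as a simple conjunction over the ordered layers; a hypothetical illustration of the rule described above (names and the list encoding are assumptions):

```python
def multilayer_operation_detected(passed_layers: list[bool]) -> bool:
    """Layers are ordered from the first sensing layer down to the third
    sensing layer closest to the display surface; the specific non-contact
    operation is judged performed only when the object has passed them all."""
    return all(passed_layers)

# e.g. first layer, second layer, one third layer nearest the screen
assert multilayer_operation_detected([True, True, True]) is True
assert multilayer_operation_detected([True, True, False]) is False
```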
- in the above embodiments, the object 18 is the finger of the user.
- the disclosure is not limited thereto, and the object 18 may be, for example, an indicator stick, etc.
- each of the above devices may be configured as a computer system including a microprocessor, a ROM (Read Only Memory), a RAM (Random Access Memory), a hard disk drive, a display unit, a keyboard, a mouse, etc.
- a computer program is stored in the RAM or the hard disk drive.
- Each device achieves a function thereof by the microprocessor operating according to the computer program.
- the computer program is configured by combining a plurality of instruction codes indicating instructions for the computer in order to achieve a predetermined function.
- a part or all of the components constituting each of the above devices may be configured as one system LSI (Large Scale Integration). The system LSI is an ultra-multifunctional LSI manufactured by integrating a plurality of components on a single chip.
- the system LSI is a computer system including a microprocessor, a ROM, a RAM, etc.
- a computer program is stored in the RAM.
- the system LSI achieves functions thereof by the microprocessor operating according to the computer program.
- the components configuring each of the aforementioned devices may include an IC card or a single module that can be attached to and detached from each device.
- the IC card or the module is a computer system that includes a microprocessor, a ROM, a RAM, etc.
- the IC card or the module may include the ultra-multifunctional LSI described above.
- the IC card or the module achieves a function thereof by the microprocessor operating according to a computer program. This IC card or this module may have tamper resistance.
- the disclosure may be the aforementioned methods. Further, the disclosure may be a computer program that realizes these methods by a computer, or may be a digital signal including the computer program.
- the computer program or the digital signal may be recorded in a computer-readable non-transitory recording medium, for example, a flexible disk, a hard disk, a CD-ROM, an MO, a DVD, a DVD-ROM, a DVD-RAM, a BD, a semiconductor memory, etc.
- the disclosure may be the digital signal recorded in these non-transitory recording media.
- the computer program or the digital signal may be transmitted via an electric communication line, a wireless or wired communication line, a network represented by the Internet, a data broadcast, etc.
- the disclosure may be a computer system including a microprocessor and a memory, in which the memory stores the computer program, and the microprocessor operates according to the computer program.
- another independent computer system may be used for implementation by recording the program or the digital signal in the non-transitory recording medium and transferring the program or the digital signal or by transferring the program or the digital signal via the network, etc.
- each component may be configured by dedicated hardware or may be realized by executing a software program suitable for each component.
- Each component may be realized by a program execution unit such as a CPU or a processor reading and executing a software program recorded in a recording medium such as a hard disk or a semiconductor memory.
- the input device of the disclosure can be applied as, for example, a user interface for operating a device.
- an input device for receiving a specific non-contact operation performed on an operation screen by an object, including: a display control unit that displays the operation screen on a display surface; a first detection unit that detects a passing state of the object in a first sensing layer in air formed to face the display surface; a second detection unit that detects a passing state of the object in a second sensing layer in air formed between the display surface and the first sensing layer, and a determination unit that determines presence or absence of the specific non-contact operation performed by the object based on respective detection results of the first detection unit and the second detection unit.
- the passing state of the object in each of the first sensing layer and the second sensing layer is different between the case in which a user intentionally performs the specific non-contact operation on the operation screen and the case in which a part of the body of the user or the like unintentionally passes through each of the first sensing layer and the second sensing layer.
- the determination unit can precisely determine the presence or absence of the specific non-contact operation on the operation screen. As a result, it is possible to avoid an erroneous operation and the like of a device having the input device as a user interface.
- the first detection unit may detect the number of passage places of the object in the first sensing layer, and the second detection unit may detect the number of passage places of the object in the second sensing layer.
- according to this configuration, the determination unit can determine the presence or absence of the specific non-contact operation performed by the object based on the number of passage places of the object in each of the first sensing layer and the second sensing layer.
- the specific non-contact operation may be a non-contact single touch gesture for a predetermined display on the operation screen made by the object, and when the object passes through the first sensing layer at one place and passes through the second sensing layer at one place, the determination unit determines that the single touch gesture is performed.
- the determination unit can determine that the single touch gesture is performed based on the number of passage places of the object in each of the first sensing layer and the second sensing layer.
- the specific non-contact operation may be a non-contact multi-touch gesture on the operation screen made by the object, and when the object passes through the first sensing layer at two places and passes through the second sensing layer at two places, the determination unit determines that the multi-touch gesture is performed.
- the determination unit can determine that the multi-touch gesture is performed based on the number of passage places of the object in each of the first sensing layer and the second sensing layer.
- further, when the object passes through the first sensing layer at three or more places, the determination unit may determine that the specific non-contact operation is not performed.
- the determination unit can determine that the specific non-contact operation is not performed based on the number of passage places of the object in each of the first sensing layer and the second sensing layer.
- further, when the object passes through the first sensing layer, the display control unit may display a cursor on the operation screen.
- since the display control unit displays the cursor on the operation screen when the object passes through the first sensing layer, the user can easily recognize that the object has passed through the first sensing layer.
- the first detection unit may detect a size of the object in the first sensing layer, and the second detection unit may detect presence or absence of the object in the second sensing layer.
- the determination unit may determine that the specific non-contact operation is performed when the size of the object in the first sensing layer is smaller than or equal to a threshold and the object is detected in the second sensing layer, and determine that the specific non-contact operation is not performed when the size of the object in the first sensing layer exceeds the threshold and the object is detected in the second sensing layer.
- the determination unit can determine the presence or absence of the specific non-contact operation performed by the object based on the size of the object in the first sensing layer and the presence or absence of the object in the second sensing layer.
- the input device may further include one or a plurality of third detection units that detects a passing state of the object in one or a plurality of third sensing layers in air formed between the display surface and the second sensing layer, and the determination unit determines the presence or absence of the specific non-contact operation performed by the object based on respective detection results of the first detection unit, the second detection unit, and the one or plurality of third detection units.
- the determination unit can more precisely determine the presence or absence of the specific non-contact operation on the operation screen.
- the disclosure can be realized as a program for causing a computer to function as a characteristic processing unit included in an input device or a program for causing a computer to execute characteristic steps included in an aerial image display method.
- such a program can be distributed via a computer-readable non-transitory recording medium such as a CD-ROM (Compact Disc-Read Only Memory) or via a communication network such as the Internet.
- according to the input device of the embodiments of the disclosure, it is possible to precisely determine the presence or absence of a specific non-contact operation on an operation screen.
Landscapes
- Engineering & Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- User Interface Of Digital Computer (AREA)
- Position Input By Displaying (AREA)
Claims (8)
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2019-070743 | 2019-04-02 | | |
| JP2019070743A (JP7400205B2) | 2019-04-02 | 2019-04-02 | Input device |
Publications (2)
| Publication Number | Publication Date |
|---|---|
| US20200319750A1 (en) | 2020-10-08 |
| US11068108B2 (en) | 2021-07-20 |
Family
ID=69779894
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US16/813,762 (US11068108B2, Expired - Fee Related) | Input device | 2019-04-02 | 2020-03-10 |
Country Status (4)
| Country | Link |
|---|---|
| US (1) | US11068108B2 (en) |
| EP (1) | EP3719614B1 (en) |
| JP (1) | JP7400205B2 (en) |
| CN (1) | CN111796694B (en) |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12095956B2 (en) | 2022-11-04 | 2024-09-17 | Canon Kabushiki Kaisha | Image forming apparatus using a non-contact operation for changing a power mode |
| US12170748B2 (en) | 2022-11-18 | 2024-12-17 | Canon Kabushiki Kaisha | Image forming apparatus |
Families Citing this family (8)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JP2022113036A (en) * | 2021-01-22 | 2022-08-03 | キヤノン株式会社 | Information processing equipment |
| FR3122751B1 (en) * | 2021-05-04 | 2023-10-06 | Imprimerie Nat | Detection kit for interactive terminal |
| JP7256842B2 (en) * | 2021-05-25 | 2023-04-12 | 株式会社エイサムテクノロジー | contactless input device |
| JP7694168B2 (en) * | 2021-06-07 | 2025-06-18 | ニプロ株式会社 | Input Devices and Medical Devices |
| US20250010178A1 (en) * | 2021-10-25 | 2025-01-09 | Sony Interactive Entertainment Inc. | Operating device |
| US11995359B2 (en) | 2021-12-14 | 2024-05-28 | Canon Kabushiki Kaisha | Image forming apparatus with touch and touchless input portion |
| JP2024101393A (en) * | 2023-01-17 | 2024-07-29 | Toppanホールディングス株式会社 | Aerial display devices and display devices |
| KR102875728B1 (en) * | 2023-04-28 | 2025-10-24 | 주식회사 켐트로닉스 | Icon selection method of contactless touch sensor |
Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140028557A1 (en) | 2011-05-16 | 2014-01-30 | Panasonic Corporation | Display device, display control method and display control program, and input device, input assistance method and program |
| JP2014067071A (en) | 2012-09-10 | 2014-04-17 | Askanet:Kk | Floating touch panel |
| US20140340343A1 (en) * | 2013-02-22 | 2014-11-20 | Samsung Electronics Co., Ltd. | Apparatus and method for recognizing proximity motion using sensors |
| US20170038861A1 (en) | 2014-05-29 | 2017-02-09 | Fuji Electric Co., Ltd | Optical operating input detection apparatus, automatic vending machine, and optical operating input detection method |
| US20170329458A1 (en) | 2014-12-08 | 2017-11-16 | Hitachi Maxell, Ltd. | Projection video display device and video display method |
Family Cites Families (12)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| JPH06110610A (en) * | 1992-09-30 | 1994-04-22 | Toshiba Corp | Coordinate input device |
| JP2008009759A (en) * | 2006-06-29 | 2008-01-17 | Toyota Motor Corp | Touch panel device |
| JP5802667B2 (en) * | 2010-07-20 | 2015-10-28 | Panasonic Intellectual Property Corporation of America | Gesture input device and gesture input method |
| JP5880024B2 (en) * | 2011-12-22 | 2016-03-08 | 株式会社バッファロー | Information processing apparatus and program |
| US10261612B2 (en) * | 2013-02-22 | 2019-04-16 | Samsung Electronics Co., Ltd. | Apparatus and method for recognizing proximity motion using sensors |
| US20150002475A1 (en) * | 2013-06-27 | 2015-01-01 | Industrial Technology Research Institute | Mobile device and method for controlling graphical user interface thereof |
| JP6359862B2 (en) * | 2014-04-17 | 2018-07-18 | シャープ株式会社 | Touch operation input device, touch operation input method, and program |
| KR101636460B1 (en) * | 2014-11-05 | 2016-07-05 | 삼성전자주식회사 | Electronic device and method for controlling the same |
| TWI546715B (en) * | 2014-12-26 | 2016-08-21 | 深圳市華星光電技術有限公司 | Floating touch method |
| CN105511675B (en) * | 2015-11-20 | 2020-07-24 | 重庆桔子科技发展有限公司 | Touch control method, user equipment, input processing method, mobile terminal and intelligent terminal |
| JP2017228216A (en) * | 2016-06-24 | 2017-12-28 | キヤノン株式会社 | Information processing apparatus, control method therefor, program, and storage medium |
| WO2018083737A1 (en) * | 2016-11-01 | 2018-05-11 | マクセル株式会社 | Display device and remote operation controller |
2019
- 2019-04-02: JP application JP2019070743A filed (granted as JP7400205B2, active)

2020
- 2020-02-26: CN application CN202010118480.7A filed (granted as CN111796694B, active)
- 2020-03-05: EP application EP20161297.5A filed (granted as EP3719614B1, active)
- 2020-03-10: US application US16/813,762 filed (granted as US11068108B2; expired - fee related)
Patent Citations (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US20140028557A1 (en) | 2011-05-16 | 2014-01-30 | Panasonic Corporation | Display device, display control method and display control program, and input device, input assistance method and program |
| JP2014067071A (en) | 2012-09-10 | 2014-04-17 | Askanet:Kk | Floating touch panel |
| US20140340343A1 (en) * | 2013-02-22 | 2014-11-20 | Samsung Electronics Co., Ltd. | Apparatus and method for recognizing proximity motion using sensors |
| US20170038861A1 (en) | 2014-05-29 | 2017-02-09 | Fuji Electric Co., Ltd | Optical operating input detection apparatus, automatic vending machine, and optical operating input detection method |
| US20170329458A1 (en) | 2014-12-08 | 2017-11-16 | Hitachi Maxell, Ltd. | Projection video display device and video display method |
Cited By (2)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US12095956B2 (en) | 2022-11-04 | 2024-09-17 | Canon Kabushiki Kaisha | Image forming apparatus using a non-contact operation for changing a power mode |
| US12170748B2 (en) | 2022-11-18 | 2024-12-17 | Canon Kabushiki Kaisha | Image forming apparatus |
Also Published As
| Publication number | Publication date |
|---|---|
| CN111796694B (en) | 2024-11-12 |
| JP7400205B2 (en) | 2023-12-19 |
| CN111796694A (en) | 2020-10-20 |
| US20200319750A1 (en) | 2020-10-08 |
| EP3719614A1 (en) | 2020-10-07 |
| JP2020170311A (en) | 2020-10-15 |
| EP3719614B1 (en) | 2022-12-21 |
Similar Documents
| Publication | Publication Date | Title |
|---|---|---|
| US11068108B2 (en) | | Input device |
| US11199963B2 (en) | | Non-contact operation input device |
| US9910527B2 (en) | | Interpretation of pressure based gesture |
| US8355887B1 (en) | | Proximity based gesturing devices, systems and methods |
| US9317130B2 (en) | | Visual feedback by identifying anatomical features of a hand |
| US20110205189A1 (en) | | Stereo Optical Sensors for Resolving Multi-Touch in a Touch Detection System |
| US20120218215A1 (en) | | Methods for Detecting and Tracking Touch Objects |
| US20140237422A1 (en) | | Interpretation of pressure based gesture |
| KR20100108116A (en) | | Apparatus and method for recognizing touch gesture |
| US10452205B2 (en) | | Three-dimensional touch device and method of providing the same |
| US20210026587A1 (en) | | Touch apparatus |
| AU2017203910A1 (en) | | Glove touch detection |
| US9235293B2 (en) | | Optical touch device and touch sensing method |
| TW201312422A (en) | | Optical touch-control system with track detecting function and method thereof |
| US9110588B2 (en) | | Optical touch device and method for detecting touch point |
| KR102169236B1 (en) | | Touchscreen device and method for controlling the same and display apparatus |
| US20240411408A1 (en) | | Input detection apparatus, input detection method, and recording medium storing input detection program |
| US20250321659A1 (en) | | Electronic device with touch control |
| KR20190049349A (en) | | Method for recognizing user's touch on projection image and apparatus for performing the method |
| KR101835952B1 (en) | | Apparatus and method for controlling scroll of screen |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | FEPP | Fee payment procedure | Entity status set to undiscounted (original event code: BIG.); entity status of patent owner: large entity |
| 2020-01-23 | AS | Assignment | Owner: FUNAI ELECTRIC CO., LTD., JAPAN. Assignment of assignors interest; assignor: MURAYAMA, MANABU. Reel/frame: 052118/0986 |
| | STPP | Information on status: patent application and granting procedure in general | Non final action mailed |
| | STPP | Information on status: patent application and granting procedure in general | Notice of allowance mailed -- application received in Office of Publications |
| | STPP | Information on status: patent application and granting procedure in general | Publications -- issue fee payment received |
| | STPP | Information on status: patent application and granting procedure in general | Publications -- issue fee payment verified |
| | STCF | Information on status: patent grant | Patented case |
| | FEPP | Fee payment procedure | Maintenance fee reminder mailed (original event code: REM.); entity status of patent owner: large entity |
| | LAPS | Lapse for failure to pay maintenance fees | Patent expired for failure to pay maintenance fees (original event code: EXP.); entity status of patent owner: large entity |
| | STCH | Information on status: patent discontinuation | Patent expired due to nonpayment of maintenance fees under 37 CFR 1.362 |
| 2025-07-20 | FP | Lapsed due to failure to pay maintenance fee | Effective date: 20250720 |
| 2025-09-13 | AS | Assignment | Owner: FUNAI ELECTRIC CO., LTD., JAPAN; assignment of assignors interest; assignor: FUNAI GROUP CO., LTD; reel/frame: 073121/0824. Owner: FEC IP LLC, TEXAS; assignment of assignors interest; assignor: FUNAI ELECTRIC CO., LTD. (f/k/a FE-TECH CO., LTD.); reel/frame: 073121/0883 |