US20240036678A1 - Input device and input device control method

Input device and input device control method

Info

Publication number
US20240036678A1
Authority
US
United States
Prior art keywords
image
aerial
input device
user
display region
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US18/016,925
Other languages
English (en)
Inventor
Kyosuke KUBOTA
Yosuke OGUCHI
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nidec Instruments Corp
Original Assignee
Nidec Sankyo Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nidec Sankyo Corp filed Critical Nidec Sankyo Corp
Priority to US18/016,925
Assigned to NIDEC SANKYO CORPORATION: ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: KUBOTA, KYOSUKE; OGUCHI, YOSUKE
Publication of US20240036678A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/56 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/70 Protecting specific internal or peripheral components, in which the protection of a component leads to protection of the entire computer
    • G06F21/82 Protecting input, output or interconnection devices
    • G06F21/83 Protecting input, output or interconnection devices input devices, e.g. keyboards, mice or controllers thereof
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/0416 Control or interface arrangements specially adapted for digitisers
    • G06F3/0418 Control or interface arrangements specially adapted for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/602 Providing cryptographic facilities or services
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F21/00 Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
    • G06F21/60 Protecting data
    • G06F21/62 Protecting access to data via a platform, e.g. using keys or access control rules
    • G06F21/6218 Protecting access to data via a platform, e.g. using keys or access control rules to a system of files or objects, e.g. local or distributed file system or database
    • G06F21/6245 Protecting personal data, e.g. for financial or medical purposes
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F3/0488 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F3/04886 Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/16 Human faces, e.g. facial parts, sketches or expressions
    • G06V40/172 Classification, e.g. identification
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00 Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/048 Indexing scheme relating to G06F3/048
    • G06F2203/04803 Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G PHYSICS
    • G06 COMPUTING OR CALCULATING; COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0428 Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means by sensing at the edges of the touch surface the interruption of optical paths, e.g. an illumination plane, parallel to the touch surface which may be virtual

Definitions

  • the present invention relates to an input device for users to input information using their fingertips.
  • the present invention also relates to a control method of such input devices.
  • the aerial-image display device includes an aerial-image forming mechanism and a display portion.
  • the PIN display and input portion includes a PIN display portion and a PIN input portion.
  • a keypad for inputting the PIN is displayed on the display portion.
  • the aerial-image forming mechanism projects the keypad displayed on the display portion into a space, forming it as an aerial image in the PIN display portion.
  • the PIN input portion includes a detection mechanism which detects an operation performed by the user on the aerial image of the keypad displayed in the PIN display portion.
  • the detection mechanism is, for example, an infrared sensor, a camera or the like that detects a position of the user's fingertip in a plane containing the aerial image of the keypad displayed on the PIN display portion.
  • the user can input the PIN by sequentially moving his or her fingertip to the positions of predetermined numbers on the aerial image of the keypad displayed on the PIN display portion.
  • an object of at least an embodiment of the present invention is to provide an input device in which a user inputs information by using an aerial image displayed in an aerial-image display region, and which can suppress input errors of information such as PINs.
  • a further object of at least an embodiment of the present invention is to provide a control method for such an input device, which can likewise suppress input errors of information such as PINs.
  • an input device of an aspect of the present invention is an input device for inputting information by using a user's fingertip. It includes a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form it as an aerial image, an optical detection mechanism which detects the position of the user's fingertip in an aerial-image display region (the region in which the aerial image is displayed), and a control portion for controlling the input device. The aerial-image display region serves as an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input in the input portion. Defining the recognition region as the part of the display region of a button image displayed in the aerial-image display region within which the control portion recognizes, on the basis of a detection result of the detection mechanism, that the image of the predetermined button was pointed at by the user, the recognition region is narrower than the display region.
  • the aerial-image display region in which the aerial image is displayed is an input portion for inputting information
  • the aerial image includes images of a plurality of buttons for identifying information input in the input portion.
  • the recognition region is narrower than the display region.
  • a center part of the display region is the recognition region.
  • the detection mechanism is, for example, a reflective sensor array.
  • an input device of another aspect of the present invention is an input device for inputting information by using a user's fingertip. It includes a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form it as an aerial image, and an optical detection mechanism which detects the position of the user's fingertip in an aerial-image display region (the region in which the aerial image is displayed). The aerial-image display region serves as an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input in the input portion. Taking two directions crossing each other in the plane containing the aerial image as a first direction and a second direction, the detection mechanism includes a first detection mechanism which detects the position of the user's fingertip in the first direction and a second detection mechanism which detects its position in the second direction when each of the plurality of button images is pointed at by the user's fingertip.
  • the aerial-image display region in which the aerial image is displayed is an input portion for inputting information
  • the aerial image includes images of a plurality of buttons for identifying information input in the input portion.
  • the detection mechanism includes a first detection mechanism detecting the position of the user's fingertip in the first direction and a second detection mechanism detecting the position of the user's fingertip in the second direction when each of the images of the plurality of buttons is pointed at by the user's fingertip.
  • on the basis of the detection result of the first detection mechanism and that of the second detection mechanism (that is, on the basis of the detected positions of the user's fingertip in the two directions), the input device can be made to recognize that the image of the predetermined button was pointed at only when the user's fingertip reliably pointed at the intended button image. Therefore, in the input device of this aspect, it becomes possible to suppress input errors of information such as PINs.
  • the first and second detection mechanisms are preferably transmissive sensors having a plurality of light emitting portions and a plurality of light receiving portions.
  • the optical axis of the light emitted from each light emitting portion preferably passes through the center of the image of the button displayed in the aerial-image display region.
  • the first direction and the second direction are orthogonal to each other, and images of the plurality of buttons displayed in the aerial-image display region are arranged in a matrix.
  • the aerial image is an image of a keypad containing an image of a plurality of numeric buttons.
  • a control method of an input device in a further aspect of the present invention is a control method of an input device including a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form it as an aerial image, and an optical detection mechanism which detects the position of a user's fingertip in the aerial-image display region (the region in which the aerial image is displayed), the aerial-image display region serving as an input portion for inputting information by using the user's fingertip, and the aerial image including images of a plurality of buttons for identifying the information input in the input portion. In this method, it is recognized on the basis of a detection result of the detection mechanism that the image of a button was pointed at when the user points at the recognition region, which is narrower than the display region, within the display region of the button image displayed in the aerial-image display region.
  • the aerial-image display region in which the aerial image is displayed is an input portion for inputting information
  • the aerial image includes images of a plurality of buttons for identifying the information input at the input portion.
  • in the control method of this aspect, when the user points at the recognition region, which is narrower than the display region within the display region of the image of the button displayed in the aerial-image display region, it is recognized on the basis of the detection result of the detection mechanism that the image of the button was pointed at.
  • by controlling the input device with the method of this aspect, the input device can be made to recognize that the image of the predetermined button was pointed at only when the user reliably pointed at the image of the intended button. Input errors of information such as PINs can therefore be suppressed.
  • FIG. 1 is a schematic diagram for explaining a configuration of an input device according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram for explaining a configuration of the input device shown in FIG. 1.
  • FIG. 3 is a schematic diagram for explaining a configuration of an aerial-image display device used in the input device shown in FIG. 1.
  • FIG. 4A is a diagram showing an example of an aerial image displayed in the aerial-image display region shown in FIG. 1.
  • FIG. 4B is a diagram for explaining a display region of button images and a recognition region shown in FIG. 4A.
  • FIG. 5 is a schematic diagram for explaining a configuration of an input device according to Embodiment 2 of the present invention.
  • FIG. 6 is a schematic diagram for explaining a configuration of a detection mechanism according to a variation of Embodiment 2.
  • FIG. 1 is a schematic diagram for explaining a configuration of an input device 1 according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram for explaining a configuration of the input device 1 shown in FIG. 1.
  • FIG. 3 is a schematic diagram for explaining a configuration of an aerial-image display device 3 used in the input device 1 shown in FIG. 1.
  • FIG. 4A is a diagram showing an example of an aerial image displayed in an aerial-image display region R shown in FIG. 1.
  • FIG. 4B is a diagram for explaining a display region DR of images of buttons 19 and a recognition region KR shown in FIG. 4A.
  • the input device 1 in this embodiment is a device for inputting information with a user's fingertip and is used, for example, in ATMs, authentication devices for credit-card and other payments, automatic ticketing machines, vending machines, and access control devices.
  • in the input device 1 of this embodiment, a PIN is input.
  • the input device 1 has an aerial-image display device 3 which displays an aerial image in a three-dimensional space, an optical detection mechanism 4 for detecting a position of the user's fingertip in the aerial-image display region R, which is a region in which the aerial image is displayed, and an enclosure 5 in which the aerial-image display device 3 and the detection mechanism 4 are accommodated.
  • the input device 1 includes a control portion 8 for controlling the input device 1 .
  • the aerial-image display device 3 has a display mechanism 11 having a display surface 11 a for displaying images, and an aerial-image forming mechanism 12 that projects the image displayed on the display surface 11 a into a space to form an image as an aerial image.
  • the display mechanism 11 and the aerial-image forming mechanism 12 are accommodated in the enclosure 5 .
  • the aerial-image forming mechanism 12 has a beam splitter 13 and a retroreflective material 14 .
  • a Y-direction in FIG. 3, which is orthogonal to an up-down direction (vertical direction), is referred to as a left-right direction, and a direction orthogonal to both the up-down direction and the left-right direction is referred to as a front-back direction.
  • an X1-direction side in FIG. 3, which is one side in the front-back direction, is assumed to be the "front" side
  • an X2-direction side in FIG. 3, which is the side opposite to that, is assumed to be the "back" side.
  • a user standing on a front side of the input device 1 performs a predetermined operation on a front surface side of the input device 1 .
  • the display mechanism 11 is, for example, a liquid crystal display or an organic EL (electroluminescent) display, and the display surface 11 a is a display screen.
  • the display surface 11 a faces diagonally forward and downward.
  • the beam splitter 13 is formed as a flat plate.
  • the beam splitter 13 is disposed on the front side of the display mechanism 11 .
  • the beam splitter 13 reflects a part of light emitted from the display surface 11 a . That is, a surface on one side of the beam splitter 13 is a reflective surface 13 a that reflects a part of the light emitted from the display surface 11 a .
  • the reflective surface 13 a faces diagonally rearward and downward.
  • a retroreflective material 14 is formed as a flat plate.
  • the retroreflective material 14 is disposed on a lower side of the display mechanism 11 and is disposed on a rear side of the beam splitter 13 .
  • the light reflected by the beam splitter 13 is incident on the retroreflective material 14.
  • the retroreflective material 14 reflects the incident light back along its incident direction, toward the beam splitter 13.
  • the surface on the one side of the retroreflective material 14 is a retroreflective surface 14 a, on which the light reflected by the beam splitter 13 is incident and which reflects that light back along its incident direction toward the beam splitter 13.
  • a quarter-wavelength plate is attached to the retroreflective surface 14 a .
  • the retroreflective surface 14 a faces diagonally forward and upward.
  • a part of the light emitted from the display surface 11 a of the display mechanism 11 is reflected by the reflective surface 13 a of the beam splitter 13 and enters the retroreflective surface 14 a of the retroreflective material 14 .
  • the light reflected by the reflective surface 13 a is directed diagonally rearward and downward.
  • the light incident to the retroreflective surface 14 a is reflected in the same direction as the incident direction of the light to the retroreflective surface 14 a .
  • the light reflected by the retroreflective surface 14 a goes diagonally forward and upward and passes through the beam splitter 13 .
  • an optical axis L1 of the light emitted from the display surface 11 a and an optical axis L2 of the light reflected by the beam splitter 13 are orthogonal.
  • the optical axis of the light reflected by the retroreflective material 14 matches the optical axis L2.
  • the light transmitted through the beam splitter 13 forms an aerial image in the aerial-image display region R.
  • the aerial-image display region R is disposed diagonally forward and upward of the beam splitter 13 .
  • the aerial image formed in the aerial-image display region R is recognized by a user standing in front of the input device 1 as an image inclined downward toward the front side (an illustrative geometry sketch follows).
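[Editor's illustration, not part of the patent disclosure] In a beam-splitter/retroreflector arrangement of the kind just described, the rays leaving a point on the display surface 11 a reconverge at the mirror image of that point across the plane of the reflective surface 13 a. This plane symmetry is a standard property of retroreflective aerial imaging; the disclosure's ray description implies it but does not state it explicitly, and it is what fixes where the aerial-image display region R lies. A minimal sketch of that geometry in Python (coordinates and units are illustrative assumptions):

    import numpy as np

    def mirror_across_plane(point, plane_point, normal):
        """Reflect `point` across the plane through `plane_point` with the
        given normal. The aerial image of a display point forms at this
        mirrored position on the viewer's side of the beam splitter."""
        point = np.asarray(point, dtype=float)
        plane_point = np.asarray(plane_point, dtype=float)
        n = np.asarray(normal, dtype=float)
        n = n / np.linalg.norm(n)
        return point - 2.0 * np.dot(point - plane_point, n) * n

    # Example: with the reflective surface 13 a taken as the plane z = 0,
    # a display point 30 mm behind it images 30 mm in front of it.
    print(mirror_across_plane([0.0, 0.0, -30.0], [0.0, 0.0, 0.0], [0.0, 0.0, 1.0]))
    # -> [ 0.  0. 30.]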
  • the enclosure 5 is formed, for example, in the shape of a cuboid.
  • the enclosure 5 includes a frame body 17 that surrounds the aerial-image display region R.
  • the frame body 17 is formed as a flat plate in the shape of a rectangular or square frame.
  • the frame body 17 constitutes a front upper surface of the enclosure 5.
  • the frame body 17, formed as a flat plate, is inclined downward toward the front side.
  • An inner peripheral side of the frame body 17 is an opening portion 17 a that leads to the inside of the enclosure 5.
  • the opening portion 17 a is formed in a rectangular or square shape.
  • the aerial-image display region R is formed in the opening portion 17 a .
  • the aerial-image display region R serves as an input portion 18 for the user to input information using the fingertips.
  • the PIN is input in the input portion 18 .
  • the detection mechanism 4 detects a position of the user's fingertip in the aerial-image display region R, as described above. In other words, the input portion 18 is included in a detection range of the detection mechanism 4 .
  • the detection mechanism 4 is a reflective sensor array. Specifically, the detection mechanism 4 is a reflective infrared sensor array including a plurality of light emitting portions that emit infrared light and a plurality of light receiving portions that receive the infrared light emitted from the light emitting portions and reflected by the user's fingertip. Moreover, the detection mechanism 4 is a line sensor in which the light emitting portion and the light receiving portion are arranged alternately and in a straight line. The detection mechanism 4 is disposed on the side of the opening portion 17 a .
  • the detection mechanism 4 detects the position of the user's fingertip in a plane containing the aerial image (that is, in the plane containing the input portion 18 ). Moreover, the detection mechanism 4 detects the position of the user's fingertip in the entire aerial-image display region R (that is, in the entire input portion 18 ).
  • the display mechanism 11 displays an image of a keypad for inputting the PIN on the display surface 11 a
  • the aerial-image forming mechanism 12 displays the keypad image displayed on the display surface 11 a as an aerial image in the aerial-image display region R (see FIG. 4A).
  • the aerial image displayed in the aerial-image display region R when the PIN is to be input is the image of the keypad.
  • the image of the keypad which is an aerial image, contains images of a plurality of buttons 19 for identifying the information (that is, PIN) to be input in the input portion 18 .
  • the images of the plurality of buttons 19 include images of a plurality of numeric buttons 19 .
  • the images of the plurality of buttons 19 include images of 10 numeric buttons 19 from “0” to “9”.
  • the images of the plurality of buttons 19 include images of five non-numeric buttons 19 .
  • the images of the plurality of buttons 19 are arranged in a matrix.
  • the images of the 15 buttons 19 are arranged in a matrix of 5 rows and 3 columns.
  • the user inputs the PIN by using the keypad image displayed in the aerial-image display region R. Specifically, the user inputs the PIN by sequentially moving the fingertip to a position of the images of the predetermined numeric buttons 19 displayed in the aerial-image display region R. In other words, the user inputs the PIN by sequentially pointing the fingertip at the image of the predetermined numeric button 19 in the input portion 18 (pointing operation).
  • the control portion 8 recognizes the image of the numeric button 19 pointed at in the pointing operation on the basis of the detection result of the detection mechanism 4 (that is, the detected position of the user's fingertip). Defining the recognition region KR as the part of the display region DR of a button 19 image displayed in the aerial-image display region R within which the control portion 8 recognizes, on the basis of the detection result of the detection mechanism 4, that the image of the predetermined button 19 was pointed at by the user's fingertip, the recognition region KR is narrower than the display region DR in this embodiment (see FIG. 4B).
  • the control portion 8 recognizes on the basis of the detection result of the detection mechanism 4 that the image of the predetermined button 19 was pointed at when the user's fingertip points at the recognition region KR, which is narrower than the display region DR.
  • the entire image of the button 19, surrounded by a solid-line circle, is the display region DR
  • the region of the button 19 image surrounded by a broken-line circle is the recognition region KR (see FIG. 4B).
  • the center part of the display region DR is the recognition region KR.
  • the control portion 8 recognizes that the image of the predetermined button 19 was pointed on the basis of the detection result of the detection mechanism 4 when the user pointed at the recognition region KR, which is the region surrounded by the broken-line circle, in the input portion 18 .
  • the recognition region KR is set by using application software installed in the control portion 8 .
  • when the control portion 8 recognizes on the basis of the detection result of the detection mechanism 4 that the image of a predetermined numeric button 19 was pointed at in the pointing operation, it transmits a control instruction to the display mechanism 11; the display mechanism 11 then displays, for example, a "*" mark in a part above the images of the plurality of buttons 19 on the display surface 11 a, and the aerial-image forming mechanism 12 displays the image with the "*" mark as an aerial image in the aerial-image display region R (see FIG. 4A and FIG. 4B; an illustrative input-loop sketch is given at the end of this section).
  • the aerial-image display region R is the input portion 18 for inputting PINs, and the aerial image includes images of the plurality of buttons 19 for identifying PINs input in the input portion 18 .
  • the recognition region KR is narrower than the display region DR of the image of the button 19, and the control portion 8 recognizes on the basis of the detection result of the detection mechanism 4 that the image of the button 19 was pointed at when the user points at the recognition region KR.
  • when the user's fingertip points at a part of the display region DR outside the recognition region KR, the control portion 8 does not recognize that the image of the predetermined button 19 was pointed at.
  • it therefore becomes possible for the control portion 8 to recognize that the image of a predetermined button 19 was pointed at by the user only when the user reliably pointed at the image of the intended button 19. In the input device 1 of this embodiment, input errors of PINs can thus be suppressed.
  • moreover, since the center part of the display region DR is the recognition region KR, the control portion 8 can more easily restrict recognition to cases where the user reliably pointed at the image of the intended button 19. Input errors of PINs can therefore be suppressed effectively (a minimal illustrative sketch of this hit test follows).
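[Editor's illustration, not part of the patent disclosure] A minimal sketch of the recognition-region hit test described above, in Python. It assumes circular display regions DR and recognition regions KR and a detection mechanism that reports fingertip coordinates in the plane of the aerial image; the names, the coordinate convention, and the KR/DR ratio are illustrative assumptions, not taken from the disclosure.

    import math
    from dataclasses import dataclass

    @dataclass
    class Button:
        label: str            # e.g. "7"
        cx: float             # center of the display region DR in the aerial-image plane
        cy: float
        display_r: float      # radius of the display region DR
        recognition_r: float  # radius of the recognition region KR (narrower than DR)

    def hit_test(buttons, x, y):
        """Return the button whose recognition region KR contains the detected
        fingertip position (x, y), or None if no KR contains it.

        A fingertip that lands inside DR but outside KR is deliberately not
        recognized; this is how the embodiment suppresses mistyped digits."""
        for b in buttons:
            if math.hypot(x - b.cx, y - b.cy) <= b.recognition_r:
                return b
        return None

How much narrower KR is than DR (for example, recognition_r = 0.5 * display_r) is a tuning choice; the disclosure only requires that KR be narrower than DR and, in this embodiment, centered within it.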
  • FIG. 5 is a schematic diagram for explaining a configuration of the input device 1 according to Embodiment 2 of the present invention.
  • the same symbols are given to configurations similar to those of Embodiment 1.
  • Embodiment 2 differs from Embodiment 1 in the configuration of the detection mechanism 4 and, consequently, in the method by which the control portion 8 recognizes that the image of the predetermined button 19 was pointed at by the user. In the following, the configuration of the input device 1 according to Embodiment 2 is explained mainly in terms of the differences from Embodiment 1.
  • the image of the keypad is displayed as an aerial image in the aerial-image display region R when the PIN is to be input in the input portion 18 .
  • the images of the 15 buttons 19 are arranged in a matrix of 5 rows and 3 columns.
  • the vertical direction (row direction; V direction in FIG. 5) of the images of the plurality of buttons 19 arranged in the matrix is assumed to be a first direction
  • the lateral direction (column direction; W direction in FIG. 5) of the images of the plurality of buttons 19 arranged in the matrix is assumed to be a second direction.
  • the first direction and the second direction are orthogonal to each other.
  • the detection mechanism 4 has a first detection mechanism 24 for detecting the position of the user's fingertip in the first direction and a second detection mechanism 25 for detecting the position of the user's fingertip in the second direction when each of the images of the plurality of buttons 19 is pointed at by the user's fingertip.
  • the detection mechanism 4 in this embodiment is constituted by the first detection mechanism 24 and the second detection mechanism 25 .
  • the first detection mechanism 24 is a transmissive sensor having a plurality of light emitting portions 26 and a plurality of light receiving portions 27 .
  • the second detection mechanism 25 is a transmissive sensor having a plurality of light emitting portions 28 and a plurality of light receiving portions 29 , similarly to the first detection mechanism 24 .
  • the light emitting portions 26 and 28 emit infrared light
  • the light receiving portions 27 and 29 receive the infrared light emitted from the light emitting portions 26 and 28 .
  • the first detection mechanism 24 and the second detection mechanism 25 are transmission-type infrared sensors.
  • the first detection mechanism 24 includes as many light emitting portions 26 and light receiving portions 27 as there are rows of the 15 buttons 19 arranged in the matrix. In other words, the first detection mechanism 24 includes five light emitting portions 26 and five light receiving portions 27, forming five pairs.
  • the five light emitting portions 26 are arranged in the first direction at a certain pitch, and the five light receiving portions 27 are arranged in the first direction at a certain pitch.
  • the light emitting portion 26 and the light receiving portion 27 forming a pair are disposed at the same position in the first direction. Moreover, the light emitting portion 26 and the light receiving portion 27 forming a pair are opposed and disposed so as to sandwich the aerial-image display region R in the second direction (that is, to sandwich the input portion 18 ).
  • the second detection mechanism 25 includes as many light emitting portions 28 and light receiving portions 29 as there are columns of the 15 buttons 19 arranged in the matrix. In other words, the second detection mechanism 25 includes three light emitting portions 28 and three light receiving portions 29, forming three pairs.
  • the three light emitting portions 28 are arranged in the second direction at a certain pitch, and the three light receiving portions 29 are arranged in the second direction at a certain pitch.
  • the light emitting portion 28 and the light receiving portion 29 forming a pair are disposed at the same position in the second direction.
  • the light emitting portion 28 and the light receiving portion 29 forming a pair are opposed and disposed so as to sandwich the aerial-image display region R in the first direction (that is, to sandwich the input portion 18 ).
  • the arrangement pitch of the light emitting portions 26 in the first direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the first direction displayed in the aerial-image display region R.
  • An optical axis L5 of the light emitted from the light emitting portion 26 (specifically, the infrared light) passes through the center of the image of the button 19 displayed in the aerial-image display region R. More specifically, when viewed from a direction orthogonal to the plane containing the aerial image, the optical axis L5 passes through the center of the image of the button 19 displayed in the aerial-image display region R.
  • the arrangement pitch of the light emitting portions 28 in the second direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the second direction displayed in the aerial-image display region R.
  • An optical axis L6 of the light emitted from the light emitting portion 28 (specifically, the infrared light) passes through the center of the image of the button 19 displayed in the aerial-image display region R. More specifically, when viewed from a direction orthogonal to the plane containing the aerial image, the optical axis L6 passes through the center of the image of the button 19 displayed in the aerial-image display region R.
  • the control portion 8 recognizes the image of the numeric button 19 that was pointed in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25 .
  • the detection mechanism 4 includes the first detection mechanism 24 for detecting the position of the user's fingertip in the first direction when each of the images of the plurality of buttons 19 is pointed by the user's fingertip and the second detection mechanism 25 for detecting the position of the user's fingertip in the second direction, and the control portion 8 recognizes the image of the numeric button 19 that was pointed in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25 .
  • the first detection mechanism 24 and the second detection mechanism 25 are transmissive sensors having a plurality of the light emitting portions 26 , 28 and a plurality of the light receiving portions 27 , 29 .
  • compared with a case in which the first detection mechanism 24 and the second detection mechanism 25 are reflective sensors, the position of the fingertip when the user's fingertip points at the image of the predetermined button 19 in the input portion 18 can be identified more accurately on the basis of the detection results of the first detection mechanism 24 and the second detection mechanism 25.
  • the optical axes L5 and L6 of the light emitted from the light emitting portions 26, 28 pass through the center of the image of the button 19 displayed in the aerial-image display region R (an illustrative decoding sketch follows).
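[Editor's illustration, not part of the patent disclosure] A rough sketch in Python of how the detection results of the two transmissive sensors of Embodiment 2 could be combined: one beam per row (first detection mechanism 24, five beams) and one beam per column (second detection mechanism 25, three beams), each optical axis passing through the button centers. The keypad layout and the rule of accepting only a single interrupted beam per axis are illustrative assumptions.

    # Hypothetical 5-row x 3-column layout; the disclosure fixes only the grid
    # size (10 numeric plus 5 non-numeric buttons), not the exact arrangement.
    KEYPAD = [
        ["1", "2", "3"],
        ["4", "5", "6"],
        ["7", "8", "9"],
        ["*", "0", "#"],
        ["CLEAR", "CANCEL", "ENTER"],
    ]

    def decode(row_blocked, col_blocked):
        """row_blocked / col_blocked: booleans per beam, True where the fingertip
        interrupts the infrared beam of the first (row) or second (column)
        transmissive sensor. A button is recognized only when exactly one row
        beam and exactly one column beam are interrupted at the same time."""
        rows = [i for i, blocked in enumerate(row_blocked) if blocked]
        cols = [j for j, blocked in enumerate(col_blocked) if blocked]
        if len(rows) == 1 and len(cols) == 1:
            return KEYPAD[rows[0]][cols[0]]
        return None  # no interruption, or an ambiguous one: register nothing

    # Example: the fingertip interrupts row beam 1 and column beam 2 -> "6".
    assert decode([False, True, False, False, False],
                  [False, False, True]) == "6"

Requiring exactly one interrupted beam per axis mirrors the point of this aspect: a fingertip hovering between two buttons interrupts two beams on one axis and produces no input instead of a wrong digit.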
  • FIG. 6 is a schematic diagram for explaining a configuration of the detection mechanism 4 according to a variation of Embodiment 2.
  • the same reference symbols are used to denote the components similar to those of the embodiment described above.
  • the first detection mechanism 24 may be a reflective infrared sensor having a plurality of light emitting portions 26 and a plurality of light receiving portions 27 .
  • the second detection mechanism 25 may be a reflective infrared sensor having a plurality of light emitting portions 28 and a plurality of light receiving portions 29 .
  • the first detection mechanism 24 includes the same number of the light emitting portions 26 and the light receiving portions 27 as the number of rows of 15 buttons 19 arranged in a matrix
  • the second detection mechanism 25 includes the same number of the light emitting portions 28 and the light receiving portions 29 as the number of columns of 15 buttons 19 arranged in a matrix.
  • the arrangement pitch of the light emitting portions 26 in the first direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the first direction displayed in the aerial-image display region R, and the optical axis L5 of the light emitted from the light emitting portion 26 passes through the center of the image of the button 19 displayed in the aerial-image display region R.
  • the arrangement pitch of the light emitting portions 28 in the second direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the second direction displayed in the aerial-image display region R, and the optical axis L6 of the light emitted from the light emitting portion 28 passes through the center of the image of the button 19 displayed in the aerial-image display region R.
  • the control portion 8 recognizes the image of the numeric button 19 that was pointed in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25 .
  • a part shifted from the center in the display region DR may be the recognition region KR.
  • in the embodiment described above, the recognition region KR is a circular region, but the recognition region KR may be a non-circular region such as an oval-shaped region or a polygonal region.
  • the optical axis L5 of the light emitted from the light emitting portion 26 may pass through a position shifted from the center of the image of the button 19 displayed in the aerial-image display region R
  • the optical axis L6 of the light emitted from the light emitting portion 28 may pass through a position shifted from the center of the image of the button 19 displayed in the aerial-image display region R.
  • the first direction and the second direction are orthogonal to each other in the plane containing the aerial image, but the first direction and the second direction do not have to be orthogonal.
  • the first detection mechanism 24 and the second detection mechanism 25 may be transmissive sensor arrays.
  • the image of the keypad displayed in the aerial-image display region R may be configured by images of 12 buttons 19 arranged in 4 rows and 3 columns, for example.
  • information other than the PIN may be input in the input device 1 .
  • the aerial image displayed in the aerial-image display region R may be an image other than the keypad. Even in this case, the aerial image contains an image of a button to identify the information input in the input portion 18 .
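[Editor's illustration, not part of the patent disclosure] Tying the pieces above together, a minimal sketch in Python of the PIN-entry loop with the masked "*" feedback of FIGS. 4A and 4B. Here detector and display are hypothetical stand-ins for the detection mechanism 4 and the display mechanism 11, hit_test is the sketch given earlier, and the polling interface and debounce delay are illustrative assumptions.

    import time

    def read_pin(detector, display, pin_length=4):
        """Poll the detection mechanism, map fingertip positions to buttons,
        and echo one '*' per accepted digit in the aerial image."""
        digits = []
        while len(digits) < pin_length:
            pos = detector.read_fingertip()   # (x, y) in the aerial-image plane, or None
            if pos is None:
                continue
            button = hit_test(display.buttons, *pos)
            if button is not None and button.label.isdigit():
                digits.append(button.label)
                display.show_mask("*" * len(digits))  # masked feedback above the keypad
                time.sleep(0.3)  # crude debounce: one touch registers once
        return "".join(digits)

A real device would also handle the non-numeric buttons (for correction or confirmation) and would hand the collected digits to the host system securely rather than returning them as plain text.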

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Hardware Design (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Human Computer Interaction (AREA)
  • Software Systems (AREA)
  • Bioethics (AREA)
  • Computer Security & Cryptography (AREA)
  • Medical Informatics (AREA)
  • Databases & Information Systems (AREA)
  • Optics & Photonics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Input From Keyboards Or The Like (AREA)
  • Image Input (AREA)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US18/016,925 US20240036678A1 (en) 2020-07-22 2021-05-31 Input device and input device control method

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US202063054799P 2020-07-22 2020-07-22
PCT/JP2021/020743 WO2022018972A1 (ja) 2020-07-22 2021-05-31 Input device and input device control method
US18/016,925 US20240036678A1 (en) 2020-07-22 2021-05-31 Input device and input device control method

Publications (1)

Publication Number Publication Date
US20240036678A1 (en) 2024-02-01

Family

ID=79728618

Family Applications (3)

Application Number Title Priority Date Filing Date
US18/016,925 Abandoned US20240036678A1 (en) 2020-07-22 2021-05-31 Input device and input device control method
US18/016,927 Active US12001031B2 (en) 2020-07-22 2021-05-31 Input device and information processing device
US18/016,785 Pending US20230288723A1 (en) 2020-07-22 2021-07-19 Input device

Family Applications After (2)

Application Number Title Priority Date Filing Date
US18/016,927 Active US12001031B2 (en) 2020-07-22 2021-05-31 Input device and information processing device
US18/016,785 Pending US20230288723A1 (en) 2020-07-22 2021-07-19 Input device

Country Status (3)

Country Link
US (3) US20240036678A1 (en)
JP (4) JPWO2022018973A1 (en)
WO (4) WO2022018972A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP7402265B2 (ja) * 2021-06-28 2023-12-20 Hitachi Channel Solutions Corp Information processing system
JP2023131250A (ja) 2022-03-09 2023-09-22 Alps Alpine Co Ltd Method for manufacturing optical element, optical element, aerial video display device, and spatial input device

Family Cites Families (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5949348A (en) * 1992-08-17 1999-09-07 Ncr Corporation Method and apparatus for variable keyboard display
JPH07210299A (ja) * 1994-01-20 1995-08-11 Sumitomo Wiring Syst Ltd Input position detection method for optical input device
JP2000181632A (ja) * 1998-12-17 2000-06-30 Funai Electric Co Ltd Touch input device for video equipment
JP2002236944A (ja) * 2001-02-08 2002-08-23 Hitachi Ltd Information providing terminal
JP4064647B2 (ja) * 2001-08-24 2008-03-19 Fujitsu Ltd Information processing device and input operation device
JP2004178584A (ja) * 2002-11-26 2004-06-24 Asulab Sa Method of inputting a security code by touch screen for access to a function, a device or a given location, and device for implementing the method
WO2012030153A2 (ko) * 2010-09-02 2012-03-08 L&Y Vision Co Ltd Non-contact input device
WO2012032842A1 (ja) * 2010-09-06 2012-03-15 Sharp Corp Display system and detection method
JP5602668B2 (ja) * 2011-03-24 2014-10-08 Nidec Sankyo Corp Medium processing device and flexible cable
US8978975B2 (en) * 2011-07-18 2015-03-17 Accullink, Inc. Systems and methods for authenticating near field communication financial transactions
US20130301830A1 (en) * 2012-05-08 2013-11-14 Hagai Bar-El Device, system, and method of secure entry and handling of passwords
CA2917478A1 (en) * 2013-07-10 2015-01-15 Real View Imaging Ltd. Three dimensional user interface
JP2015060296A (ja) * 2013-09-17 2015-03-30 Funai Electric Co Ltd Spatial coordinate specification device
JP6364994B2 (ja) * 2014-06-20 2018-08-01 Funai Electric Co Ltd Input device
US9531689B1 (en) * 2014-11-10 2016-12-27 The United States Of America As Represented By The Secretary Of The Navy System and method for encryption of network data
JP6927554B2 (ja) * 2015-12-07 2021-09-01 Utsunomiya University Display device
US12061742B2 (en) * 2016-06-28 2024-08-13 Nikon Corporation Display device and control device
US11635827B2 (en) * 2016-06-28 2023-04-25 Nikon Corporation Control device, display device, program, and detection method
WO2018003860A1 (ja) * 2016-06-28 2018-01-04 Nikon Corp Display device, program, display method and control device
JP6725371B2 (ja) * 2016-09-07 2020-07-15 Sharp Corp Display device, display system, and display method
EP3319069B1 (en) * 2016-11-02 2019-05-01 Skeyecode Method for authenticating a user by means of a non-secure terminal
JP6913494B2 (ja) * 2017-03-30 2021-08-04 Nidec Sankyo Corp Flexible printed circuit board and card reader
JP6974032B2 (ja) * 2017-05-24 2021-12-01 Sharp Corp Image display device, image forming device, control program and control method
JP2019109636A (ja) * 2017-12-18 2019-07-04 Konica Minolta Inc Non-contact input device
JP2019133284A (ja) * 2018-01-30 2019-08-08 Konica Minolta Inc Non-contact input device
JP7128720B2 (ja) * 2018-10-25 2022-08-31 Hitachi Channel Solutions Corp Input/output device and automatic transaction device
JP7164405B2 (ja) * 2018-11-07 2022-11-01 Hitachi Channel Solutions Corp Image reading device and method
JP7173895B2 (ja) 2019-02-22 2022-11-16 Hitachi Channel Solutions Corp Aerial image display device, transaction device, and aerial image formation control method in aerial image display device
JP7251301B2 (ja) * 2019-05-10 2023-04-04 Kyocera Document Solutions Inc Image processing system, image processing method, and image forming device

Also Published As

Publication number Publication date
WO2022019279A1 (ja) 2022-01-27
US20230351804A1 (en) 2023-11-02
WO2022018972A1 (ja) 2022-01-27
US12001031B2 (en) 2024-06-04
WO2022018971A1 (ja) 2022-01-27
JPWO2022018972A1 (en) 2022-01-27
JPWO2022018973A1 (en) 2022-01-27
WO2022018973A1 (ja) 2022-01-27
JPWO2022019279A1 (en) 2022-01-27
US20230288723A1 (en) 2023-09-14
JPWO2022018971A1 (en) 2022-01-27

Similar Documents

Publication Publication Date Title
US11048900B2 (en) Image reading device and method
CN107077003B (zh) Optical device
EP3467703A1 (en) Display device and fingerprint identification method thereof
US20240036678A1 (en) Input device and input device control method
CN107111383B (zh) Non-contact input device and method
EP3731135B1 (en) Method and device for fingerprint recognition, and terminal device
JPH11506237A (ja) Light pen input system
JP2025010380A (ja) Aerial image display input device and aerial image display input method
EP3754540B1 (en) Optical fingerprint recognition apparatus, electronic device and fingerprint recognition method
JP6182830B2 (ja) Electronic device
US9063616B2 (en) Optical touch device with symmetric light sources and locating method thereof
WO2022019280A1 (ja) Input device and input device control method
EP4071526A1 (en) Display device
CN111597912A (zh) Display module and electronic device having the display module
US20060164387A1 (en) Input apparatus and touch-reading character/symbol input method
US20230244347A1 (en) Display operation section and device
EP4134730B1 (en) Display device and spatial input device including the same
JP2023168064A (ja) Aerial operation device
JP2017142577A (ja) Non-contact display input device and method
WO2018143313A1 (ja) Wearable electronic device
CN111095271B (zh) Fingerprint detection device, display screen and electronic device
KR20170021665A (ko) Display device having optical touchscreen function
JP2009301250A (ja) Device and method for controlling multiple pointed positions of an input device, and recording medium therefor
US20230316799A1 (en) Input device
JP2022122693A (ja) Gaming machine

Legal Events

Date Code Title Description
AS Assignment

Owner name: NIDEC SANKYO CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUBOTA, KYOSUKE;OGUCHI, YOSUKE;REEL/FRAME:062423/0053

Effective date: 20230113

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION