US20240036678A1 - Input device and input device control method - Google Patents


Info

Publication number
US20240036678A1
Authority
US
United States
Prior art keywords
image
aerial
input device
user
display region
Legal status
Pending
Application number
US18/016,925
Inventor
Kyosuke KUBOTA
Yosuke OGUCHI
Current Assignee
Nidec Sankyo Corp
Original Assignee
Nidec Sankyo Corp
Application filed by Nidec Sankyo Corp
Priority to US18/016,925
Assigned to NIDEC SANKYO CORPORATION. Assignors: KUBOTA, KYOSUKE; OGUCHI, YOSUKE
Publication of US20240036678A1

Classifications

    • G02B 30/56: Optical systems or apparatus for producing three-dimensional [3D] effects, the image being built up from image elements distributed over a 3D volume, by projecting aerial or floating images
    • G06F 21/83: Security arrangements protecting input devices, e.g. keyboards, mice or controllers thereof
    • G06F 3/0418: Control or interface arrangements specially adapted for digitisers, for error correction or compensation, e.g. based on parallax, calibration or alignment
    • G06F 21/602: Providing cryptographic facilities or services
    • G06F 21/6245: Protecting personal data, e.g. for financial or medical purposes
    • G06F 3/041: Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F 3/042: Digitisers characterised by opto-electronic transducing means
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04886: Interaction techniques using a touch-screen or digitiser, by partitioning the display area of the touch-screen or the surface of the digitising tablet into independently controllable areas, e.g. virtual keyboards or menus
    • G06V 40/172: Recognition of human faces: classification, e.g. identification
    • G06F 2203/04803: Split screen, i.e. subdividing the display area or the window area into separate subareas
    • G06F 3/0428: Digitisers sensing, at the edges of the touch surface, the interruption of optical paths, e.g. an illumination plane parallel to the touch surface, which may be virtual

Definitions

  • the present invention relates to an input device for users to input information using their fingertips.
  • the present invention also relates to a control method of such input devices.
  • the aerial-image display device includes an aerial-image forming mechanism and a display portion.
  • the PIN display and input portion includes a PIN display portion and a PIN input portion.
  • a keypad for inputting the PIN is displayed on the display portion.
  • the aerial-image forming mechanism projects the keypad shown on the display portion into space, forming it as an aerial image in the PIN display portion.
  • the PIN input portion includes a detection mechanism which detects an operation performed by the user on the aerial image of the keypad displayed in the PIN display portion.
  • the detection mechanism is, for example, an infrared sensor, a camera or the like that detects a position of the user's fingertip in a plane containing the aerial image of the keypad displayed on the PIN display portion.
  • the user can input the PIN by sequentially pointing a fingertip at the predetermined numbers on the aerial image of the keypad displayed in the PIN display portion.
  • an object of at least an embodiment of the present invention is to provide an input device, and a control method for such an input device, with which a user inputs information by using an aerial image displayed in an aerial-image display region, and which can suppress input errors of information such as PINs.
  • an input device of an aspect of the present invention is an input device for inputting information with a user's fingertip. It includes a display mechanism having a display surface which displays an image; an aerial-image forming mechanism which projects the image displayed on the display surface into space to form it as an aerial image; an optical detection mechanism which detects the position of the user's fingertip in the aerial-image display region, that is, the region in which the aerial image is displayed; and a control portion for controlling the input device. The aerial-image display region is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input in the input portion. Defining the recognition region as the part of a button image's display region in which, on the basis of the detection result of the detection mechanism, the control portion recognizes that the image of a predetermined button was pointed at by the user, the recognition region is narrower than the display region.
  • the aerial-image display region in which the aerial image is displayed is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input in the input portion.
  • the recognition region is narrower than the display region.
  • a center part of the display region is the recognition region.
  • the detection mechanism is, for example, a reflective sensor array.
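As a concrete illustration of the narrowed recognition region, the following sketch models each button's display region DR as a circle and accepts a fingertip position only inside a smaller concentric recognition region KR. The circular model, the 0.6 radius ratio, and all names are illustrative assumptions, not taken from the patent.

```python
from dataclasses import dataclass
import math

@dataclass
class Button:
    label: str
    cx: float                      # button-image center in the aerial-image plane
    cy: float
    display_radius: float          # radius of the display region DR
    recognition_ratio: float = 0.6 # KR radius as a fraction of the DR radius (assumed)

    def hit(self, x: float, y: float) -> bool:
        """True only if the fingertip lies inside the recognition region KR."""
        r = math.hypot(x - self.cx, y - self.cy)
        return r <= self.display_radius * self.recognition_ratio


def recognize(buttons, x, y):
    """Return the label of the button whose KR contains the fingertip, or None.

    A fingertip inside DR but outside KR is deliberately ignored, which is
    the mechanism the patent uses to suppress ambiguous input.
    """
    for b in buttons:
        if b.hit(x, y):
            return b.label
    return None
```

A point near a button's edge therefore registers nothing, while a point near the center registers exactly one button.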
  • an input device of another aspect of the present invention is an input device for inputting information with a user's fingertip. It includes a display mechanism having a display surface which displays an image; an aerial-image forming mechanism which projects the image displayed on the display surface into space to form it as an aerial image; and an optical detection mechanism which detects the position of the user's fingertip in the aerial-image display region, that is, the region in which the aerial image is displayed. The aerial-image display region serves as an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input in the input portion. Taking two directions crossing each other in the plane containing the aerial image as a first direction and a second direction, the detection mechanism includes a first detection mechanism which detects the position of the user's fingertip in the first direction and a second detection mechanism which detects its position in the second direction when each of the button images is pointed at by the user's fingertip.
  • the aerial-image display region in which the aerial image is displayed is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input in the input portion.
  • the detection mechanism includes a first detection mechanism detecting a position in the first direction of the user's fingertip and a second detection mechanism detecting a position in the second direction of the user's fingertip, when each of images of the plurality of buttons was pointed by the user's fingertip.
  • on the basis of the detection results of the first and second detection mechanisms (that is, the detected positions of the user's fingertip in the two directions), the input device can be made to recognize that the image of a predetermined button was pointed at only when the user's fingertip reliably pointed at the intended button image. Therefore, in the input device of this aspect, input errors of information such as PINs can be suppressed.
  • the first and second detection mechanisms are preferably transmissive sensors having a plurality of light emitting portions and a plurality of light receiving portions.
  • the optical axes of the light emitted from the light emitting portions preferably pass through the centers of the button images displayed in the aerial-image display region.
  • the first direction and the second direction are orthogonal to each other, and images of the plurality of buttons displayed in the aerial-image display region are arranged in a matrix.
  • the aerial image is an image of a keypad containing an image of a plurality of numeric buttons.
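The two-direction detection scheme can be sketched as follows, assuming each transmissive sensor simply reports the indices of its interrupted beams and that the beams are aligned with the button centers, as in the preferred arrangement above. The keypad layout and function names are hypothetical.

```python
# Rows and columns follow the matrix arrangement of button images; the
# non-numeric buttons are shown as placeholder labels.
KEYPAD = [
    ["1", "2", "3"],
    ["4", "5", "6"],
    ["7", "8", "9"],
    ["A", "0", "B"],
    ["C", "D", "E"],
]

def decode(row_beams, col_beams):
    """Map interrupted-beam indices to a button label.

    row_beams / col_beams: sets of interrupted beam indices reported by the
    two detection mechanisms. A button is recognized only when exactly one
    beam is interrupted in each direction, i.e. the fingertip unambiguously
    covers one button center; anything else is rejected as ambiguous.
    """
    if len(row_beams) != 1 or len(col_beams) != 1:
        return None
    r = next(iter(row_beams))
    c = next(iter(col_beams))
    if 0 <= r < len(KEYPAD) and 0 <= c < len(KEYPAD[0]):
        return KEYPAD[r][c]
    return None
```

Rejecting multi-beam interruptions is one way to realize the "only when the fingertip reliably pointed at the intended button" behavior described above.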
  • a control method of an input device in a further aspect of the present invention is a control method for an input device including a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into space to form it as an aerial image, and an optical detection mechanism which detects the position of a user's fingertip in the aerial-image display region, that is, the region in which the aerial image is displayed, the aerial-image display region serving as an input portion for inputting information with the user's fingertip, and the aerial image including images of a plurality of buttons for identifying the information input in the input portion. In this method, it is recognized, on the basis of the detection result of the detection mechanism, that a button image was pointed at when the user points at the recognition region, which is narrower than the display region of the button image displayed in the aerial-image display region.
  • the aerial-image display region in which the aerial image is displayed is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input at the input portion.
  • in the control method of this aspect, when the user points at the recognition region, which is narrower than the display region of the button image displayed in the aerial-image display region, it is recognized on the basis of the detection result of the detection mechanism that the button image was pointed at.
  • by this control method, the input device can be made to recognize that the image of a predetermined button was pointed at only when the user reliably pointed at the intended button image. Therefore, controlling the input device by this method makes it possible to suppress input errors of information such as PINs.
  • FIG. 1 is a schematic diagram for explaining a configuration of an input device according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram for explaining a configuration of the input device shown in FIG. 1 .
  • FIG. 3 is a schematic diagram for explaining a configuration of an aerial-image display device used in the input device shown in FIG. 1 .
  • FIG. 4 A is a diagram showing an example of an aerial image displayed in the aerial-image display region shown in FIG. 1 .
  • FIG. 4 B is a diagram for explaining a display region of button images and a recognition region shown in FIG. 4 A .
  • FIG. 5 is a schematic diagram for explaining a configuration of an input device according to Embodiment 2 of the present invention.
  • FIG. 6 is a schematic diagram for explaining a configuration of a detection mechanism according to a variation of Embodiment 2.
  • FIG. 1 is a schematic diagram for explaining a configuration of an input device 1 according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram for explaining a configuration of the input device 1 shown in FIG. 1 .
  • FIG. 3 is a schematic diagram for explaining a configuration of an aerial-image display device 3 used in the input device 1 shown in FIG. 1 .
  • FIG. 4 A is a diagram showing an example of an aerial image displayed in an aerial-image display region R shown in FIG. 1 .
  • FIG. 4 B is a diagram for explaining a display region DR of images of buttons 19 and a recognition region KR shown in FIG. 4 A .
  • the input device 1 in this embodiment is a device for inputting information with a user's fingertip and is used in ATMs, authentication devices for credit card and other payments, automatic ticketing machines, vending machines, or access control devices, for example.
  • in this embodiment, a PIN is input.
  • the input device 1 has an aerial-image display device 3 which displays an aerial image in a three-dimensional space, an optical detection mechanism 4 for detecting a position of the user's fingertip in the aerial-image display region R, which is a region in which the aerial image is displayed, and an enclosure 5 in which the aerial-image display device 3 and the detection mechanism 4 are accommodated.
  • the input device 1 includes a control portion 8 for controlling the input device 1 .
  • the aerial-image display device 3 has a display mechanism 11 having a display surface 11 a for displaying images, and an aerial-image forming mechanism 12 that projects the image displayed on the display surface 11 a into a space to form an image as an aerial image.
  • the display mechanism 11 and the aerial-image forming mechanism 12 are accommodated in the enclosure 5 .
  • the aerial-image forming mechanism 12 has a beam splitter 13 and a retroreflective material 14 .
  • the Y-direction in FIG. 3 , which is orthogonal to the up-down direction (vertical direction), is referred to as the left-right direction, and the direction orthogonal to both the up-down direction and the left-right direction is referred to as the front-back direction.
  • the X1-direction side in FIG. 3 , which is one side in the front-back direction, is assumed to be the “front” side, and the opposite X2-direction side is assumed to be the “back” side.
  • a user standing on a front side of the input device 1 performs a predetermined operation on a front surface side of the input device 1 .
  • the display mechanism 11 is, for example, a liquid crystal display or an organic EL (electroluminescent) display, and the display surface 11 a is a display screen.
  • the display surface 11 a faces diagonally forward and downward.
  • the beam splitter 13 is formed as a flat plate.
  • the beam splitter 13 is disposed on the front side of the display mechanism 11 .
  • the beam splitter 13 reflects a part of light emitted from the display surface 11 a . That is, a surface on one side of the beam splitter 13 is a reflective surface 13 a that reflects a part of the light emitted from the display surface 11 a .
  • the reflective surface 13 a faces diagonally rearward and downward.
  • the retroreflective material 14 is formed as a flat plate.
  • the retroreflective material 14 is disposed on a lower side of the display mechanism 11 and is disposed on a rear side of the beam splitter 13 .
  • the light reflected by the beam splitter 13 is incident on the retroreflective material 14 .
  • the retroreflective material 14 reflects the incident light in the same direction as the incident direction toward the beam splitter 13 .
  • the surface on one side of the retroreflective material 14 is a retroreflective surface 14 a ; the light reflected by the beam splitter 13 is incident on it, and it reflects that light back toward the beam splitter 13 along the incident direction.
  • a quarter-wavelength plate is attached to the retroreflective surface 14 a .
  • the retroreflective surface 14 a faces diagonally forward and upward.
  • a part of the light emitted from the display surface 11 a of the display mechanism 11 is reflected by the reflective surface 13 a of the beam splitter 13 and enters the retroreflective surface 14 a of the retroreflective material 14 .
  • the light reflected by the reflective surface 13 a is directed diagonally rearward and downward.
  • the light incident to the retroreflective surface 14 a is reflected in the same direction as the incident direction of the light to the retroreflective surface 14 a .
  • the light reflected by the retroreflective surface 14 a goes diagonally forward and upward and passes through the beam splitter 13 .
  • an optical axis L1 of the light emitted from the display surface 11 a and an optical axis L2 of the light reflected by the beam splitter 13 are orthogonal.
  • the optical axis of the light reflected by the retroreflective material 14 matches the optical axis L2.
  • the light transmitted through the beam splitter 13 forms an aerial image in the aerial-image display region R.
  • the aerial-image display region R is disposed diagonally forward and upward of the beam splitter 13 .
  • the aerial image formed in the aerial-image display region R is recognized by a user standing in front of the input device 1 as an image that is inclined downward toward the front side.
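The light path described above corresponds to a standard retro-reflective aerial-imaging arrangement, in which the aerial image forms at the position of the display surface mirrored across the plane of the beam splitter 13. As a geometry sketch under that assumption (it is not stated explicitly in the patent text), reflecting display-surface points across the beam-splitter plane predicts where the aerial-image display region R appears:

```python
import numpy as np

def mirror_across_plane(p, plane_point, plane_normal):
    """Reflect point p across the plane through plane_point with normal
    plane_normal (the normal need not be unit length)."""
    n = np.asarray(plane_normal, dtype=float)
    n = n / np.linalg.norm(n)
    p = np.asarray(p, dtype=float)
    # Signed distance from the plane, then step twice that distance back.
    d = np.dot(p - np.asarray(plane_point, dtype=float), n)
    return p - 2.0 * d * n
```

Applying this to the corners of the display surface 11 a, with the beam splitter's reflective surface 13 a as the mirror plane, gives the expected position and tilt of the aerial image in region R.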
  • the enclosure 5 is formed, for example, in the shape of a cuboid.
  • the enclosure 5 includes a frame body 17 that surrounds the aerial-image display region R.
  • the frame body 17 is formed as a flat plate having a rectangular or square frame shape.
  • the frame body 17 constitutes a front upper surface of the enclosure 5 .
  • the frame body 17 , which is formed as a flat plate, is inclined downward toward the front side.
  • An inner peripheral side of the frame body 17 is an opening portion 17 a that leads to an inside of the enclosure 5 .
  • the opening portion 17 a is formed in a rectangular or square shape.
  • the aerial-image display region R is formed in the opening portion 17 a .
  • the aerial-image display region R serves as an input portion 18 for the user to input information using the fingertips.
  • the PIN is input in the input portion 18 .
  • the detection mechanism 4 detects a position of the user's fingertip in the aerial-image display region R, as described above. In other words, the input portion 18 is included in a detection range of the detection mechanism 4 .
  • the detection mechanism 4 is a reflective sensor array. Specifically, the detection mechanism 4 is a reflective infrared sensor array including a plurality of light emitting portions that emit infrared light and a plurality of light receiving portions that receive the infrared light emitted from the light emitting portions and reflected by the user's fingertip. Moreover, the detection mechanism 4 is a line sensor in which the light emitting portion and the light receiving portion are arranged alternately and in a straight line. The detection mechanism 4 is disposed on the side of the opening portion 17 a .
  • the detection mechanism 4 detects the position of the user's fingertip in a plane containing the aerial image (that is, in the plane containing the input portion 18 ). Moreover, the detection mechanism 4 detects the position of the user's fingertip in the entire aerial-image display region R (that is, in the entire input portion 18 ).
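One plausible way to turn the line sensor's raw readings into a fingertip coordinate is an intensity-weighted centroid over the light-receiving elements; the element pitch, threshold, units, and names below are illustrative assumptions rather than values from the patent.

```python
SENSOR_PITCH_MM = 5.0  # assumed spacing between adjacent receiving elements
THRESHOLD = 0.2        # assumed minimum normalized intensity to count as a hit

def fingertip_position(intensities):
    """Estimate the fingertip's coordinate along the sensor array, in mm.

    intensities: normalized reflected-light readings, one per receiving
    element of the reflective infrared line sensor. Returns None when no
    element sees enough reflected light (no fingertip present).
    """
    hits = [(i, v) for i, v in enumerate(intensities) if v >= THRESHOLD]
    if not hits:
        return None
    total = sum(v for _, v in hits)
    centroid = sum(i * v for i, v in hits) / total
    return centroid * SENSOR_PITCH_MM
```

The centroid interpolates between elements, so the estimated position can fall between two sensor elements rather than snapping to one.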
  • the display mechanism 11 displays an image of a keypad for inputting the PIN on the display surface 11 a
  • the aerial-image forming mechanism 12 displays the keypad image displayed on the display surface 11 a as an aerial image in the aerial-image display region R (see FIG. 4 A ).
  • the aerial image displayed in the aerial-image display region R when the PIN is to be input is the image of the keypad.
  • the image of the keypad, which is an aerial image, contains images of a plurality of buttons 19 for identifying the information (that is, the PIN) to be input in the input portion 18 .
  • the images of the plurality of buttons 19 include images of a plurality of numeric buttons 19 .
  • the images of the plurality of buttons 19 include images of 10 numeric buttons 19 from “0” to “9”.
  • the images of the plurality of buttons 19 include images of five non-numeric buttons 19 .
  • the images of the plurality of buttons 19 are arranged in a matrix.
  • the images of the 15 buttons 19 are arranged in a matrix of 5 rows and 3 columns.
  • the user inputs the PIN by using the keypad image displayed in the aerial-image display region R. Specifically, the user inputs the PIN by sequentially moving the fingertip to a position of the images of the predetermined numeric buttons 19 displayed in the aerial-image display region R. In other words, the user inputs the PIN by sequentially pointing the fingertip at the image of the predetermined numeric button 19 in the input portion 18 (pointing operation).
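The pointing operation described above can be sketched as a small state machine that collects recognized digits and produces the masked string shown in the aerial image. The release-before-next-digit rule and all names are assumptions for illustration, not taken from the patent.

```python
class PinEntry:
    """Collects digits from successive pointing operations."""

    def __init__(self, length=4):
        self.length = length
        self.digits = []
        self._held = False  # fingertip currently inside some recognition region

    def update(self, recognized):
        """Feed one per-frame recognition result (a digit string or None).

        A new digit is accepted only after the fingertip has left all
        recognition regions, so holding a finger in place does not repeat
        the digit. Returns the masked string for the aerial display.
        """
        if recognized is None:
            self._held = False
        elif not self._held:
            self._held = True
            if len(self.digits) < self.length:
                self.digits.append(recognized)
        return "*" * len(self.digits)

    def complete(self):
        return len(self.digits) == self.length
```

Each accepted digit grows the masked "*" string, matching the per-digit feedback described for the display mechanism 11.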
  • the control portion 8 recognizes which numeric-button image 19 was pointed at in the pointing operation on the basis of the detection result of the detection mechanism 4 (that is, the detected position of the user's fingertip). Defining the recognition region KR as the part of the display region DR of a button image 19 displayed in the aerial-image display region R in which the control portion 8 recognizes, from the detection result of the detection mechanism 4 , that the button image 19 was pointed at by the user's fingertip, the recognition region KR is narrower than the display region DR in this embodiment (see FIG. 4 B ).
  • the control portion 8 recognizes, on the basis of the detection result of the detection mechanism 4 , that the image of the predetermined button 19 was pointed at when the user's fingertip points at the recognition region KR, which is narrower than the display region DR.
  • the entire image of the button 19 surrounded by a solid-line circle (round) is the display region DR
  • a region surrounded by a broken-line circle (round) in the image of the button 19 is the recognition region KR (see FIG. 4 B ).
  • the center part of the display region DR is the recognition region KR.
  • the control portion 8 recognizes that the image of the predetermined button 19 was pointed at, on the basis of the detection result of the detection mechanism 4 , when the user points at the recognition region KR, which is the region surrounded by the broken-line circle, in the input portion 18 .
  • the recognition region KR is set by using application software installed in the control portion 8 .
  • when the control portion 8 recognizes, on the basis of the detection result of the detection mechanism 4 , that the image of a predetermined numeric button 19 was pointed at in the pointing operation, it transmits a control instruction to the display mechanism 11 ; the display mechanism 11 then displays, for example, a “*” mark in the part above the images of the plurality of buttons 19 on the display surface 11 a , and the aerial-image forming mechanism 12 displays the image with the “*” mark as an aerial image in the aerial-image display region R (see FIG. 4 A and FIG. 4 B ).
  • the aerial-image display region R is the input portion 18 for inputting PINs, and the aerial image includes images of the plurality of buttons 19 for identifying PINs input in the input portion 18 .
  • the recognition region KR is narrower than the display region DR of the image of the button 19 , and the control portion 8 recognizes that the image of the button 19 was pointed on the basis of the detection result of the detection mechanism 4 when the user pointed at the recognition region KR, which is narrower than the display region DR.
  • the control portion 8 does not recognize that the image of a predetermined button 19 was pointed.
  • it becomes possible for the control portion 8 to recognize that the image of a predetermined button 19 was pointed by the user only when the user reliably pointed at the image of the intended button 19 . Therefore, in the input device 1 of this embodiment, it becomes possible to suppress input errors of PINs.
  • because the center part of the display region DR is the recognition region KR, it becomes easier for the control portion 8 to recognize that the image of the predetermined button 19 was pointed by the user only when the user reliably pointed at the image of the intended button 19 . Therefore, in the input device 1 of this embodiment, it becomes possible to effectively suppress input errors of PINs.
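The recognition behavior of Embodiment 1 can be sketched in code. The following is a minimal illustration, not the embodiment's actual implementation: the button centers, both radii, and the circular shapes of DR and KR are assumed values chosen only to show how accepting input solely inside the narrower, centered recognition region KR suppresses mistaken inputs.

```python
import math

# Illustrative sketch: each button's display region DR is assumed to be a
# circle of radius DR_RADIUS around its center; the recognition region KR
# is a smaller concentric circle (KR narrower than DR, per the embodiment).
DR_RADIUS = 10.0   # assumed display-region radius (arbitrary units)
KR_RADIUS = 5.0    # assumed recognition-region radius

# Assumed button centers for a hypothetical keypad layout.
BUTTON_CENTERS = {
    "1": (0, 0), "2": (25, 0), "3": (50, 0),
    "4": (0, 25), "5": (25, 25), "6": (50, 25),
    "7": (0, 50), "8": (25, 50), "9": (50, 50),
    "0": (25, 75),
}

def recognize_button(fingertip_xy):
    """Return the label of the button whose recognition region KR contains
    the detected fingertip position, or None if the fingertip is outside
    every KR (even if it lies inside some display region DR)."""
    x, y = fingertip_xy
    for label, (cx, cy) in BUTTON_CENTERS.items():
        if math.hypot(x - cx, y - cy) <= KR_RADIUS:
            return label
    return None

# A touch near a button's center is recognized...
assert recognize_button((26, 1)) == "2"
# ...but a touch inside DR yet outside KR is ignored, suppressing input errors.
assert recognize_button((31, 4)) is None
```

The design point is that the dead zone between KR and the edge of DR forces the user to point unambiguously before a digit is accepted.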
  • FIG. 5 is a schematic diagram for explaining a configuration of the input device 1 according to Embodiment 2 of the present invention.
  • the same symbols are given to configurations similar to those of Embodiment 1.
  • Embodiment 1 and Embodiment 2 differ in the configuration of the detection mechanism 4 . Because of this difference, the recognition method by which the control portion 8 recognizes that the image of the predetermined button 19 was pointed by the user also differs. In the following, the configuration of the input device 1 according to Embodiment 2 will be explained, focusing mainly on the differences from Embodiment 1.
  • the image of the keypad is displayed as an aerial image in the aerial-image display region R when the PIN is to be input in the input portion 18 .
  • the images of the 15 buttons 19 are arranged in a matrix of 5 rows and 3 columns.
  • the vertical direction (row direction, V direction in FIG. 5 ) of the image of the plurality of buttons 19 arranged in a matrix is assumed to be a first direction
  • the lateral direction (column direction, W direction in FIG. 5 ) of the image of the plurality of buttons 19 arranged in the matrix is assumed to be a second direction.
  • the first direction and the second direction are orthogonal to each other.
  • the detection mechanism 4 has a first detection mechanism 24 for detecting the position of the user's fingertip in the first direction when each of the plurality of buttons 19 is pointed by the user's fingertip, and a second detection mechanism 25 for detecting the position of the user's fingertip in the second direction.
  • the detection mechanism 4 in this embodiment is constituted by the first detection mechanism 24 and the second detection mechanism 25 .
  • the first detection mechanism 24 is a transmissive sensor having a plurality of light emitting portions 26 and a plurality of light receiving portions 27 .
  • the second detection mechanism 25 is a transmissive sensor having a plurality of light emitting portions 28 and a plurality of light receiving portions 29 , similarly to the first detection mechanism 24 .
  • the light emitting portions 26 and 28 emit infrared light
  • the light receiving portions 27 and 29 receive the infrared light emitted from the light emitting portions 26 and 28 .
  • the first detection mechanism 24 and the second detection mechanism 25 are transmission-type infrared sensors.
  • the first detection mechanism 24 includes the same number of light emitting portions 26 and light receiving portions 27 as the number of rows of the 15 buttons 19 arranged in the matrix. In other words, the first detection mechanism 24 includes five light emitting portions 26 and five light receiving portions 27 , forming five pairs of light emitting portions 26 and light receiving portions 27 .
  • the five light emitting portions 26 are arranged in the first direction at a certain pitch, and the five light receiving portions 27 are arranged in the first direction at a certain pitch.
  • the light emitting portion 26 and the light receiving portion 27 forming a pair are disposed at the same position in the first direction. Moreover, the light emitting portion 26 and the light receiving portion 27 forming a pair are opposed and disposed so as to sandwich the aerial-image display region R in the second direction (that is, to sandwich the input portion 18 ).
  • the second detection mechanism 25 includes the same number of light emitting portions 28 and light receiving portions 29 as the number of columns of the 15 buttons 19 arranged in the matrix. In other words, the second detection mechanism 25 includes three light emitting portions 28 and three light receiving portions 29 , forming three pairs of light emitting portions 28 and light receiving portions 29 .
  • the three light emitting portions 28 are arranged in the second direction at a certain pitch, and the three light receiving portions 29 are arranged in the second direction at a certain pitch.
  • the light emitting portion 28 and the light receiving portion 29 forming a pair are disposed at the same position in the second direction.
  • the light emitting portion 28 and the light receiving portion 29 forming a pair are opposed and disposed so as to sandwich the aerial-image display region R in the first direction (that is, to sandwich the input portion 18 ).
  • the arrangement pitch of the light emitting portions 26 in the first direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the first direction displayed in the aerial-image display region R.
  • An optical axis L5 of the light emitted from the light emitting portion 26 (specifically, the infrared light) passes through the center of the image of the button 19 displayed in the aerial-image display region R. More specifically, when viewed from a direction orthogonal to the plane containing the aerial image, the optical axis L5 passes through the center of the image of the button 19 displayed in the aerial-image display region R.
  • the arrangement pitch of the light emitting portions 28 in the second direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the second direction displayed in the aerial-image display region R.
  • An optical axis L6 of the light emitted from the light emitting portion 28 (specifically, the infrared light) passes through the center of the image of the button 19 displayed in the aerial-image display region R. More specifically, when viewed from a direction orthogonal to the plane containing the aerial image, the optical axis L6 passes through the center of the image of the button 19 displayed in the aerial-image display region R.
  • the control portion 8 recognizes the image of the numeric button 19 that was pointed in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25 .
  • the detection mechanism 4 includes the first detection mechanism 24 for detecting the position of the user's fingertip in the first direction when each of the images of the plurality of buttons 19 is pointed by the user's fingertip and the second detection mechanism 25 for detecting the position of the user's fingertip in the second direction, and the control portion 8 recognizes the image of the numeric button 19 that was pointed in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25 .
  • the first detection mechanism 24 and the second detection mechanism 25 are transmissive sensors having a plurality of the light emitting portions 26 , 28 and a plurality of the light receiving portions 27 , 29 .
  • as compared with a case where the first detection mechanism 24 and the second detection mechanism 25 are reflective sensors, the position of the fingertip when the user's fingertip points at the image of the predetermined button 19 in the input portion 18 can be identified more accurately on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25 .
  • the optical axes L5 and L6 of the light emitted from the light emitting portions 26 , 28 pass through the center of the image of the button 19 displayed in the aerial-image display region R.
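The two-axis scheme of Embodiment 2 (one beam per row and one per column, with each optical axis passing through the button centers) can be sketched as follows. The 5-row-by-3-column key labels and the rule of accepting input only when exactly one beam per axis is interrupted are illustrative assumptions; the embodiment does not specify how ambiguous interruptions are handled.

```python
# Illustrative sketch of the Embodiment 2 recognition scheme: the first
# detection mechanism provides one beam per row (5 rows) and the second
# one beam per column (3 columns). Because each beam's optical axis runs
# through the button centers of its row/column, a button is identified by
# the pair (interrupted row beam, interrupted column beam).
ROWS, COLS = 5, 3

# Assumed keypad layout for the 5x3 matrix (hypothetical labels).
KEYPAD = [
    ["1", "2", "3"],
    ["4", "5", "6"],
    ["7", "8", "9"],
    ["*", "0", "#"],
    ["<", "OK", ">"],
]

def recognize(row_blocked, col_blocked):
    """row_blocked / col_blocked: booleans, True where the fingertip
    interrupts that beam. Returns the pointed button's label, or None
    when the fingertip does not cleanly block one beam on each axis."""
    rows = [i for i, b in enumerate(row_blocked) if b]
    cols = [j for j, b in enumerate(col_blocked) if b]
    if len(rows) == 1 and len(cols) == 1:
        return KEYPAD[rows[0]][cols[0]]
    return None  # between buttons, or no fingertip detected

# Fingertip interrupting row beam 1 and column beam 2 -> button "6".
assert recognize([False, True, False, False, False],
                 [False, False, True]) == "6"
# A fingertip straddling two rows is rejected rather than guessed.
assert recognize([True, True, False, False, False],
                 [False, True, False]) is None
```

Requiring both axes to agree plays the same error-suppressing role here that the narrower recognition region KR plays in Embodiment 1.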
  • FIG. 6 is a schematic diagram for explaining a configuration of the detection mechanism 4 according to a variation of Embodiment 2.
  • the same reference symbols are used to denote the components similar to those of the embodiment described above.
  • the first detection mechanism 24 may be a reflective infrared sensor having a plurality of light emitting portions 26 and a plurality of light receiving portions 27 .
  • the second detection mechanism 25 may be a reflective infrared sensor having a plurality of light emitting portions 28 and a plurality of light receiving portions 29 .
  • the first detection mechanism 24 includes the same number of the light emitting portions 26 and the light receiving portions 27 as the number of rows of 15 buttons 19 arranged in a matrix
  • the second detection mechanism 25 includes the same number of the light emitting portions 28 and the light receiving portions 29 as the number of columns of 15 buttons 19 arranged in a matrix.
  • the arrangement pitch of the light emitting portions 26 in the first direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the first direction displayed in the aerial-image display region R, and the optical axis L5 of the light emitted from the light emitting portion 26 passes through the center of the image of the button 19 displayed in the aerial-image display region R.
  • the arrangement pitch of the light emitting portions 28 in the second direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the second direction displayed in the aerial-image display region R, and the optical axis L6 of the light emitted from the light emitting portion 28 passes through the center of the image of the button 19 displayed in the aerial-image display region R.
  • the control portion 8 recognizes the image of the numeric button 19 that was pointed in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25 .
  • a part shifted from the center in the display region DR may be the recognition region KR.
  • the recognition region KR is a circular region, but the recognition region KR may be a non-circular region such as an oval-shaped region or a polygonal region.
  • the optical axis L5 of the light emitted from the light emitting portion 26 may pass through a position shifted from the center of the image of the button 19 displayed in the aerial-image display region R
  • the optical axis L6 of the light emitted from the light emitting portion 28 may pass through a position shifted from the center of the image of the button 19 displayed in the aerial-image display region R.
  • the first direction and the second direction are orthogonal to each other in the plane containing the aerial image, but the first direction and the second direction do not have to be orthogonal.
  • the first detection mechanism 24 and the second detection mechanism 25 may be transmissive sensor arrays.
  • the image of the keypad displayed in the aerial-image display region R may be configured by images of 12 buttons 19 arranged in 4 rows and 3 columns, for example.
  • information other than the PIN may be input in the input device 1 .
  • the aerial image displayed in the aerial-image display region R may be an image other than the keypad. Even in this case, the aerial image contains an image of a button to identify the information input in the input portion 18 .

Abstract

An input device includes an optical detection mechanism detecting a position of the user's fingertip in the aerial-image display region in which an aerial image is displayed and a control portion for controlling the input device. In the input device, the aerial-image display region is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying information input in the input portion. Assuming that a region, within a display region of an image of the button displayed in the aerial-image display region, in which the control portion recognizes on the basis of the detection result of the detection mechanism that the image of the predetermined button was pointed by the user is a recognition region, the recognition region is narrower than the display region in this input device.

Description

    CROSS REFERENCE TO RELATED APPLICATIONS
  • This is the U.S. national stage of application No. PCT/JP2021/020743, filed on May 31, 2021. Priority under 35 U.S.C. § 119(e) is claimed to U.S. Provisional Application No. 63/054,0799 filed Jul. 22, 2020, the disclosure of which is also incorporated herein by reference.
  • TECHNICAL FIELD
  • The present invention relates to an input device for users to input information using their fingertips. The present invention also relates to a control method of such input devices.
  • BACKGROUND ART
  • Conventionally, automated transaction devices such as ATMs (Automated Teller Machines) including an aerial-image display device and a PIN (Personal Identification Number) display and input portion are known (see Patent Literature 1, for example). In the automated transaction device described in Patent Literature 1, the aerial-image display device includes an aerial-image forming mechanism and a display portion. The PIN display and input portion includes a PIN display portion and a PIN input portion. On the display portion, a keypad for inputting the PIN is displayed. The aerial-image forming mechanism projects the keypad displayed on the display portion into a space so as to form an image as an aerial image to display it on the PIN display portion.
  • In the automated transaction device in Patent Literature 1, the PIN input portion includes a detection mechanism which detects an operation performed by the user on the aerial image of the keypad displayed in the PIN display portion. The detection mechanism is, for example, an infrared sensor, a camera or the like that detects a position of the user's fingertip in a plane containing the aerial image of the keypad displayed on the PIN display portion. In the automated transaction device of Patent Literature 1, the PIN can be input by the user moving his/her fingertips sequentially to predetermined positions on the aerial image of the keypad displayed on the PIN display portion. In other words, in this automated transaction device, the user can input a PIN by sequentially pointing a fingertip at a predetermined number on the keypad displayed as an aerial image on the PIN display portion.
  • CITATION LIST Patent Literature
    • [Patent Literature 1] Japanese Unexamined Patent Application Publication 2020-134843
    SUMMARY OF THE INVENTION Problems to be Solved by the Invention
  • In the automated transaction device described in Patent Literature 1, if PIN input errors are repeated, subsequent procedures cannot be performed anymore in some cases. Therefore, it is desirable that this automated transaction device is less prone to PIN input errors.
  • Thus, an object of at least an embodiment of the present invention is, in an input device for a user to input information by using an aerial image displayed in an aerial-image display region, to provide an input device which can suppress input errors of information such as PINs. Moreover, an object of at least an embodiment of the present invention is, in an input device for a user to input information by using an aerial image displayed in an aerial-image display region, to provide a control method of an input device, which can suppress input errors of information such as PINs.
  • Means for Solving the Problem
  • In order to solve the above problem, an input device of an aspect of the present invention is an input device inputting information by using a user's fingertip, characterized by including a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form an image as an aerial image, an optical detection mechanism detecting a position of the user's fingertip in an aerial-image display region, which is a region in which the aerial image is displayed, and a control portion for controlling the input device, in which the aerial-image display region is an input portion for inputting information, the aerial image includes images of a plurality of buttons for identifying information input in the input portion, and by assuming that a region in a display region for the images of the buttons displayed in the aerial-image display region, which is a region in which the control portion recognizes, on the basis of a detection result of the detection mechanism, that the image of a predetermined button was pointed by the user, is a recognition region, the recognition region is narrower than the display region.
  • In the input device of this aspect, the aerial-image display region in which the aerial image is displayed is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying information input in the input portion. Moreover, in this aspect, by assuming that a region in the display region for the images of the buttons displayed in the aerial-image display region, which is a region in which the control portion recognizes, on the basis of a detection result of the detection mechanism, that the image of a predetermined button was pointed by the user, is a recognition region, the recognition region is narrower than the display region. Thus, in this aspect, it is possible to cause the control portion to recognize that the image of the predetermined button was pointed by the user, only when the user reliably pointed at the image of the intended button. Therefore, in the input device of this aspect, it becomes possible to suppress input errors of information such as PINs.
  • In this aspect, it is preferable that a center part of the display region is the recognition region. By configuring as above, it becomes easier for the control portion to recognize that a predetermined button image was pointed by the user, only when the user reliably pointed at the image of the intended button.
  • In this aspect, the detection mechanism is, for example, a reflective sensor array.
  • In order to solve the above problem, an input device of another aspect of the present invention is an input device inputting information by using a user's fingertip, characterized by including a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form an image as an aerial image, and an optical detection mechanism detecting a position of the user's fingertip in an aerial-image display region, which is a region in which the aerial image is displayed, in which the aerial-image display region serving as an input portion for inputting information, the aerial image includes images of a plurality of buttons for identifying information input in the input portion, and by assuming that directions crossing each other in a plane including the aerial image are a first direction and a second direction, the detection mechanism includes a first detection mechanism detecting a position in the first direction of the user's fingertip and a second detection mechanism detecting a position in the second direction of the user's fingertip, when each of the plurality of button images was pointed by the user's fingertip.
  • In the input device of this aspect, the aerial-image display region in which the aerial image is displayed is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying information input in the input portion. Moreover, in this aspect, by assuming that directions crossing each other in a plane including the aerial image are a first direction and a second direction, the detection mechanism includes a first detection mechanism detecting a position in the first direction of the user's fingertip and a second detection mechanism detecting a position in the second direction of the user's fingertip, when each of images of the plurality of buttons was pointed by the user's fingertip.
  • Thus, in this aspect, on the basis of the detection result of the first detection mechanism and the detection result of the second detection mechanism (that is, on the basis of the detection results of the position in the two directions of the user's fingertip), it is possible to cause the input device to recognize that the image of the predetermined button was pointed by the user's fingertip, only when the user's fingertip reliably pointed at the intended button image. Therefore, in the input device of this aspect, it becomes possible to suppress input errors of information such as PINs.
  • In this aspect, the first and second detection mechanisms are preferably transmissive sensors having a plurality of light emitting portions and a plurality of light receiving portions. By configuring as above, as compared with a case where the first detection mechanism and the second detection mechanism are reflective sensors, the position of the user's fingertip in the input portion can be identified more accurately on the basis of the detection result of the first detection mechanism and the detection result of the second detection mechanism. Therefore, on the basis of the detection result of the first detection mechanism and the detection result of the second detection mechanism, it becomes possible to cause the input device to recognize that the image of the predetermined button was pointed by the user's fingertip more easily, only when the user's fingertip reliably pointed at the intended button image.
  • In this aspect, the optical axis of the light emitted from the light emitting portion preferably passes through the center of the image of the button displayed in the aerial-image display region. By configuring as above, on the basis of the detection result of the first detection mechanism and the detection result of the second detection mechanism, it becomes possible to cause the input device to recognize that the image of the predetermined button was pointed by the user's fingertip more easily, only when the user's fingertip reliably pointed at the intended button image.
  • In this aspect, for example, the first direction and the second direction are orthogonal to each other, and images of the plurality of buttons displayed in the aerial-image display region are arranged in a matrix.
  • In this aspect, for example, the aerial image is an image of a keypad containing an image of a plurality of numeric buttons.
  • Moreover, in order to solve the above problem, a control method of an input device in a further different aspect of the present invention is a control method of an input device including a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form an image as an aerial image, and an optical detection mechanism detecting a position of a user's fingertip in the aerial-image display region, which is a region in which the aerial image is displayed, the aerial-image display region serving as an input portion for inputting information by using the user's fingertip, and the aerial image including images of a plurality of buttons for identifying information input in the input portion, in which it is recognized that the image of the button was pointed on the basis of a detection result of the detection mechanism, when the user points at a recognition region, which is narrower than the display region, within the display region of the image of the button displayed in the aerial-image display region.
  • In this aspect, the aerial-image display region in which the aerial image is displayed is an input portion for inputting information, and the aerial image includes images of a plurality of buttons for identifying the information input at the input portion. Moreover, in the control method of the input device of this aspect, when the user points at a recognition region, which is narrower than the display region within the display region of the image of the button displayed in the aerial-image display region, it is recognized on the basis of the detection result of the detection mechanism that the image of the button was pointed. Thus, by controlling the input device by the control method of this aspect, it becomes possible to cause the input device to recognize that the image of the predetermined button was pointed by the user, only when the user reliably pointed at the image of the intended button. Therefore, by controlling the input device by the control method of this aspect, it becomes possible to suppress input errors of information such as PINs.
  • Effect of the Invention
  • As described above, in the present invention, in an input device for a user to input information using an aerial image displayed in the aerial-image display region, it becomes possible to suppress input errors of information such as PINs.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • Embodiments will now be described, by way of example only, with reference to the accompanying drawings which are meant to be exemplary, not limiting, and wherein like elements are numbered alike in several Figures, in which:
  • FIG. 1 is a schematic diagram for explaining a configuration of an input device according to Embodiment 1 of the present invention.
  • FIG. 2 is a block diagram for explaining a configuration of the input device shown in FIG. 1 .
  • FIG. 3 is a schematic diagram for explaining a configuration of an aerial-image display device used in the input device shown in FIG. 1 .
  • FIG. 4A is a diagram showing an example of an aerial image displayed in the aerial-image display region shown in FIG. 1 , and FIG. 4B is a diagram for explaining a display region of button images and a recognition region shown in FIG. 4A.
  • FIG. 5 is a schematic diagram for explaining a configuration of an input device according to Embodiment 2 of the present invention.
  • FIG. 6 is a schematic diagram for explaining a configuration of a detection mechanism according to a variation of Embodiment 2.
  • MODE FOR CARRYING OUT THE INVENTION
  • In the following, embodiments of the present invention will be described with reference to the drawings.
  • Configuration of Input Device
  • FIG. 1 is a schematic diagram for explaining a configuration of an input device 1 according to Embodiment 1 of the present invention. FIG. 2 is a block diagram for explaining a configuration of the input device 1 shown in FIG. 1 . FIG. 3 is a schematic diagram for explaining a configuration of an aerial-image display device 3 used in the input device 1 shown in FIG. 1 . FIG. 4A is a diagram showing an example of an aerial image displayed in an aerial-image display region R shown in FIG. 1 , and FIG. 4B is a diagram for explaining a display region DR of images of buttons 19 and a recognition region KR shown in FIG. 4A.
  • The input device 1 in this embodiment is a device for inputting information using a user's fingertip and is used in ATMs, authentication devices for credit card and other payments, automatic ticketing machines, vending machines, or access control devices, for example. In the input device 1 of this embodiment, a PIN is input. The input device 1 has an aerial-image display device 3 which displays an aerial image in a three-dimensional space, an optical detection mechanism 4 for detecting a position of the user's fingertip in the aerial-image display region R, which is a region in which the aerial image is displayed, and an enclosure 5 in which the aerial-image display device 3 and the detection mechanism 4 are accommodated. Moreover, the input device 1 includes a control portion 8 for controlling the input device 1.
  • The aerial-image display device 3 has a display mechanism 11 having a display surface 11 a for displaying images, and an aerial-image forming mechanism 12 that projects the image displayed on the display surface 11 a into a space to form an image as an aerial image. The display mechanism 11 and the aerial-image forming mechanism 12 are accommodated in the enclosure 5. The aerial-image forming mechanism 12 has a beam splitter 13 and a retroreflective material 14. In the following explanation, a Y-direction in FIG. 3 , which is orthogonal to an up-down direction (vertical direction), is referred to as a left-right direction, and a direction orthogonal to the up-down direction and the left-right direction is referred to as a front-back direction. In addition, an X1-direction side in FIG. 3 , which is one side in the front-back direction, is assumed to be a “front” side, and an X2-direction side in FIG. 3 , which is a side opposite to that, is assumed to be a “back” side. In this embodiment, a user standing on a front side of the input device 1 performs a predetermined operation on a front surface side of the input device 1.
  • The display mechanism 11 is, for example, a liquid crystal display or an organic EL (electroluminescent) display, and the display surface 11 a is a display screen. The display surface 11 a faces diagonally forward and downward. The beam splitter 13 is formed as a flat plate. The beam splitter 13 is disposed on the front side of the display mechanism 11. The beam splitter 13 reflects a part of light emitted from the display surface 11 a. That is, a surface on one side of the beam splitter 13 is a reflective surface 13 a that reflects a part of the light emitted from the display surface 11 a. The reflective surface 13 a faces diagonally rearward and downward.
  • The retroreflective material 14 is formed as a flat plate. The retroreflective material 14 is disposed on a lower side of the display mechanism 11 and is disposed on a rear side of the beam splitter 13. To the retroreflective material 14, the light reflected by the beam splitter 13 is incident. The retroreflective material 14 reflects the incident light in the same direction as the incident direction toward the beam splitter 13. In other words, the surface on the one side of the retroreflective material 14 is a retroreflective surface 14 a, to which the light reflected by the beam splitter 13 is incident, and also reflects the incident light in the same direction as the incident direction toward the beam splitter 13. A quarter-wavelength plate is attached to the retroreflective surface 14 a. The retroreflective surface 14 a faces diagonally forward and upward.
  • A part of the light emitted from the display surface 11 a of the display mechanism 11 is reflected by the reflective surface 13 a of the beam splitter 13 and enters the retroreflective surface 14 a of the retroreflective material 14. The light reflected by the reflective surface 13 a is directed diagonally rearward and downward. The light incident to the retroreflective surface 14 a is reflected in the same direction as the incident direction of the light to the retroreflective surface 14 a. The light reflected by the retroreflective surface 14 a goes diagonally forward and upward and passes through the beam splitter 13. In this embodiment, an optical axis L1 of the light emitted from the display surface 11 a and an optical axis L2 of the light reflected by the beam splitter 13 are orthogonal. The optical axis of the light reflected by the retroreflective material 14 matches the optical axis L2.
  • The light transmitted through the beam splitter 13 forms an aerial image in the aerial-image display region R. The aerial-image display region R is disposed diagonally forward and upward of the beam splitter 13. A user standing in front of the input device 1 recognizes the aerial image formed in the aerial-image display region R as an image that inclines downward toward the front side.
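As a hedged geometric sketch of the optical path above: because the retroreflective material 14 returns each ray along its incident direction, an arrangement of this kind generally forms the aerial image at the position plane-symmetric to the display surface with respect to the beam splitter. The helper below reflects a display point across the beam-splitter plane; the function name and the example vectors are illustrative assumptions, not dimensions taken from the patent.

```python
def mirror_point(p, plane_point, unit_normal):
    """Reflect the 3D point p across the plane passing through plane_point
    whose unit normal is unit_normal (all given as (x, y, z) tuples)."""
    # Signed distance from p to the plane along the normal.
    d = sum((pi - qi) * ni for pi, qi, ni in zip(p, plane_point, unit_normal))
    # Move twice that distance back through the plane.
    return tuple(pi - 2.0 * d * ni for pi, ni in zip(p, unit_normal))
```

For instance, a display point one unit in front of a beam-splitter plane through the origin (normal along z) maps to the symmetric point one unit behind it: `mirror_point((0.0, 0.0, 1.0), (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))` gives `(0.0, 0.0, -1.0)`.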
  • The enclosure 5 is formed, for example, in the shape of a cuboid. The enclosure 5 includes a frame body 17 that surrounds the aerial-image display region R. The frame body 17 is formed as a flat plate having a rectangular or square frame shape. The frame body 17 constitutes a front upper surface of the enclosure 5 and is inclined downward toward the front side. An inner peripheral side of the frame body 17 is an opening portion 17 a that leads to an inside of the enclosure 5. The opening portion 17 a is formed having a rectangular or square shape. The aerial-image display region R is formed in the opening portion 17 a. The aerial-image display region R serves as an input portion 18 for the user to input information using the fingertips. In this embodiment, the PIN is input in the input portion 18.
  • The detection mechanism 4 detects a position of the user's fingertip in the aerial-image display region R, as described above. In other words, the input portion 18 is included in a detection range of the detection mechanism 4. The detection mechanism 4 is a reflective sensor array. Specifically, the detection mechanism 4 is a reflective infrared sensor array including a plurality of light emitting portions that emit infrared light and a plurality of light receiving portions that receive the infrared light emitted from the light emitting portions and reflected by the user's fingertip. Moreover, the detection mechanism 4 is a line sensor in which the light emitting portion and the light receiving portion are arranged alternately and in a straight line. The detection mechanism 4 is disposed on the side of the opening portion 17 a. The detection mechanism 4 detects the position of the user's fingertip in a plane containing the aerial image (that is, in the plane containing the input portion 18). Moreover, the detection mechanism 4 detects the position of the user's fingertip in the entire aerial-image display region R (that is, in the entire input portion 18).
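The reflective line-sensor arrangement above can be sketched as follows: the index of the sensor element that sees a reflection gives one coordinate of the fingertip in the plane of the aerial image, and the measured range to the reflecting fingertip gives the other. This is a minimal illustration of the idea; the function name, the element pitch, and the coordinate convention are assumptions, not values from the patent.

```python
ELEMENT_PITCH_MM = 5.0  # assumed spacing between emitter/receiver pairs


def fingertip_position(element_index: int, distance_mm: float):
    """Map a triggered line-sensor element and its range reading to an
    (x, y) position in the plane containing the aerial image, with the
    sensor array lying along the x axis at y = 0."""
    x = element_index * ELEMENT_PITCH_MM  # lateral position along the array
    y = distance_mm                       # depth into the input portion
    return (x, y)
```

A full implementation would also need to debounce transient readings and handle the case where several adjacent elements see the reflection at once.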
  • When a PIN is to be input in the input portion 18, the display mechanism 11 displays an image of a keypad for inputting the PIN on the display surface 11 a, and the aerial-image forming mechanism 12 displays the keypad image displayed on the display surface 11 a as an aerial image in the aerial-image display region R (see FIG. 4A). In other words, the aerial image displayed in the aerial-image display region R when the PIN is to be input is the image of the keypad. The image of the keypad, which is an aerial image, contains images of a plurality of buttons 19 for identifying the information (that is, PIN) to be input in the input portion 18.
  • The images of the plurality of buttons 19 include images of a plurality of numeric buttons 19. Specifically, the images of the plurality of buttons 19 include images of 10 numeric buttons 19 from “0” to “9”. Moreover, the images of the plurality of buttons 19 include images of five non-numeric buttons 19. The images of the plurality of buttons 19 are arranged in a matrix. Specifically, the images of the 15 buttons 19 are arranged in a matrix of 5 rows and 3 columns.
  • The user inputs the PIN by using the keypad image displayed in the aerial-image display region R. Specifically, the user inputs the PIN by sequentially moving the fingertip to a position of the images of the predetermined numeric buttons 19 displayed in the aerial-image display region R. In other words, the user inputs the PIN by sequentially pointing the fingertip at the image of the predetermined numeric button 19 in the input portion 18 (pointing operation).
  • The control portion 8 recognizes the image of the numeric button 19 pointed at in the pointing operation on the basis of the detection result of the detection mechanism 4 (that is, the detected position of the user's fingertip). Let the recognition region KR be the region, within the display region DR of the image of the button 19 displayed in the aerial-image display region R, in which the control portion 8 recognizes, on the basis of the detection result of the detection mechanism 4, that the image of the predetermined button 19 was pointed at by the user's fingertip. In this embodiment, the recognition region KR is narrower than the display region DR (see FIG. 4B). In other words, the control portion 8 recognizes, on the basis of the detection result of the detection mechanism 4, that the image of the predetermined button 19 was pointed at only when the user's fingertip points at the recognition region KR, which is narrower than the display region DR.
  • For example, in this embodiment, the entire image of the button 19 surrounded by a solid-line circle is the display region DR, and the region surrounded by a broken-line circle in the image of the button 19 is the recognition region KR (see FIG. 4B). In other words, in this embodiment, the center part of the display region DR is the recognition region KR. The control portion 8 recognizes, on the basis of the detection result of the detection mechanism 4, that the image of the predetermined button 19 was pointed at when the user points at the recognition region KR, which is the region surrounded by the broken-line circle, in the input portion 18. The recognition region KR is set by application software installed in the control portion 8.
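The recognition rule above can be expressed as a simple hit test: a pointing event is accepted only when the fingertip lies inside the circular recognition region KR at the center of the larger circular display region DR. The button positions, radii, and function name below are illustrative assumptions, not figures from the patent.

```python
import math

BUTTONS = {  # label -> center (x, y) of the button's display region DR
    "1": (0.0, 0.0),
    "2": (30.0, 0.0),
    "3": (60.0, 0.0),
}
DR_RADIUS = 12.0  # radius of the displayed button image (display region DR)
KR_RADIUS = 6.0   # narrower recognition region KR at the center of DR


def recognize(finger_x: float, finger_y: float):
    """Return the label of the button whose recognition region KR contains
    the fingertip, or None when the fingertip is outside every KR."""
    for label, (cx, cy) in BUTTONS.items():
        if math.hypot(finger_x - cx, finger_y - cy) <= KR_RADIUS:
            return label
    return None
```

With these assumed values, a fingertip near the center of "1" is recognized (`recognize(1.0, 2.0)` returns `"1"`), while a fingertip straddling the gap between "1" and "2" (`recognize(15.0, 0.0)`) returns `None`, which is exactly the behavior that suppresses input errors.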
  • When the control portion 8 recognizes, on the basis of the detection result of the detection mechanism 4, that the image of a predetermined numeric button 19 was pointed at in the pointing operation, it transmits a control instruction to the display mechanism 11. The display mechanism 11 then displays an image of, for example, a “*” mark on an upper-side part of the images of the plurality of buttons 19 on the display surface 11 a, and the aerial-image forming mechanism 12 displays the image with the “*” mark displayed on the display surface 11 a as an aerial image in the aerial-image display region R (see FIG. 4A and FIG. 4B).
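The feedback loop described above can be sketched as a small state holder: each recognized digit is recorded, and the display mechanism is told to render one "*" per entered digit above the keypad image. The class and method names are assumptions made for illustration.

```python
class PinEntry:
    """Accumulates recognized digits and produces the masked string that
    the display mechanism should show in the aerial image."""

    def __init__(self) -> None:
        self._digits: list[str] = []

    def on_button_recognized(self, digit: str) -> str:
        """Record one recognized numeric button and return the mask
        ("*" per entered digit) to display above the keypad image."""
        self._digits.append(digit)
        return "*" * len(self._digits)

    def pin(self) -> str:
        """Return the PIN entered so far (kept internal in a real device)."""
        return "".join(self._digits)
```

In a real PIN pad the accumulated digits would of course be handled in secure memory rather than returned as a plain string; the sketch only shows the display-side masking.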
  • Main Effect of this Embodiment
  • As described above, in this embodiment, the aerial-image display region R is the input portion 18 for inputting PINs, and the aerial image includes images of the plurality of buttons 19 for identifying PINs input in the input portion 18. Moreover, in this embodiment, the recognition region KR is narrower than the display region DR of the image of the button 19, and the control portion 8 recognizes that the image of the button 19 was pointed on the basis of the detection result of the detection mechanism 4 when the user pointed at the recognition region KR, which is narrower than the display region DR.
  • Thus, in this embodiment, even if the user points at a space between the images of two buttons 19, for example, by straddling the adjacent images of the button 19 with the number “1” and the button 19 with the number “2”, the control portion 8 does not recognize that the image of a predetermined button 19 was pointed at. In other words, in this embodiment, the control portion 8 recognizes that the image of a predetermined button 19 was pointed at by the user only when the user reliably pointed at the image of the intended button 19. Therefore, the input device 1 of this embodiment can suppress PIN input errors.
  • In particular, in this embodiment, since the center part of the display region DR is the recognition region KR, it becomes easier for the control portion 8 to recognize that the image of the predetermined button 19 was pointed by the user, only when the user reliably pointed at the image of the intended button 19. Therefore, in the input device 1 of this embodiment, it becomes possible to effectively suppress input errors of PINs.
  • Embodiment 2
  • Configuration of Input Device
  • FIG. 5 is a schematic diagram for explaining a configuration of the input device 1 according to Embodiment 2 of the present invention. In FIG. 5 , components similar to those of Embodiment 1 are denoted by the same reference symbols.
  • Embodiment 1 and Embodiment 2 differ in the configuration of the detection mechanism 4 and, consequently, in the method by which the control portion 8 recognizes that the image of the predetermined button 19 was pointed at by the user. In the following, the configuration of the input device 1 according to Embodiment 2 is explained with a focus on the differences from Embodiment 1.
  • Similarly to Embodiment 1, in Embodiment 2, the image of the keypad is displayed as an aerial image in the aerial-image display region R when the PIN is to be input in the input portion 18. In the keypad image, the images of the 15 buttons 19 are arranged in a matrix of 5 rows and 3 columns. In the following explanation, in the plane containing the aerial image (that is, the plane containing the keypad image), the vertical direction of the images of the plurality of buttons 19 arranged in the matrix (row direction, V direction in FIG. 5 ) is referred to as the first direction, and the lateral direction (column direction, W direction in FIG. 5 ) is referred to as the second direction. In the plane containing the aerial image, the first direction and the second direction are orthogonal to each other.
  • The detection mechanism 4 has a first detection mechanism 24 for detecting the position of the user's fingertip in the first direction when each of the images of the plurality of buttons 19 is pointed at by the user's fingertip, and a second detection mechanism 25 for detecting the position of the user's fingertip in the second direction. The detection mechanism 4 in this embodiment is constituted by the first detection mechanism 24 and the second detection mechanism 25. The first detection mechanism 24 is a transmissive sensor having a plurality of light emitting portions 26 and a plurality of light receiving portions 27. Similarly, the second detection mechanism 25 is a transmissive sensor having a plurality of light emitting portions 28 and a plurality of light receiving portions 29. The light emitting portions 26 and 28 emit infrared light, and the light receiving portions 27 and 29 receive the infrared light emitted from the light emitting portions 26 and 28. In other words, the first detection mechanism 24 and the second detection mechanism 25 are transmission-type infrared sensors.
  • The first detection mechanism 24 includes the same numbers of light emitting portions 26 and light receiving portions 27 as the number of rows of the 15 buttons 19 arranged in the matrix. That is, the first detection mechanism 24 includes five light emitting portions 26 and five light receiving portions 27, forming five emitter-receiver pairs. The five light emitting portions 26 are arranged in the first direction at a certain pitch, and the five light receiving portions 27 are arranged in the first direction at a certain pitch. The light emitting portion 26 and the light receiving portion 27 forming a pair are disposed at the same position in the first direction. Moreover, the light emitting portion 26 and the light receiving portion 27 forming a pair are disposed opposite each other so as to sandwich the aerial-image display region R (that is, the input portion 18) in the second direction.
  • The second detection mechanism 25 includes the same numbers of light emitting portions 28 and light receiving portions 29 as the number of columns of the 15 buttons 19 arranged in the matrix. That is, the second detection mechanism 25 includes three light emitting portions 28 and three light receiving portions 29, forming three emitter-receiver pairs. The three light emitting portions 28 are arranged in the second direction at a certain pitch, and the three light receiving portions 29 are arranged in the second direction at a certain pitch. The light emitting portion 28 and the light receiving portion 29 forming a pair are disposed at the same position in the second direction. Moreover, the light emitting portion 28 and the light receiving portion 29 forming a pair are disposed opposite each other so as to sandwich the aerial-image display region R (that is, the input portion 18) in the first direction.
  • The arrangement pitch of the light emitting portions 26 in the first direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the first direction displayed in the aerial-image display region R. An optical axis L5 of the light emitted from the light emitting portion 26 (specifically, the infrared light) passes through the center of the image of the button 19 displayed in the aerial-image display region R. More specifically, when viewed from a direction orthogonal to the plane containing the aerial image, the optical axis L5 passes through the center of the image of the button 19 displayed in the aerial-image display region R.
  • The arrangement pitch of the light emitting portions 28 in the second direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the second direction displayed in the aerial-image display region R. An optical axis L6 of the light emitted from the light emitting portion 28 (specifically, the infrared light) passes through the center of the image of the button 19 displayed in the aerial-image display region R. More specifically, when viewed from a direction orthogonal to the plane containing the aerial image, the optical axis L6 passes through the center of the image of the button 19 displayed in the aerial-image display region R.
  • In this embodiment, when the user performs the pointing operation to point at the center part of the image of the predetermined numeric button 19 in the input portion 18, the infrared light incident on one of the five light receiving portions 27 is shielded, and the infrared light incident on one of the three light receiving portions 29 is also shielded. The control portion 8 recognizes the image of the numeric button 19 pointed at in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25.
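The decoding step of Embodiment 2 can be sketched as a lookup: exactly one of the five row beams and exactly one of the three column beams is interrupted, and the (row, column) pair selects a button from the 5 x 3 matrix. The specific keypad layout below is an assumption; the patent states only that there are 10 numeric and 5 non-numeric buttons.

```python
# Assumed 5 x 3 layout of the 15 buttons 19 (rows x columns).
KEYPAD = [
    ["1", "2", "3"],
    ["4", "5", "6"],
    ["7", "8", "9"],
    ["clear", "0", "enter"],
    ["back", "cancel", "help"],
]


def decode(row_blocked: list, col_blocked: list):
    """Return the pointed button when exactly one row beam and exactly one
    column beam are shielded by the fingertip; otherwise return None."""
    if row_blocked.count(True) != 1 or col_blocked.count(True) != 1:
        return None  # no beam, or several beams, interrupted: ambiguous
    return KEYPAD[row_blocked.index(True)][col_blocked.index(True)]
```

Returning `None` for zero or multiple interrupted beams mirrors the error-suppression goal: an ambiguous reading (e.g. a fingertip between two beams) produces no input rather than a wrong one.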
  • Main Effect of this Embodiment
  • As described above, in this embodiment, the detection mechanism 4 includes the first detection mechanism 24 for detecting the position of the user's fingertip in the first direction when each of the images of the plurality of buttons 19 is pointed by the user's fingertip and the second detection mechanism 25 for detecting the position of the user's fingertip in the second direction, and the control portion 8 recognizes the image of the numeric button 19 that was pointed in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25.
  • Thus, in this embodiment, on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25 (that is, on the basis of the detected position of the user's fingertip in the two directions), it is possible to cause the control portion 8 to recognize that the image of the predetermined button 19 was pointed at by the user's fingertip only when the user's fingertip reliably pointed at the image of the intended button 19. Therefore, the input device 1 of this embodiment can suppress PIN input errors, similarly to Embodiment 1.
  • In this embodiment, the first detection mechanism 24 and the second detection mechanism 25 are transmissive sensors having a plurality of the light emitting portions 26, 28 and a plurality of the light receiving portions 27, 29. Thus, in this embodiment, as compared with the case where the first detection mechanism 24 and the second detection mechanism 25 are reflective sensors, on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25, the position of the fingertip when the user's fingertip points at the image of the predetermined button 19 in the input portion 18 can be identified more accurately. Therefore, in this embodiment, on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25, it becomes easier to cause the control portion 8 to recognize that the image of the predetermined button 19 was pointed by the user's fingertip, only when the user's fingertip reliably pointed at the image of the intended button 19. Therefore, in the input device 1 of this embodiment, it becomes possible to effectively suppress input errors of PINs.
  • In this embodiment, the optical axes L5 and L6 of the light emitted from the light emitting portions 26, 28 pass through the center of the image of the button 19 displayed in the aerial-image display region R. Thus, in this embodiment, on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25, it becomes easier to cause the input device to recognize that the image of the predetermined button 19 was pointed by the user's fingertip, only when the user's fingertip reliably pointed at the image of the intended button 19. Therefore, in the input device 1 of this embodiment, it becomes possible to more effectively suppress input errors of PINs.
  • Variation of Detection Mechanism
  • FIG. 6 is a schematic diagram for explaining a configuration of the detection mechanism 4 according to a variation of Embodiment 2. In FIG. 6 , the same reference symbols are used to denote the components similar to those of the embodiment described above.
  • In Embodiment 2, the first detection mechanism 24 may be a reflective infrared sensor having a plurality of light emitting portions 26 and a plurality of light receiving portions 27. Moreover, the second detection mechanism 25 may be a reflective infrared sensor having a plurality of light emitting portions 28 and a plurality of light receiving portions 29. In this case, the first detection mechanism 24 includes the same number of the light emitting portions 26 and the light receiving portions 27 as the number of rows of 15 buttons 19 arranged in a matrix, and the second detection mechanism 25 includes the same number of the light emitting portions 28 and the light receiving portions 29 as the number of columns of 15 buttons 19 arranged in a matrix.
  • In this variation, too, the arrangement pitch of the light emitting portions 26 in the first direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the first direction displayed in the aerial-image display region R, and the optical axis L5 of the light emitted from the light emitting portion 26 passes through the center of the image of the button 19 displayed in the aerial-image display region R. Moreover, the arrangement pitch of the light emitting portions 28 in the second direction is equal to the arrangement pitch of the images of the plurality of buttons 19 in the second direction displayed in the aerial-image display region R, and the optical axis L6 of the light emitted from the light emitting portion 28 passes through the center of the image of the button 19 displayed in the aerial-image display region R.
  • In this variation, when the user performs the pointing operation to point at the center part of the image of the predetermined numeric button 19 in the input portion 18, the amount of infrared light incident on one of the five light receiving portions 27 fluctuates greatly, and the amount of infrared light incident on one of the three light receiving portions 29 also fluctuates greatly. The control portion 8 recognizes the image of the numeric button 19 pointed at in the pointing operation on the basis of the detection result of the first detection mechanism 24 and the detection result of the second detection mechanism 25. In this variation, too, on the basis of the detection results of the first detection mechanism 24 and the second detection mechanism 25, the control portion 8 can be made to recognize that the image of the predetermined button 19 was pointed at by the user's fingertip only when the user's fingertip reliably pointed at the image of the intended button 19, and thus PIN input errors can be suppressed.
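In this reflective variation, no beam is shielded; instead the fingertip causes a large change in the light amount at one receiver per axis. A minimal sketch, assuming a simple baseline comparison with an illustrative threshold (the value and function name are not from the patent):

```python
THRESHOLD = 0.3  # assumed relative fluctuation that counts as a detection


def triggered(baseline: list, current: list) -> list:
    """Flag each receiver whose light amount fluctuated strongly relative
    to its baseline reading (per-axis; run once for rows, once for columns).
    The resulting flag lists can then be decoded exactly as in the
    transmissive case."""
    return [abs(c - b) / b > THRESHOLD for b, c in zip(baseline, current)]
```

A production implementation would likely track a slowly adapting baseline (ambient light drifts) instead of a fixed reference reading.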
  • Other Embodiments
  • The embodiment described above is an example of a preferred embodiment of the present invention, but it is not limiting, and various modifications can be made within a range not changing the gist of the present invention.
  • In Embodiment 1, a part shifted from the center in the display region DR may be the recognition region KR. Moreover, in Embodiment 1, the recognition region KR is a circular region, but the recognition region KR may be a non-circular region such as an oval-shaped region or a polygonal region.
  • In Embodiment 2, the optical axis L5 of the light emitted from the light emitting portion 26 may pass through a position shifted from the center of the image of the button 19 displayed in the aerial-image display region R, and the optical axis L6 of the light emitted from the light emitting portion 28 may pass through a position shifted from the center of the image of the button 19 displayed in the aerial-image display region R. In Embodiment 2, the first direction and the second direction are orthogonal to each other in the plane containing the aerial image, but the first direction and the second direction do not have to be orthogonal. Moreover, in Embodiment 2, the first detection mechanism 24 and the second detection mechanism 25 may be transmissive sensor arrays.
  • In the embodiments described above, the image of the keypad displayed in the aerial-image display region R may be configured by images of 12 buttons 19 arranged in 4 rows and 3 columns, for example. Moreover, in the embodiments described above, information other than the PIN may be input in the input device 1. In this case, the aerial image displayed in the aerial-image display region R may be an image other than the keypad. Even in this case, the aerial image contains an image of a button for identifying the information input in the input portion 18. The above description relates to specific examples according to the present invention, and various modifications are possible without departing from the spirit of the present invention. The appended claims are intended to cover such applications within the true scope and spirit of the invention.
  • DESCRIPTION OF REFERENCE NUMERALS
      • 1 Input device
      • 4 Detection mechanism
      • 8 Control portion
      • 11 Display mechanism
      • 11 a Display surface
      • 12 Aerial-image forming mechanism
      • 18 Input portion
      • 19 Button
      • 24 First detection mechanism
      • 25 Second detection mechanism
      • 26, 28 Light emitting portion
      • 27, 29 Light receiving portion
      • DR Display region
      • KR Recognition region
      • L5, L6 Optical axis of light emitted from light emitting portion
      • R Aerial-image display region
      • V First direction
      • W Second direction

Claims (10)

1. An input device inputting information by using a user's fingertip, the input device comprising:
a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form an image as an aerial image, an optical detection mechanism detecting a position of the user's fingertip in an aerial-image display region, which is a region in which the aerial image is displayed, and a control portion for controlling the input device, wherein
the aerial-image display region is an input portion for inputting information;
the aerial image includes images of a plurality of buttons for identifying information input in the input portion; and
assuming that a recognition region is a region, within a display region for the images of the buttons displayed in the aerial-image display region, in which the control portion recognizes, based on a detection result of the optical detection mechanism, that the image of a predetermined button was pointed at by the user, the recognition region is narrower than the display region.
2. The input device according to claim 1, wherein a center part of the display region is the recognition region.
3. The input device according to claim 1, wherein the optical detection mechanism is a reflective sensor array.
4. An input device inputting information by using a user's fingertip, comprising:
a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form an image as an aerial image; and an optical detection mechanism detecting a position of the user's fingertip in an aerial-image display region, which is a region in which the aerial image is displayed, wherein
the aerial-image display region is an input portion for inputting information;
the aerial image includes images of a plurality of buttons for identifying information input in the input portion; and
by assuming that directions crossing each other in a plane including the aerial image are a first direction and a second direction,
the optical detection mechanism includes a first detection mechanism detecting a position in the first direction of the user's fingertip and a second detection mechanism detecting a position in the second direction of the user's fingertip, when each of the images of the plurality of buttons was pointed by the user's fingertip.
5. The input device according to claim 4, wherein
the first detection mechanism and the second detection mechanism are transmissive sensors having a plurality of light emitting portions and a plurality of light receiving portions.
6. The input device according to claim 5, wherein
an optical axis of light emitted from the light emitting portion passes through a center of an image of the button displayed in the aerial-image display region.
7. The input device according to claim 6, wherein
the first direction and the second direction are orthogonal to each other; and
the images of the plurality of buttons displayed in the aerial-image display region are arranged in a matrix.
8. The input device according to claim 1, wherein
the aerial image is an image of a keypad including images of a plurality of numeric buttons.
9. A control method of an input device,
the input device including: a display mechanism having a display surface which displays an image, an aerial-image forming mechanism which projects the image displayed on the display surface into a space to form an image as an aerial image, and an optical detection mechanism detecting a position of a user's fingertip in an aerial-image display region, which is a region in which the aerial image is displayed, the aerial-image display region serving as an input portion for inputting information by using the user's fingertip, and the aerial image including images of a plurality of buttons for identifying information input in the input portion, wherein
when the user points at a recognition region, which is in a display region of an image of the button displayed in the aerial-image display region and is narrower than the display region, it is recognized that the image of the button was pointed based on a detection result of the optical detection mechanism.
10. The input device according to claim 4, wherein
the aerial image is an image of a keypad including images of a plurality of numeric buttons.
US18/016,925 2020-07-22 2021-05-31 Input device and input device control method Pending US20240036678A1 (en)

CN109496313B (en) Fingerprint identification device and electronic equipment
CN107077003B (en) Optical device
CN108133174B (en) Optical image sensor and flat panel display having the same embedded therein
EP3467703A1 (en) Display device and fingerprint identification method thereof
EP3731135B1 (en) Method and device for fingerprint recognition, and terminal device
US20180348960A1 (en) Input device
US11048900B2 (en) Image reading device and method
JPH11506237A (en) Light pen input system
CN107111383B (en) Non-contact input device and method
CN110488519B (en) Liquid crystal display device, electronic apparatus, and control method of electronic apparatus
US9063616B2 (en) Optical touch device with symmetric light sources and locating method thereof
US20060164387A1 (en) Input apparatus and touch-reading character/symbol input method
US20230244347A1 (en) Display operation section and device
US10371349B2 (en) Optical device, optical system, and ticket gate
CN111597912A (en) Display module and electronic equipment with same
JP2015191433A (en) Information input device and vending machine
CN110770748B (en) Optical fingerprint identification device, electronic equipment and fingerprint identification method
CN110785770A (en) Fingerprint identification method and device and electronic equipment
CN115390269A (en) Display device
KR20170021665A (en) Display apparatus with optical touch screen function
JP2023082791A (en) Display operation unit and device
WO2018143313A1 (en) Wearable electronic device
JP2009301250A (en) Device, method and recording medium for controlling multiple pointed positions on input device
US20230316799A1 (en) Input device

Legal Events

Date Code Title Description
AS Assignment

Owner name: NIDEC SANKYO CORPORATION, JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:KUBOTA, KYOSUKE;OGUCHI, YOSUKE;REEL/FRAME:062423/0053

Effective date: 20230113

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED