WO2021260989A1 - Aerial image display input device and aerial image display input method - Google Patents


Info

Publication number
WO2021260989A1
WO2021260989A1 (PCT/JP2021/003495, JP2021003495W)
Authority
WO
WIPO (PCT)
Prior art keywords
input
image
user
detection sensor
operation button
Prior art date
Application number
PCT/JP2021/003495
Other languages
French (fr)
Japanese (ja)
Inventor
誠 飯田
利一 加藤
靖久 吉田
寛之 斎藤
真也 松井
雄希 前田
公 松島
Original Assignee
日立チャネルソリューションズ株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2020171039A external-priority patent/JP2022007868A/en
Application filed by 日立チャネルソリューションズ株式会社
Publication of WO2021260989A1 publication Critical patent/WO2021260989A1/en


Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/56Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0484Interaction techniques based on graphical user interfaces [GUI] for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0487Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory

Definitions

  • The present invention relates to an aerial image display input device and an aerial image display input method, and is suitably applied to an aerial image display input device and an aerial image display input method for inputting a user's operation on an image displayed in the air (an aerial image).
  • Patent Document 1 discloses an aerial image display device that displays an image in the air by arranging an image display device, a half mirror, and a retroreflective material.
  • Patent Document 2 discloses a gesture operation device in which a display object installed inside is displayed in an external three-dimensional space by an image coupling plate, and a hand-movement gesture operation is recognized from the information of a camera and a distance sensor that capture the user's hand.
  • Patent Document 3 discloses a non-contact operation detecting device including an aerial image display device (an image light source, a retroreflective member, and an optical branching member) and a detecting means for detecting whether or not an object is present at a point of interest.
  • It is shown that the detecting means includes a detection light source, provided in the image light source, that emits infrared light, and a photodiode that detects the infrared light returning from the point of interest.
  • Patent Document 4 discloses a proposal to arrange a built-in input detection sensor in an input/output device provided with an image forming mechanism unit having a display unit for displaying an image to be formed in the air, and a proposal to install a detection device for detecting the operator's face or line of sight.
  • In these prior arts, a control unit having high processing capacity is required in order to perform image recognition processing for recognizing a user's gesture as a predetermined input operation, which posed issues in terms of time and product price.
  • The present invention has been made in consideration of the above points, and proposes an aerial image display input device and an aerial image display input method that realize, with a simple device configuration, a highly convenient input device that the user can operate intuitively on an aerial image.
  • The present invention provides the following aerial image display input devices [1] to [14] and aerial image display input methods using these aerial image display input devices.
  • [1] It has a rectangular display unit that displays a predetermined image inside, and a rectangular image transmission plate that projects the image displayed on the display unit onto a three-dimensional space projection surface that can be seen by the user.
  • An aerial image projection unit that forms an image of the image displayed on the display unit in the air as a user's input guidance screen.
  • An input detection sensor that detects an aerial operation by the user on the input guidance screen.
  • A control unit that performs predetermined control. In the aerial image display input device having the above, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight from the vicinity of one side of the rectangle of the image transmission plate, at a predetermined angle with the image transmission plate.
  • the input detection sensor is arranged on a straight line parallel to the other side, which is located in the vicinity of the other side of the rectangle of the image transmission plate facing the other side.
  • a surface formed by a line segment connecting the vicinity of the midpoint of a set of sides not parallel to the image transmission plate among the sides of the rectangular projection surface and the arrangement straight line of the input detection sensor is arranged as an input detection area.
  • the display unit that displays the predetermined image displays an input operation button in the input guidance screen in the vicinity of the line segment connecting the midpoints of the rectangular projection surface.
  • the input detection sensor detects the presence or absence of an object in the area of the input operation button.
  • the control unit determines that the input operation button is pressed based on the object-presence detection information from the input detection sensor, and changes the display of the input operation button on the display unit so as to indicate that it is in the pressed state.
  • An aerial image display input device characterized by the above.
  • [2] The input detection sensor is a reflection type distance measurement sensor, and two or more (N) sensors are linearly arranged.
  • The display unit displays N or fewer input operation buttons.
  • The aerial image display input device according to the above [1], wherein the control unit determines whether or not the input operation button is pressed based on a change in the output signals of the N reflection type distance measurement sensors.
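The press determination of the reflective distance sensors described above can be sketched as follows. This is a minimal illustration only: the function name, the `None` encoding for "no reflection", and the range threshold are assumptions for this sketch, not details from the patent.

```python
# Hypothetical sketch: N reflective distance sensors, each aligned with one
# of N (or fewer) input operation buttons. A button is judged "pressed"
# when its sensor's output changes from no-object to a reading inside the
# detection range. Threshold and encoding are illustrative.

NO_OBJECT = None  # a sensor returns None when nothing reflects its beam

def pressed_buttons(prev_readings, curr_readings, max_range_mm=150):
    """Return indices of buttons newly pressed between two sensor scans.

    prev_readings / curr_readings: lists of N distance readings (mm),
    or NO_OBJECT, one entry per reflective distance sensor.
    """
    pressed = []
    for i, (prev, curr) in enumerate(zip(prev_readings, curr_readings)):
        was_clear = prev is NO_OBJECT or prev > max_range_mm
        now_hit = curr is not NO_OBJECT and curr <= max_range_mm
        if was_clear and now_hit:  # change in the output signal -> press
            pressed.append(i)
    return pressed
```

The decision uses only a per-sensor change of output signal, which is the point of the claim: no camera-based gesture recognition is needed.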
  • [3] As the aerial operation for requesting a change in the input contents that can be input on the input guidance screen, a movement operation for moving an object in a predetermined operation direction is provided.
  • One or more of the N or fewer input operation buttons displayed by the display unit are display change request buttons indicating the operation direction of the movement operation.
  • The aerial image display input device according to the above [2], wherein the control unit determines the pressing of the display change request button based on change information of the output signal of the sensor that detects the display change request button and the output signal of the sensor that detects the portion adjacent to the display change request button.
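The display-change-request determination above combines two sensors: the one covering the button and the one covering the adjacent portion. One hedged way to read "change information" is firing order, i.e. the hand enters the button area and then the adjacent area in the indicated direction; the event encoding below is invented for illustration.

```python
# Hedged sketch: confirm a display-change (e.g. scroll) request only when
# the sensor covering the button fires and the sensor covering the
# adjacent portion fires afterwards, matching the operation direction.

def is_display_change_request(events, button_idx, adjacent_idx):
    """events: chronological list of sensor indices that detected an object.

    Returns True when the button sensor fires and the adjacent sensor
    fires at some later time (the hand moved in the operation direction).
    """
    try:
        t_button = events.index(button_idx)
    except ValueError:
        return False  # the button sensor never fired
    return adjacent_idx in events[t_button + 1:]
```

Requiring both signals filters out a plain press on the button from a directional movement over it.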
  • [4] The aerial image projection unit and the input detection sensor have an integral structure, the image transmission plate is substantially vertical, and the three-dimensional space projection surface forms a surface inclined toward the user's line of sight with respect to the horizontal plane.
  • the aerial image display input device according to any one of the above [1] to [3], wherein the integrated structure has a variable angle.
  • [5] The aerial image display input device is connected to a handling processing device that performs medium handling or predetermined work with the user, and the control unit controls the display on the display unit based on information from the handling processing device.
  • [6] It has a rectangular display unit that displays a predetermined image inside, and a rectangular image transmission plate that projects the image displayed on the display unit onto a three-dimensional space projection surface that can be seen by the user.
  • An aerial image projection unit that forms an image of the image displayed on the display unit in the air as a user's input guidance screen.
  • An input detection sensor that detects an aerial operation by the user on the input guidance screen.
  • A control unit that performs predetermined control. In the aerial image display input device having the above, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight from the vicinity of one side of the rectangle of the image transmission plate, at a predetermined angle with the image transmission plate.
  • The input detection sensor is arranged on two or more (M) straight lines parallel to the other side, located in the vicinity of the other side of the rectangle of the image transmission plate.
  • The display unit that displays the predetermined image displays an input operation button in the input guidance screen in the vicinity of a line segment connecting the (M+1) equal-division points on the rectangular projection surface.
  • the input detection sensor detects the presence or absence of an object in the area of the input operation button.
  • the control unit determines that the input operation button is pressed based on the object-presence detection information from the input detection sensor, and changes the display of the input operation button on the display unit so as to indicate that it is in the pressed state.
  • An aerial image display input device characterized by the above.
  • [7] It has a rectangular display unit that displays a predetermined image inside, and a rectangular image transmission plate that projects the image displayed on the display unit onto a three-dimensional space projection surface that can be seen by the user.
  • An aerial image projection unit that forms an image of the image displayed on the display unit in the air as a user's input guidance screen.
  • An input detection sensor that detects an aerial operation by the user on the input guidance screen.
  • A control unit that performs predetermined control. In the aerial image display input device having the above, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight from the vicinity of one side of the rectangle of the image transmission plate, at an angle of about 75 degrees with the image transmission plate.
  • The display unit that displays the predetermined image displays an input operation button in the input guidance screen.
  • The input detection sensor detects that an object moves in a substantially vertical direction within the area of the input operation button displayed on the three-dimensional space projection surface.
  • the control unit determines that the input operation button is pressed based on the object-movement detection information from the input detection sensor, and changes the display of the input operation button on the display unit so as to indicate that it is in the pressed state.
  • An aerial image display input device characterized by the above.
  • [8] It has a rectangular display unit that displays a predetermined image inside, and a rectangular image transmission plate that projects the image displayed on the display unit onto a three-dimensional space projection surface that can be seen by the user.
  • An aerial image projection unit that forms an image of the image displayed on the display unit in the air as a user's input guidance screen.
  • An input detection sensor that detects an aerial operation by the user on the input guidance screen.
  • A control unit that performs predetermined control. In the aerial image display input device having the above, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight from the vicinity of one side of the rectangle of the image transmission plate, at a predetermined angle with the image transmission plate.
  • The display unit that displays the predetermined image displays the input operation button in the input guidance screen as a stereoscopic perspective image having a thickness-direction image on the side opposite to the one side of the rectangle of the image transmission plate.
  • The input detection sensor detects that an object moves in a substantially vertical direction within the area of the input operation button displayed on the three-dimensional space projection surface.
  • the control unit determines that the input operation button is pressed based on the object-movement detection information from the input detection sensor, and when it determines that the input operation button is pressed, the display unit displays the input operation button as a flat image having no thickness-direction image, at a position moved by the length of the thickness-direction image of the button in its unpressed state.
  • An aerial image display input device characterized by indicating that the input operation button is in the pressed state.
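The visual feedback above (a 3-D-looking button redrawn flat and shifted by its own thickness on press) can be sketched as below. The `ButtonImage` structure and field names are invented for this sketch; the patent does not specify a data model.

```python
# Illustrative sketch: the button is drawn as a perspective image with an
# apparent thickness d; on press it is redrawn flat and shifted by d, so
# its top face appears to sink to the base plane. Field names are assumed.

from dataclasses import dataclass

@dataclass
class ButtonImage:
    x: float          # position along the projection surface
    y: float
    thickness: float  # apparent thickness of the stereoscopic image
    pressed: bool = False

def press(button):
    """Return the flat (zero-thickness) image shifted by the old thickness."""
    return ButtonImage(x=button.x,
                       y=button.y + button.thickness,  # moved by the thickness
                       thickness=0.0,
                       pressed=True)
```

Because the aerial image gives no tactile feedback, this displacement is what tells the user the press registered.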
  • [9] It has a rectangular display unit that displays a predetermined image inside, and a rectangular image transmission plate that projects the image displayed on the display unit onto a three-dimensional space projection surface that can be seen by the user.
  • An aerial image projection unit that forms an image of the image displayed on the display unit in the air as a user's input guidance screen.
  • An input detection sensor that detects an aerial operation by the user on the input guidance screen.
  • A control unit that performs predetermined control. In the aerial image display input device having the above, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight from the vicinity of one side of the rectangle of the image transmission plate, at a predetermined angle with the image transmission plate.
  • The input detection sensor has a plurality of sets of infrared light emitting elements and light receiving elements, and is a reflected light distance sensor that calculates the distance to the object by the principle of triangulation, from the light receiving direction when the light receiving element receives the light emitted by the infrared light emitting element and reflected by the object; the elements are arranged on two straight lines located near the other side of the rectangle of the image transmission plate and parallel to that side.
  • The plurality of infrared light emitting elements are arranged on the straight line on the side close to the image transmission plate, the plurality of light receiving elements are arranged on the straight line on the side far from the image transmission plate, and the straight line connecting each set of infrared light emitting element and light receiving element is arranged so as to intersect the two parallel straight lines at right angles.
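The triangulation principle mentioned above can be illustrated with a deliberately simplified 2-D geometry: the emitter fires perpendicular to the emitter-receiver baseline, and the receiver reports the angle at which the reflected spot returns. This model and the function name are assumptions for the sketch, not the patent's actual optics.

```python
# Hedged triangulation sketch: emitter at the origin fires perpendicular
# to the baseline; the receiver sits baseline_mm away along the baseline
# and sees the reflected spot at receive_angle_rad from the baseline.
# With the object on the emitter's beam, distance = baseline * tan(angle).

import math

def triangulated_distance(baseline_mm, receive_angle_rad):
    """Distance from the sensor line to the reflecting object (mm)."""
    return baseline_mm * math.tan(receive_angle_rad)
```

A nearer object returns at a shallower angle, so the receive angle alone encodes the distance; this is why no high-capacity image-recognition processor is needed.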
  • The display unit that displays the predetermined image displays an input operation button in the input guidance screen.
  • the input detection sensor detects the presence or absence of an object in the area of the input operation button.
  • the control unit determines that the input operation button is pressed based on the object-presence detection information from the input detection sensor, and changes the display of the input operation button on the display unit so as to indicate that it is in the pressed state.
  • An aerial image display input device characterized by the above.
  • [10] It has a rectangular display unit that displays a predetermined image inside, and a rectangular image transmission plate that projects the image displayed on the display unit onto a three-dimensional space projection surface that can be seen by the user. An aerial image projection unit forms the image displayed on the display unit in the air as the user's input guidance screen. An input detection sensor detects an aerial operation by the user on the input guidance screen.
  • A control unit that performs predetermined control. In the aerial image display input device having the above, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight from the vicinity of one side of the rectangle of the image transmission plate, at a predetermined angle with the image transmission plate.
  • The input detection sensor includes a projection surface touch sensor that detects, for the position of the object associated with the user's aerial operation on the same plane as the three-dimensional space projection surface, at least the Y-direction position, where the X direction is the direction of one side of the rectangle of the image transmission plate and the Y direction is perpendicular to it.
  • It also includes a plurality of distance sensors, arranged on a straight line parallel to the other side and located in the vicinity of the other side of the rectangle of the image transmission plate, that calculate the distance to the object.
  • The display unit that displays the predetermined image displays an input operation button in the input guidance screen.
  • The input detection sensor detects the presence or absence of an object in the area of the input operation button with the projection surface touch sensor, and detects the amount of movement of the object in the pushing direction against the area of the input operation button with the plurality of reflected light distance sensors.
  • The control unit determines that the input operation button is pressed based on the detection information of the presence or absence of an object and the detection information of the amount of movement of the object in the pushing direction detected by the input detection sensor.
  • An aerial image display input device characterized in that, when it is determined that the button has been pressed, the input operation button displayed by the display unit is changed to indicate that the button is in the pressed state.
  • [11] The aerial image display input device according to any one of the above.
  • [12] As the aerial operation for requesting a change in the input contents that can be input on the input guidance screen, a movement operation for moving an object in a predetermined operation direction is provided. One or more of the N or fewer input operation buttons displayed by the display unit are display change request buttons indicating the operation direction of the movement operation.
  • The control unit determines the pressing of the display change request button based on change information of the output signal of the sensor that detects the display change request button and the output signal of the sensor that detects the portion adjacent to the display change request button.
  • The aerial image display input device according to any one of the above [7] to [11]. [13]
  • the aerial image projection unit and the input detection sensor have an integral structure, the image transmission plate is substantially vertical, and the three-dimensional space projection surface forms a surface inclined toward the user's line of sight with respect to the horizontal plane.
  • the aerial image display input device according to any one of [7] to [12] above, wherein the integrated structure has a variable angle.
  • [14] The aerial image display input device is connected to a handling processing device that performs medium handling or predetermined work with the user, and the control unit controls the display on the display unit based on information from the handling processing device.
  • An external perspective view of the aerial image display input device 1A according to Example 2 of the present invention. A figure showing a display example of the input operation button 30 in Example 3.
  • An external perspective view of the aerial image display input device 1B according to Example 4.
  • A side view of the aerial image display input device 1B.
  • A diagram (No. 2) showing an example of a display image on the three-dimensional space projection surface 10 by the aerial image display input device 1C.
  • A figure showing the relationship between the display image shown in FIG. 16B and the arrangement of the reflected light distance sensors of the aerial image display input device 1C. An external perspective view of the aerial image display input device 1D according to Example 6. A side view of the aerial image display input device 1D.
  • FIG. 1 is an external perspective view of the aerial image display input device 1 according to the first embodiment (Example 1) of the present invention.
  • FIG. 2 is a side view of the aerial image display input device 1.
  • FIG. 2 shows a cross section of the aerial image display input device 1 seen from the side, so that the internal structure can be easily understood.
  • FIG. 3 is a block diagram showing a configuration example of the aerial image display input device 1.
  • The aerial image display input device 1 is configured to include an aerial image projection unit 2, an input detection sensor 5, and a control unit 7, which are mounted in a housing 11 on a housing pedestal 12.
  • FIG. 1 schematically shows an input operation button 30 and a user-operated hand 80 on the three-dimensional space projection surface 10.
  • The three-dimensional space projection surface 10 is a projection surface in three-dimensional space on which an aerial image (the input operation button 30) is displayed by the aerial image display input device 1, and the hand 80 is an example of a part of the user 8 who uses the aerial image display input device 1.
  • The user 8 can perform an input operation on the aerial image display input device 1 by placing the hand 80 at the position of the input operation button 30 projected on the three-dimensional space projection surface 10.
  • the aerial image display input device 1 includes an aerial image projection unit 2, an input detection sensor 5, and a control unit 7.
  • the aerial image projection unit 2 has a display unit 3 and an image transmission plate 4 (light branch member 40, retroreflection member 41).
  • the input detection sensor 5 has reflected light distance sensors 51, 52, and 53, which are reflection type distance measurement sensors.
  • the control unit 7 has a built-in speaker 74, and has an input determination processing unit 71, an input / output I / F processing unit 72, and a screen control processing unit 73.
  • the aerial image display input device 1 can be connected to a handling processing device 9 which is a separate device, and the handling processing device 9 includes a control unit 91, a display unit 92, and a processing unit 93.
  • the configurations of the aerial image display input device 1 will be described in detail below.
  • The aerial image projection unit 2 is a unit that transmits the image of the display unit 3, such as a liquid crystal display, through the image transmission plate 4 by internal light reflection or transmission to form an image on the three-dimensional space projection surface 10.
  • A high-brightness liquid crystal display unit 3 is arranged at the top, and an optical branching member 40 such as a half mirror is provided as the image transmission plate 4, arranged at an angle with respect to the display unit 3; further, a retroreflective member 41 that retroreflects the light reflected by the optical branching member 40 back in the same direction is arranged.
  • the reflected light passes through the optical branching member 40 and is imaged on the three-dimensional space projection surface 10 to form an aerial image.
  • the display unit 3 and the aerial image of the three-dimensional space projection surface 10 are line-symmetrical with respect to the image transmission plate 4.
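The line symmetry noted above (the aerial image is the mirror of the display unit 3 with respect to the image transmission plate 4) can be illustrated with a small 2-D reflection. Modeling the plate as a line through the origin at angle `phi` is an assumption of this sketch, not the patent's coordinate system.

```python
# Minimal sketch of line symmetry: reflect a point (x, y) across a line
# through the origin at angle phi. The display pixel and its aerial image
# are such mirror pairs with respect to the image transmission plate.

import math

def reflect_across_plate(x, y, phi):
    """Reflect point (x, y) across a line through the origin at angle phi."""
    c, s = math.cos(2 * phi), math.sin(2 * phi)
    return (c * x + s * y, s * x - c * y)
```

For example, with the plate along the x-axis (`phi = 0`) a point simply flips its y-coordinate, which is the familiar mirror-image relation.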
  • the aerial image projection technique having the above configuration is widely disclosed in the above-mentioned prior art documents and the like, and a configuration other than the present embodiment may be adopted.
  • In the present embodiment, an optical branching member 40 such as a half mirror is adopted as the image transmission plate 4 to form an aerial image in combination with the retroreflective member 41; as another embodiment, an image transmission plate 4 that does not use the optical branching member 40 has also been devised, and the formation of an aerial image can be realized by such an embodiment as well as by the present embodiment.
  • A small 5-inch high-brightness liquid crystal is arranged in the display unit 3, and the optical branching member 40 has a rectangular shape forming a vertical surface as shown in FIG. 1.
  • A rectangular display area of substantially the same size as the screen of the 5-inch liquid crystal of the display unit 3 is formed on the three-dimensional space projection surface 10, which is arranged so that the angle θ1 between the optical branching member 40 and the three-dimensional space projection surface 10 (see FIG. 2) is about 75 degrees.
  • The rectangle of the display area is also written as rectangle A1-A2-A3-A4, using its vertices A1, A2, A3, and A4 (see FIG. 1); the same notation is used for the other rectangles described later.
  • the input detection sensor 5 includes three reflected light distance sensors 51, 52, and 53.
  • each of the reflected light distance sensors 51, 52, and 53 is a distance sensor that has a light emitting element (for example, an infrared light emitting element) and a light receiving element, and can detect the distance to an object in the optical axis direction of those elements.
  • the three reflected light distance sensors 51, 52, and 53 are arranged in a straight line (on line segment B1-B2 in FIG. 1).
  • further, as shown in FIG. 2, the reflected light distance sensors 51, 52, and 53 are arranged near the lower end of the optical branching member 40 so that the input detection surface 61 including the input detection direction 6 determined by the optical axes of the light emitting and light receiving elements, that is, the rectangle B1-B2-B3-B4 shown in FIG. 1, intersects the three-dimensional space projection surface 10.
  • the input detection direction 6 (input detection surface 61) and the optical branching member 40 are arranged at an angle θ2.
  • the line segment B3-B4, the line of intersection between the input detection surface 61 and the three-dimensional space projection surface 10, almost coincides with the line segment connecting the midpoints of sides A1-A4 and A2-A3 of the three-dimensional space projection surface 10.
  • the angle θ2 formed by the input detection surface 61 and the optical branching member 40 is set to about 10 degrees. Further, since the reflected light distance sensors 51, 52, and 53 can acquire distance information when an object exists in the direction faced by each pair of light emitting and light receiving elements (see the solid arrows in FIG. 1), when the user's hand 80 is in the arrowed region near the three-dimensional space projection surface 10, the distance to the hand 80 is acquired and the movement of the hand 80 is detected as an input.
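The detection principle above can be sketched as follows. The function name, sensor order, and range value are illustrative assumptions, not part of the embodiment; each sensor is modeled as reporting the distance to the nearest object along its optical axis, or nothing when no object reflects within range.

```python
# Illustrative sketch: each reflected light distance sensor reports the
# distance (mm) to the nearest object on its optical axis, or None when
# nothing is within range. Values and names are assumptions.

DETECTION_RANGE_MM = 300  # hypothetical maximum sensing distance

def detect_hand(sensor_outputs):
    """Return the indices of sensors that currently see an object.

    sensor_outputs: distances in mm (or None) for sensors 51, 52, 53.
    """
    return [i for i, d in enumerate(sensor_outputs)
            if d is not None and d <= DETECTION_RANGE_MM]
```

For example, a hand held only over the leftmost sensor 51 would yield a single detected index.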
  • the control unit 7 has an input determination processing unit 71, an input / output I / F processing unit 72, and a screen control processing unit 73.
  • the input determination processing unit 71 is connected to the input detection sensor 5 including the reflected light distance sensors 51, 52, 53 and to the built-in speaker 74, and performs input determination processing that processes user operation input to the input operation buttons 30.
  • the input / output I / F processing unit 72 is connected to a handling processing device 9, a separate device that performs medium processing or predetermined work with the user, and transmits and receives screen information and input information.
  • the screen control processing unit 73 connects to the display unit 3 and performs screen control processing for controlling the screen display on the display unit 3.
  • the handling processing device 9 is a device that receives a user operation (input operation) via the aerial image display input device 1 and executes the handling desired by the user; as shown in its block diagram, it includes a control unit 91, a display unit 92, and a processing unit 93.
  • the control unit 91 controls the display unit 92 having a touch panel and controls the processing unit 93 that handles a plurality of cases.
  • the operation of the user 8 with respect to the aerial image display input device 1 and the internal processing of the aerial image display input device 1 will be described below.
  • when two types of handling (handling A and handling B) are prepared, the user 8 starts the operation by selecting, for example, one of handling A and handling B.
  • the control unit 91 of the handling processing device 9 displays the selection screen for handling A and handling B, and transmits a request for the selection information to the control unit 7.
  • the screen control processing unit 73 displays two input operation buttons indicating handling A and handling B on the display unit 3.
  • the aerial image projection unit 2 displays on the three-dimensional space projection surface 10 two input operation buttons 30 indicating selection of handling A and handling B.
  • the user 8 recognizes the input operation buttons 30 displayed on the three-dimensional space projection surface 10, holds the hand 80 over the upper surface of the input operation button 30 indicating the desired one of handling A and handling B (here, handling A), and performs an operation of pressing the button (button pressing operation).
  • among the reflected light distance sensors 51, 52, and 53, the sensor covering the selected input operation button 30 (here, the reflected light distance sensor 51) detects this button pressing operation by the user's hand 80.
  • the control unit 7 determines that the button of the handling A is pressed by the input determination processing unit 71 detecting the movement in the input detection direction 6.
  • in the control unit 7, the screen control processing unit 73 controls the screen display of the display unit 3, switching to a screen in which the color and shape of the button display are changed in order to notify the user that the displayed button has been pressed, while a predetermined sound is output from the speaker 74. Further, the input / output I / F processing unit 72 transmits the selection information of handling A to the handling processing device 9.
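The feedback sequence described here (change the button display, play a sound, send the selection) can be sketched as a single handler. Every interface used below is a hypothetical stand-in for illustration, not the actual structure of the control unit 7.

```python
# Sketch of the press-feedback sequence in the control unit 7.
# All interfaces are illustrative stand-ins.

def on_button_pressed(button, screen, speaker, interface):
    screen.highlight(button)          # change the color/shape of the button
    speaker.play("press_tone")        # audible confirmation (speaker 74)
    interface.send_selection(button)  # notify the handling processing device 9

class Recorder:
    """Minimal test double recording the calls made above."""
    def __init__(self):
        self.calls = []
    def highlight(self, b):
        self.calls.append(("highlight", b))
    def play(self, s):
        self.calls.append(("play", s))
    def send_selection(self, b):
        self.calls.append(("send", b))
```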
  • in this way, the user 8 inputs the handling selection to the handling processing device 9 touchlessly, without touching the display unit 92 (touch panel) provided on the handling processing device 9. Therefore, the handling processing device 9 and the aerial image display input device 1 can be kept hygienic. Further, even a user for whom touch panel operation is difficult, for example a user wearing gloves or a user with a handicap, can operate easily and touchlessly, so a highly convenient input device can be provided.
  • the aerial image display input device 1 has the following effects in addition to the above-mentioned effects due to the general touchless operation.
  • first, the three-dimensional space projection surface 10 that forms the aerial image is arranged at the angle θ1 (about 75 degrees) with respect to the optical branching member 40, so that, as shown in FIG. 2, it faces the downward line of sight of the user 8; it is therefore not easily affected by intruding or reflected light from outside, and an aerial image with good visibility is obtained.
  • in addition, since the palm (hand 80) and the three-dimensional space projection surface 10 are close to parallel, the user can easily hold the hand 80 over the position of the input operation button 30 and can naturally lower the palm in the vertical direction, further improving operability.
  • further, the input detection direction 6 forms an angle θ3 of about 90 degrees with the substantially horizontal orientation in which the user 8 naturally holds the palm, and the input detection surface 61 lies parallel to the direction in which the user 8 moves the palm (hand 80) to press a button, so the detection sensitivity (accuracy) of the "push" operation by the user 8 can be improved.
  • in the present embodiment, the reflected light distance sensors 51, 52, and 53 are used in the input detection sensor 5 to detect the movement of the palm of the user 8, but a device with equally good operability can be obtained even if another sensor means such as a camera is used to detect the palm movement; in that case, however, the arrangement and detection method of the sensor means differ from those for the reflected light distance sensors 51, 52, 53.
  • moreover, since the color and shape of the button display are changed and a sound is output, the user 8 can recognize visually or audibly, rather than tactilely, that the operation has been recognized by the device.
  • as described above, the aerial image display input device 1 allows the user to operate the aerial image intuitively, is highly convenient, and provides an input device with good reaction (operation sensitivity) in a simple device configuration.
  • further, in the present embodiment, a relatively small 5-inch liquid crystal display is arranged in the display unit 3, and the two input operation buttons 30 are displayed on the horizontal line segment B3-B4 on the three-dimensional space projection surface 10; this setting enhances the user's feeling of operation. Specifically, if the three-dimensional space projection surface 10 is too large, the aerial image is too far from the optical branching member 40 in the background, and some people find it difficult to grasp the sense of distance of the aerial image. Conversely, if the three-dimensional space projection surface 10 is too small, the input operation buttons 30 may be too close to the optical branching member 40, and the hand 80 may touch the panel of the image transmission plate 4 during the button operation.
  • furthermore, the hand 80 would otherwise have to be moved in a diagonal direction in consideration of the front-rear inclination (angle θ1 in FIG. 2) of the three-dimensional space projection surface 10, which makes the operation difficult.
  • the above configuration of the present embodiment eliminates such a decrease in the user's feeling of operation; as a result, the user 8 obtains good visibility with the optical branching member 40 in the background as a reference, and the hand movement required for the operation can be performed as a small, simple pressing operation without considering depth or inclination. That is, the aerial image display input device 1 according to the present embodiment realizes a touchless operation with high operability.
  • FIGS. 4A to 4D are diagrams illustrating an example of changing the angle of the aerial image display input device 1.
  • FIG. 4A is a side view of the aerial image display input device 1 before the angle change
  • FIG. 4B is a side view of the aerial image display input device 1 when the angle is changed
  • FIG. 4C is a diagram showing a change in the line-of-sight angle of the user 8 due to a change in the angle of the aerial image display input device 1
  • FIG. 4D is a diagram showing the positional relationship between the aerial image display input device 1 and the user 8 when the angle is changed.
  • the aerial image display input device 1 is configured by mounting the housing 11 on the housing pedestal 12, and, as shown in FIG. 4B, its angle can be changed along the arc of the outer shape of the housing 11.
  • the optical branching member 40 can move in a range of an angle θ4 (for example, about 10 degrees) from the vertical direction in the clockwise rotation direction.
  • the aerial image display input device 1 of the present embodiment may also be configured to be movable along the arc of the outer shape of the housing 11 in the direction opposite to that of FIG. 4B (the counterclockwise rotation direction), within a range of a similar angle θ4 (up to about 10 degrees).
  • by making the angle of the housing 11 changeable in this way, visibility and operability can be optimized when it is desired to arrange the aerial image display input device 1 further below or above the line of sight of the user 8 shown in FIG. 2. More specifically, when the aerial image display input device 1 is to be arranged below the state shown in FIG. 2, moving it in the clockwise rotation direction puts visibility and operability in a suitable state; conversely, when it is to be arranged above the state shown in FIG. 2, making it movable in the counterclockwise direction makes visibility and operability suitable. This will be described in detail below with reference to FIGS. 4C and 4D.
  • FIG. 4C is a diagram showing the line-of-sight angle θ5 (θ5A, θ5B) of the user 8 with respect to the three-dimensional space projection surface 10 (10A, 10B) when the arrangement and the angle of the aerial image display input device 1 are changed in combination, and FIG. 4D is a diagram schematically showing the positional relationship between the user 8 and the aerial image display input device 1 in that case.
  • if the aerial image display input device 1 is simply arranged downward without changing its angle, the line-of-sight angle θ5 of the user 8 with respect to the three-dimensional space projection surface 10 changes, making it difficult for the user 8 to see the aerial image.
  • in contrast, when the aerial image display input device 1 is moved in the clockwise rotation direction, the inclination of the three-dimensional space projection surface 10A after the movement changes clockwise from that of the three-dimensional space projection surface 10 before the movement, so the line-of-sight angle θ5A of the user 8 can maintain a value close to the line-of-sight angle θ5 of the initial arrangement (the case of FIG. 2), and the user 8 can easily see the aerial image (see FIG. 4C).
  • however, if the aerial image display input device 1 is simply arranged downward, the distance from the user 8 to the aerial image (three-dimensional space projection surface 10A) becomes greater than in the state of FIG. 2, and the user 8 may find the aerial image inconvenient to operate. Therefore, when the aerial image display input device 1 is arranged downward, it is preferable, from the viewpoint of improving operability, to shift it toward the front side as viewed from the user 8. Based on the above, as shown in FIG. 4D, when the aerial image display input device 1 is to be arranged below the state of FIG. 2, it is moved in the clockwise rotation direction.
  • in this way, the aerial image can be displayed at an easily visible angle at a position where the user 8 can easily operate it at hand, so visibility and operability can be optimized.
  • likewise, when the aerial image display input device 1 of FIG. 2 is to be arranged above the state of FIG. 2, if the angle is not changed, the position of the three-dimensional space projection surface 10 simply moves upward, and the line-of-sight angle θ5 of the user 8 with respect to the three-dimensional space projection surface 10 changes, making it difficult for the user 8 to see the aerial image.
  • in contrast, when the aerial image display input device 1 is moved in the counterclockwise rotation direction, the inclination of the three-dimensional space projection surface 10B after the movement changes counterclockwise from that of the three-dimensional space projection surface 10 before the movement, so the line-of-sight angle θ5B of the user 8 can maintain a value close to the line-of-sight angle θ5 of the initial arrangement (the case of FIG. 2), and the user 8 can easily see the aerial image (see FIG. 4C). Further, if the aerial image display input device 1 is simply arranged upward, the distance from the user 8 to the aerial image (three-dimensional space projection surface 10B) becomes too close compared with the state of FIG. 2.
  • therefore, when the aerial image display input device 1 is arranged upward, it is preferable, from the viewpoint of improving operability, to shift it away from the user 8.
  • based on the above, as shown in FIG. 4D, when the aerial image display input device 1 is to be arranged above the state of FIG. 2, it is moved in the counterclockwise direction. In this way, the aerial image can be displayed at an easily visible angle at a position where the user 8 can easily operate it with the hand 80 extended forward, so visibility and operability can be optimized.
  • FIGS. 5A and 5B are diagrams illustrating an example of the arrangement and operation of the reflected light distance sensors 51, 52, 53.
  • FIG. 5A is an image diagram of an input operation by a pressing operation on the input operation button 30, and
  • FIG. 5B is a graph showing a change in distance information detected by the reflected light distance sensors 51 to 53 during an input operation by the pressing operation.
  • FIG. 5A schematically shows an arrangement example of the two input operation buttons 30 and the three reflected light distance sensors 51, 52, 53, together with an image of the hand 80 of the user 8 performing an input operation by pressing the left input operation button 30 (the "left" button).
  • FIG. 5B displays the change in the distance information in the situation shown in FIG. 5A as a graph, in which the horizontal axis is time and the vertical axis is the distance information (sensor output, distance).
  • the distance information detected by the reflected light distance sensors 51, 52, 53 is referred to as distance information L, C, R in order.
  • the distance information L is represented by a broken line
  • the distance information C is represented by an alternate long and short dash line, and
  • the distance information R is represented by a solid line.
  • the sensor output represents the distance from each of the reflected light distance sensors 51, 52, 53; specifically, the distance corresponding to the sensor output "H2" is the distance to the aerial image (three-dimensional space projection surface 10). Therefore, when the sensor output falls below H2, it can be determined that a pressing operation on the aerial image has been performed.
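This press criterion can be written in one line. The numeric value of H2 below is an assumption for illustration; the text only states that H2 corresponds to the distance to the aerial image plane.

```python
# Minimal sketch of the press criterion: an output below H2 means the
# hand has crossed the aerial image plane. The value of H2 is assumed.

H2 = 150  # hypothetical sensor output (distance) at the aerial image plane

def is_pressed(sensor_output):
    """True when the sensor output has changed to below H2."""
    return sensor_output < H2
```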
  • in the example of FIG. 5B, the distance information L starts changing at time 0.2 seconds and reaches the pressed position (the distance of the aerial image) in about 0.2 seconds thereafter, as detected by the reflected light distance sensor 51.
  • the adjacent distance information C also starts changing at the same time as the distance information L, but does not change as far as the pressed position.
  • the distance information R has not changed at all.
  • based on such changes in the distance information, the input determination processing unit 71 of the control unit 7 performs input determination processing as follows to determine the input information for the input operation buttons 30.
  • FIG. 6 is a flowchart showing an example of the processing procedure of the input determination process.
  • the input determination processing unit 71 executes the processing according to the flowchart shown in FIG. 6, using the determination levels H0, H1, H2, and H3 to determine, from the distance information L, C, and R, the input information for the input operation buttons 30.
  • step S10 is a process for detecting a button, more specifically, for detecting that the hand 80 or the like of the user 8 has started an operation on any of the input operation buttons 30, which are aerial images.
  • step S20 is a process for detecting the pressed state of the button, and more specifically, is a process for detecting that the pressed operation has been performed on the input operation button 30 detected in step S10.
  • the processing outline of step S10 is as follows.
  • the input determination processing unit 71 determines the detection of the distance information L or the distance information R (step S101).
  • when the distance information L is detected in step S101, the start of an operation on the input operation button 30 corresponding to the distance information L (the "left" button) is detected by determining whether the distance information L is a value between H2 and H1 and smaller than the distance information C (steps S102 to S104).
  • similarly, the start of an operation on the input operation button 30 corresponding to the distance information R (the "right" button) is detected by performing the same determination processing on the distance information R as described above for the distance information L (steps S111 to S113).
  • the input determination processing unit 71 then executes the processing of step S20 based on the processing results of step S10 (steps S104 and S113).
  • the processing outline of step S20 is as follows.
  • when the start of an operation on the "left" button is detected, the input determination processing unit 71 detects the pressing operation of the "left" button by using a timer for determining the pressing operation (a 100 ms timer in this example) to determine whether the distance information L remains less than H2 (that is, the distance indicated by the distance information L is closer than the distance to the aerial image) for 100 ms or longer (steps S201 to S203, S205 to S208). When the pressing operation is detected, the input determination processing unit 71 outputs an input determination processing result indicating that the "left" button has been pressed (step S204).
  • similarly, the input determination processing unit 71 detects the pressing operation of the "right" button by performing the same processing on the distance information R as described above for the distance information L (steps S211 to S213, S215 to S218). When the pressing operation is detected, it outputs an input determination processing result indicating that the "right" button has been pressed (step S214).
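The two-stage determination of FIG. 6 can be sketched as follows, assuming concrete threshold values and a fixed sampling interval (both are illustrative; the text only fixes the 100 ms hold time and the ordering H1 > H2).

```python
# Sketch of the two-stage input determination of FIG. 6:
#   step S10: the hand hovers over a button (distance between H2 and H1,
#             and closer to that button's sensor than to the center sensor)
#   step S20: a press is confirmed when the distance stays below H2
#             (the aerial image plane) for at least 100 ms.
# Threshold values and the sampling interval are assumptions.

H1, H2 = 250, 150        # mm; hypothetical judgment levels (H1 > H2)
PRESS_HOLD_MS = 100      # the press must be sustained this long

def detect_start(d_button, d_center):
    """Step S10 criterion for one button's sensor output."""
    return H2 <= d_button <= H1 and d_button < d_center

def detect_press(samples, interval_ms=20):
    """Step S20: samples is the button sensor's distance sequence; a press
    is confirmed when values stay below H2 for PRESS_HOLD_MS or longer."""
    held = 0
    for d in samples:
        held = held + interval_ms if d < H2 else 0
        if held >= PRESS_HOLD_MS:
            return True
    return False
```

With a 20 ms sampling interval, five consecutive samples below H2 satisfy the 100 ms hold; a hand that bounces back above H2 resets the timer.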
  • by arranging the buttons and the reflected light distance sensors 51 to 53 in this way, the pressing operation on each of the two buttons can be detected with a single output from the corresponding distance sensor: the reflected light distance sensor 51 for the left (L) button and the reflected light distance sensor 53 for the right (R) button. Since this detection method is extremely accurate and can be decided by simple logic, the processing can be executed with high sensitivity and at high speed. Further, the distance sensors and the control unit 7 do not need a complicated configuration for detection, and a small device can be realized with a simple configuration.
  • moreover, by monitoring the output of the sensor for the distance information C (reflected light distance sensor 52) and detecting movement of the hand 80 or the like over portions other than the buttons, erroneous detection of operations other than button operations can be prevented.
  • note that the button image displayed on the display unit 3 is a button whose position and size substantially match the detection area of the sensor.
  • the description so far has covered an input operation in which the user 8 selects handling by pressing a button, but the input operations that the aerial image display input device 1 according to this embodiment can handle are not limited to button pressing; for example, when there are a plurality of selectable items such as "Handling A", "Handling B", "Handling C", and "Handling D", an operation that switches the handling displayed on the screen (hereinafter referred to as a "turning" operation) can also be handled.
  • FIGS. 7A and 7B are diagrams illustrating another example of the arrangement and operation of the reflected light distance sensors 51, 52, 53.
  • FIG. 7A is an image diagram of a swipe operation with respect to the input operation button 30, and
  • FIG. 7B is a graph showing changes in distance information detected by the reflected light distance sensors 51 to 53 during the swipe operation.
  • FIG. 7A shows an arrangement example of the three reflected light distance sensors 51, 52, 53. In FIG. 5A, two circular buttons (the "left" button and the "right" button) are displayed as the input operation buttons 30, whereas in FIG. 7A the circular "left" button is displayed on the left side and the arrow-shaped "turn" button is displayed on the right side as the input operation buttons 30.
  • the "turn” button changes the handling displayed on the "left” button in a predetermined order when the user 8 performs a predetermined "turning” operation, so that the handling desired by the user 8 is “turned". It is a button to be able to display on the "left” button. For example, in the case of FIG. 7A, it is assumed that "Handling A" is displayed on the "Left” button. The detailed processing procedure will be described later in FIG. 8, but in this example, the pressing operation for the "turning” button is performed (pressing operation), or the display in the direction of the arrow is followed from the vicinity of the "turning" button to the horizontal direction.
  • FIG. 7B is a graph of the situation shown in FIG. 7A, that is, the change in the distance information detected by the reflected light distance sensors 51, 52, 53 when the user 8 swipes the "turn" button; the horizontal axis is time and the vertical axis is the distance information (sensor output, distance).
  • in the example of FIG. 7B, the distance information L starts changing at time 0.2 seconds, the distance information C starts changing about 0.2 seconds later, and the distance information R changes about a further 0.2 seconds later; each sensor (reflected light distance sensors 51 to 53) acquires this information.
  • based on such changes in the distance information, the input determination processing unit 71 of the control unit 7 determines the input information for the input operation buttons 30 by performing the following input determination processing (referred to as the second input determination processing when distinguishing it from the input determination processing described with reference to FIGS. 5 and 6).
  • in the second input determination processing, four determination levels (threshold values) H0, H1, H2, and H3 are set in the same manner as in the input determination processing described with reference to FIGS. 5 and 6, and the distance information is classified into three states.
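The classification into three states bounded by the four levels can be sketched as below. The concrete level values and the state names ("hover", "press", "none") are assumptions for illustration; the text only states that four levels H0 to H3 bound three states.

```python
# Sketch: classify one sensor's distance output into three assumed states
# using the four judgment levels H0 > H1 > H2 > H3 (values hypothetical).

H0, H1, H2, H3 = 400, 250, 150, 50  # mm; assumed, with H0 > H1 > H2 > H3

def classify(distance):
    if H2 <= distance <= H1:
        return "hover"   # between H1 and H2: an operation has started
    if H3 <= distance < H2:
        return "press"   # below the aerial image plane (H2)
    return "none"        # outside the bands bounded by H0..H3
```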
  • FIG. 8 is a flowchart showing an example of the processing procedure of the second input determination process.
  • the processing procedure shown in FIG. 8 has many parts in common with the processing procedure of the input determination processing shown in FIG. 6, and the description of these common parts is basically omitted.
  • step S30 is a process for detecting that the hand 80 or the like of the user 8 has started an operation on any one of the input operation buttons 30, which are aerial images, and step S40 is a process for detecting that a predetermined operation (a pressing operation for the "left" button; a pressing or swiping operation for the "turn" button) has been performed on the input operation button 30 detected in step S30.
  • the detailed processing procedure of step S30 is the processing procedure of step S10 of FIG. 6 with the processing of step S112 deleted.
  • the processing of step S112 determines, as a preliminary determination for detecting the pressing operation when the distance information R corresponding to the right-side button is detected, whether the distance information R is farther than the aerial image (button); in the second input determination processing, the right-side button is the "turn" button, which does not require this determination (the object to be detected (hand 80) need only be in the vicinity of the button), so step S112 is deleted. The detailed processing procedure is omitted to avoid repetition, but the processing of step S30 detects the start of an operation on the input operation button 30 corresponding to the distance information L or the distance information R.
  • the input determination processing unit 71 then executes the processing of step S40 based on the processing results of step S30 (steps S104 and S113).
  • the processing outline of step S40 is as follows.
  • when the start of an operation on the input operation button 30 corresponding to the distance information L (the "left" button) is detected in step S30, the input determination processing unit 71 detects the pressing operation of the "left" button by the same processing as in step S20 of FIG. 6 (steps S201 to S203, S205 to S208). When the pressing operation is detected, it outputs an input determination processing result indicating that the "left" button has been pressed (step S204).
  • on the other hand, when the start of an operation on the input operation button 30 corresponding to the distance information R (the "turn" button) is detected in step S30, the input determination processing unit 71 executes, for the "turn" button, a process of detecting a pressing operation based on the change in the distance information R (steps S211 to S213) or a process of detecting a swipe operation based on the change in the distance information C (steps S402, S404 to S407). When the pressing operation is detected, it outputs an input determination processing result of "turn" pressing (step S401), and when the swipe operation is detected, it outputs an input determination processing result of "turn" swiping (step S403).
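One way to sketch the swipe detection is to use the sequential activation pattern of FIG. 7B, where the three sensors first see the hand roughly 0.2 s apart as it moves horizontally. The time window and the dictionary-based interface below are assumptions for illustration, not the flowchart's exact steps.

```python
# Sketch of swipe detection for the "turn" button: declare a swipe when
# the three sensors first detect the hand in spatial order (L, then C,
# then R) within a time window. Window length and interface are assumed.

SWIPE_WINDOW_S = 1.0  # hypothetical: the whole gesture must fit in this

def detect_swipe(first_seen):
    """first_seen: dict mapping 'L', 'C', 'R' to the time (s) each sensor
    first detected the hand, with None when a sensor never saw it."""
    times = [first_seen.get(k) for k in ("L", "C", "R")]
    if any(t is None for t in times):
        return False
    in_order = times[0] < times[1] < times[2]
    return in_order and (times[2] - times[0]) <= SWIPE_WINDOW_S
```

The timing in FIG. 7B (L at 0.2 s, C about 0.2 s later, R a further 0.2 s later) satisfies this criterion.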
  • when an input determination processing result of "turn" pressing or "turn" swiping is output, the screen control processing unit 73 of the control unit 7 performs a process of switching the display image of the "left" button on the display unit 3 to the next handling according to a predetermined transition order; specifically, for example, the display image is switched from "Handling A" to "Handling B".
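The predetermined transition order can be modeled as a simple cyclic list. The list contents follow the handling names used as examples in the text; the wrap-around behavior and function name are assumptions for illustration.

```python
# Sketch of the "turn" transition: each detected "turn" operation advances
# the handling shown on the "left" button in a fixed cyclic order.

HANDLINGS = ["Handling A", "Handling B", "Handling C", "Handling D"]

def next_handling(current):
    i = HANDLINGS.index(current)
    return HANDLINGS[(i + 1) % len(HANDLINGS)]
```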
  • by performing the second input determination processing described above, even when selecting from three or more handlings, only two buttons (input operation buttons 30) in the aerial image are needed; by making one of them the "turn" button, the user 8 can select one desired handling simply by repeating two types of intuitive and simple operations: a swipe operation moving in the horizontal direction and a pressing operation pressing in the vertical direction. Therefore, for example, both a user accustomed to swipe operations on a smartphone or tablet and a user unfamiliar with swiping and accustomed only to pressing can easily perform the touchless operation.
  • FIG. 9 is an external view of the system when the aerial image display input device 1 is connected to the reception settlement machine 94 for hospitals.
  • the reception settlement machine 94 for hospitals is an example of the handling processing device 9: a patient who is the user presents a medical examination ticket before consultation, selects a clinical department, and performs reception processing; after the consultation, to settle the consultation fee, the patient confirms the billed amount, inserts cash corresponding to the settlement amount, and performs payment processing.
  • the aerial image display input device 1 is arranged on the right side of the cash slot of the reception settlement machine 94, facing the operating user and positioned below the user's line of sight as in FIG. 2. Then, as shown in FIG. 9, the user can operate both the "reception" and "payment" input operation buttons 96 displayed on the display unit 95 of the reception settlement machine 94 and the "reception" and "payment" input operation buttons 30 displayed as the aerial image by the aerial image display input device 1; the user can therefore select the handling with either button operation.
  • FIG. 10 is a diagram schematically showing the processing between the aerial image display input device 1 and the control unit of the reception settlement machine 94 in the system shown in FIG.
  • FIG. 10 shows the processing between the control units when the user 8 selects the handling of “reception”.
  • three types of display screens (screens A1 to A3) that can be displayed by the aerial image display input device 1 are defined in advance between the aerial image display input device 1 and the control unit of the reception settlement machine 94.
  • four types of display screens (screens B1 to B4) that can be displayed by the reception settlement machine 94 are defined in advance.
  • first, the aerial image display input device 1 receives from the reception settlement machine 94 the display instruction for the screen A1, which is the selection screen for "reception" and "payment", and displays the screen A1 on the display unit 3.
  • when there is an input operation on the screen A1, the aerial image display input device 1 transmits the input key information to the reception settlement machine 94. Further, the reception settlement machine 94 displays the screen B1 on its own display unit 95.
  • next, the reception settlement machine 94 can display on the display unit 95 the screen B2, which is a guide for reading the medical examination ticket and on which "cancel" can be selected.
  • then, the display instruction for the screen A2, which is a simplified screen, is transmitted to the aerial image display input device 1.
  • the aerial image display input device 1 receives this display instruction and displays the screen A2 on the display unit 3.
  • the reception settlement machine 94 displays the screen B3, which is the selection screen for the clinical department, on the display unit 95, and transmits the display instruction for the screen A3, a screen on which the clinical department can be selected, to the aerial image display input device 1.
  • the aerial image display input device 1 receives this display instruction and displays the screen A3 on the display unit 3.
  • the screen A3 displayed by the aerial image display input device 1 is preferably a display screen using the "turn" button as described with reference to FIGS. 7 and 8; as described above, when the "turn" button on the right side is operated, the handlings (clinical departments) are displayed by transitioning through the button on the left side. Then, when there is an input operation on the screen A3 (an operation of selecting one of the clinical departments), the aerial image display input device 1 transmits the input key information to the reception settlement machine 94.
  • when the reception settlement machine 94 receives a selection of the clinical department on the screen B3 displayed on its own display unit 95, or receives from the aerial image display input device 1 the input key information for the screen A3 (the input information of the selected clinical department), it internally executes a predetermined reception process; when the reception process is completed, it displays the screen B4, which notifies the completion of reception, on the display unit 95. This completes the reception process for one user (patient).
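The screen coordination of FIG. 10 is essentially a request/response exchange between the two control units. The following sketch models that message flow under assumed names and message shapes; it is an illustration of the described sequence, not the actual firmware:

```python
# Illustrative model of the screen coordination between the reception
# settlement machine (screens B1-B4) and the aerial image display input
# device (screens A1-A3). Names and message shapes are assumptions.

class AerialInputDevice:
    def __init__(self):
        self.current_screen = None

    def show(self, screen):          # display instruction from the machine
        self.current_screen = screen

    def user_input(self, key):       # input key information sent back
        return {"screen": self.current_screen, "key": key}

class ReceptionSettlementMachine:
    def __init__(self, device):
        self.device = device
        self.display = None          # screen on its own display unit 95

    def start(self):
        self.display = "B1"
        self.device.show("A1")       # selection screen "reception"/"payment"

    def on_key(self, info):
        if info["screen"] == "A1" and info["key"] == "reception":
            self.display = "B2"      # guide for reading the examination ticket
            self.device.show("A2")   # simplified guide screen
        elif info["screen"] == "A2" and info["key"] == "ticket_read":
            self.display = "B3"      # clinical department selection
            self.device.show("A3")
        elif info["screen"] == "A3":
            self.display = "B4"      # reception complete

dev = AerialInputDevice()
machine = ReceptionSettlementMachine(dev)
machine.start()
machine.on_key(dev.user_input("reception"))
machine.on_key(dev.user_input("ticket_read"))
machine.on_key(dev.user_input("internal_medicine"))
# machine.display is now "B4" and the reception process is complete
```

The point of the design is visible here: the settlement machine owns the application logic, while the input device only displays instructed screens and reports key information.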
  • as described above, the aerial image display input device 1 is a small device with a simple configuration that not only provides an input device that is intuitive for the user and easy to use, with good responsiveness to the aerial image, but also realizes easy touchless input by being connected to an existing handling processing device.
  • in the state of being connected to the handling processing device 9 (reception settlement machine 94), the user can operate either the aerial image display input device 1 or the handling processing device 9. Therefore, even a user who is not comfortable operating an aerial image can operate the handling processing device 9 instead, and a highly convenient input system can be provided.
  • further, by configuring the aerial image display input device 1 separately from the handling processing device 9, it becomes easy to add the aerial image display input device 1 to various existing handling processing devices.
  • on the handling processing device side, logic for processing the input information from the aerial image display input device 1 can be added without changing the screen specifications or operation method of the application software, so the software changes are easy, installation is easy, and there is a cost advantage.
  • since the aerial image display input device 1 is a compact device that only displays input buttons on a screen, it can easily be arranged in the vicinity of each operation unit of various types of processing devices, suppressing the overall increase in installation space while securing operability.
  • FIG. 11 is an external perspective view of the aerial image display input device 1A according to the second embodiment (Example 2) of the present invention.
  • in the second embodiment, the arrangement relationship between the input detection sensors and the input operation buttons is changed from that of the first embodiment.
  • it differs from the aerial image display input device 1 of the first embodiment in that three input operation buttons 31 are arranged so as to face each of the three reflected light distance sensors 51, 52, 53.
  • with this arrangement, the distance between the input operation buttons 31 becomes shorter, but increasing the number of selectable handlings from two to three has the advantage of widening the range of use.
  • in the first embodiment, a total of three input detection sensors 5 (reflected light distance sensors 51, 52, 53) are linearly arranged, one on each side of the two input operation buttons 30 and one between them.
  • the sensors on both sides detect the pressing operation of each input operation button 30, while the remaining sensor (reflected light distance sensor 52) detects a hand inserted between the two input operation buttons 30, that is, an erroneous operation.
  • in the second embodiment, however, the three input detection sensors 5 (reflected light distance sensors 51, 52, 53) are arranged in a straight line facing the three input operation buttons 31, and the buttons are close to each other, which may result in improper input determination for the erroneous operation described above. To prevent this, it is preferable to prepare detection means and input determination processing different from those of the first embodiment.
  • in the aerial image display input device 1 (1A), basically the same relationship between the number of buttons and the number of sensors holds even if the number of input operation buttons 30 (31) increases. That is, it is possible to adopt a configuration in which N (two or more) input detection sensors 5 are linearly arranged and fewer than N ([N-1] or fewer) input operation buttons 30 (31) are displayed, and this can be said to be the most efficient configuration in view of device performance and parts cost. In principle, however, it is also possible to adopt a configuration in which N input detection sensors 5 are arranged and the same number N of input operation buttons 30 (31) are displayed.
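One plausible reading of the N-sensor / fewer-than-N-button arrangement is that sensors facing buttons register presses while the in-between sensors act as guards that veto ambiguous detections, as in the first embodiment's three-sensor, two-button layout. A minimal sketch of such determination logic (indexing and the veto rule are assumptions):

```python
# Illustrative press determination for N linearly arranged distance
# sensors and fewer than N buttons (first-embodiment layout: sensors
# 51/53 face the two buttons, sensor 52 sits between them).
# Indexing and the guard rule are assumptions for the sketch.

def determine_press(detected, button_sensor_idx=(0, 2), guard_sensor_idx=(1,)):
    """detected[i] is True when sensor i senses an object within range.

    Returns the index of the pressed button, or None for no input /
    an erroneous operation (e.g. a hand spanning between buttons).
    """
    # A hand detected by a guard sensor means the operation is ambiguous.
    if any(detected[g] for g in guard_sensor_idx):
        return None
    hits = [b for b, s in enumerate(button_sensor_idx) if detected[s]]
    # Exactly one button sensor must fire for a valid press.
    return hits[0] if len(hits) == 1 else None

assert determine_press([True, False, False]) == 0   # left button pressed
assert determine_press([False, False, True]) == 1   # right button pressed
assert determine_press([True, True, True]) is None  # hand spans buttons
```

With buttons directly facing every sensor, as in the second embodiment, no guard sensor remains, which is why different detection means or determination logic becomes preferable there.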
  • when the number of input operation buttons is further increased to three or more, the number of reflected light distance sensors constituting the input detection sensor may be increased accordingly.
  • in that case, an integrated line-type detection sensor in which elements are arranged along a line may be adopted as the reflected light distance sensor.
  • in the third embodiment, the aerial image display input device 1 described in the first embodiment is connected to a call button installed on each floor for the elevator of a building.
  • FIG. 12 is a diagram showing a display example of the input operation button 30 in the third embodiment.
  • the aerial image display input device 1 connected to the call button displays the input operation buttons 30 on the three-dimensional space projection surface 10 by displaying a call button image on the display unit 3. More specifically, as shown in FIG. 12, an upper call button 30A for calling the elevator car for upward movement and a lower call button 30B for calling the elevator car for downward movement are displayed on the three-dimensional space projection surface 10. Then, when the user 8 performs a predetermined operation (basically a pressing operation, though a swipe operation may also be accepted) on the upper call button 30A or the lower call button 30B, the aerial image display input device 1 outputs the corresponding call input information to the elevator side.
  • in this way, the aerial image display input device 1 can switch the call push buttons installed on each floor of an existing elevator to touchless operation.
  • FIG. 13 is an external perspective view of the aerial image display input device 1B according to the fourth embodiment. Further, FIG. 14 is a side view of the aerial image display input device 1B.
  • FIGS. 13 and 14 the same reference numerals are given to the configurations common to those in FIGS. 1 and 2, and the description thereof will be omitted.
  • in the fourth embodiment, compared with the aerial image display input device 1 shown in FIGS. 1 and 2, the input detection sensor 5 becomes the input detection sensor 5B, and the single input detection surface 61 becomes the two input detection surfaces 62 and 63.
  • the input detection sensor 5B is configured by arranging five sensors (distance sensors each including a light emitting element and a light receiving element) on each of two axes, instead of on a single axis.
  • this forms a two-surface input detection region consisting of the input detection surface 62 (B1-B2-B3-B4) and the input detection surface 63 (C1-C2-C3-C4).
  • the three-dimensional space projection surface 10 is a surface including the line segment B1-B2 and the line segment C1-C2. For example, ten input operation buttons 32 are displayed, and by assigning the numerals "0" to "9" to the respective buttons, a one-digit number can be selected and input.
  • the aerial image display input device 1B described above can be realized with the same configuration as the aerial image display input device 1 of the first embodiment, except that the number of sensors constituting the input detection sensor 5B is increased.
  • when buttons are close to each other, the question of which button is the operation target can be resolved technically easily by adding, to the input determination processing of the input determination processing unit 71, logic for determining which button is the operation target. Therefore, according to the aerial image display input device 1B of the fourth embodiment, in addition to the effects obtained in the other embodiments, the effect of increasing the types of input selection and expanding the applicable range can be obtained.
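For the ten-button digit pad of the fourth embodiment, the added determination logic reduces to mapping a (detection surface, sensor column) pair to a digit. A minimal sketch under an assumed layout ("0" to "4" on one surface, "5" to "9" on the other; the actual assignment is not specified in the text):

```python
# Illustrative mapping for the fourth embodiment's 5-sensors-by-2-axes
# input detection sensor 5B. Which surface/column holds which digit is
# an assumption made for the sketch.

def detect_digit(surface, column):
    """surface: 0 for detection surface 62, 1 for detection surface 63.
    column: 0..4, index of the sensor that detected the press."""
    if surface not in (0, 1) or not 0 <= column <= 4:
        raise ValueError("no button at this position")
    return str(surface * 5 + column)

assert detect_digit(0, 3) == "3"
assert detect_digit(1, 0) == "5"
```

The two detection surfaces thus resolve the row, and the sensor index within a surface resolves the column, which is the extra logic the input determination processing unit 71 would need.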
  • FIG. 15 is an external perspective view of the aerial image display input device 1C according to the fifth embodiment.
  • 16A and 16B are diagrams (No. 1 and No. 2) showing an example of a display image of the three-dimensional space projection surface 10 by the aerial image display input device 1C.
  • FIG. 17 is a diagram showing the relationship between the display image shown in FIG. 16B and the arrangement of the reflected light distance sensor of the aerial image display input device 1C.
  • the aerial image display input device 1C of the fifth embodiment modifies the arrangement relationship between the input detection sensors (reflected light distance sensors) and the input operation buttons relative to the aerial image display input device 1 of the first embodiment, and arranges five input operation buttons 33 to 37 facing the five reflected light distance sensors 54 to 58. Of these, the three input operation buttons 33, 34, 35 arranged facing the three reflected light distance sensors 55, 56, 57 are operation buttons corresponding to an input operation (selection operation) by pressing, and the two input operation buttons 36 and 37 arranged facing the two reflected light distance sensors 54 and 58 are operation buttons corresponding to the "turn" operation.
  • the side guides 13 shown in FIGS. 15 and 17 are, for example, plate-shaped members arranged on both sides of the housing 11, each having a projection surface portion 14 located on the side of the three-dimensional space projection surface 10 and a sensor surface portion 15 located on the side of the input detection sensors 5 (reflected light distance sensors 54 to 58).
  • the display image (three-dimensional space projection surface 10) of FIG. 16A is the screen before any of the input operation buttons 33 to 37 is operated, and the display image (three-dimensional space projection surface 10) of FIG. 16B is the screen after the input operation button 35 is pressed.
  • in the fifth embodiment, the distance between the input operation buttons 33 to 37 is shorter than in the first embodiment, but providing the input operation buttons 36 and 37 corresponding to the "turn" operation has the advantage of widening the range of use by increasing the selectable handlings from two types (handling A and handling B) to five or more types (for example, handling A to handling E).
  • the five input detection sensors 5 are arranged at intervals of, for example, about 25 mm.
  • each of the reflected light distance sensors 54 to 58 has, as the light emitting element, a built-in infrared LED (a kind of infrared light emitting element) and, as the light receiving element, a position sensitive detector (PSD: Position Sensitive Detector).
  • each sensor outputs a signal corresponding to the distance to the object by triangulation, based on the position at which the PSD receives the reflected light of the infrared LED emission from the object.
  • as shown in FIG. 17, the light emitting side (light emitting element) is arranged on the side closer to the image transmission plate 4, and the light receiving side (light receiving element) is arranged on the far side.
  • the five input detection sensors 5 (reflected light distance sensors 54 to 58) of the fifth embodiment have the arrangement of the light emitting side and the light receiving side rotated 90 degrees compared with the three input detection sensors 5 (reflected light distance sensors 51 to 53) shown in the first embodiment.
  • in general, the detection area of the object is wide in the direction along the light emitting / receiving elements and narrow in the direction perpendicular to them.
  • the reflected light distance sensor used in this embodiment has a detection region of about 40 mm in the light emitting / receiving element direction and about 20 mm in the direction perpendicular to the light emitting / receiving element at a distance of about 100 mm from the element.
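The distance output of an infrared-LED / PSD sensor of this kind follows the standard triangulation relation: with emitter-receiver baseline b and receiver lens focal length f, a reflection imaged at offset x on the PSD corresponds to distance d = b·f / x. The sketch below uses assumed geometry values for illustration:

```python
# Triangulation distance estimate for an IR-LED + PSD sensor, as in the
# reflected light distance sensors 54-58. Geometry values are assumed
# for illustration, not taken from the patent.

def psd_distance_mm(spot_offset_mm, baseline_mm=20.0, focal_mm=5.0):
    """Distance to the reflecting object from the spot position on the PSD.

    d = baseline * focal / x  (similar triangles between the emitted ray,
    the reflected ray, and the receiver lens).
    """
    if spot_offset_mm <= 0:
        raise ValueError("no reflection detected")
    return baseline_mm * focal_mm / spot_offset_mm

# With this assumed geometry, an object at 100 mm images the spot at
# x = 20 * 5 / 100 = 1.0 mm:
assert abs(psd_distance_mm(1.0) - 100.0) < 1e-9
```

The inverse relation between spot offset and distance is why such sensors resolve near objects well, matching the roughly 100 mm working distance quoted above.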
  • the distance between the sensor sets, each consisting of a pair of light emitting / receiving elements (a light emitting element and a light receiving element), is about 25 mm, and the arrangement direction of the light emitting / receiving elements within each set is perpendicular to the direction in which the plural elements of the same kind (light emitting elements, light receiving elements) are arrayed.
  • in other words, the plural elements of the same kind are each arrayed on a straight line parallel to the bottom of the image transmission plate 4, in the vicinity of that bottom, and the arrangement direction of the light emitting / receiving elements within each set can be said to be perpendicular to the bottom direction of the image transmission plate 4.
  • with this arrangement, the input operation by pressing a button is performed at the position facing each sensor as in the first embodiment, so that the same processing as the input determination processing described in the first embodiment can be executed.
  • since the projection surface portions 14 of the side guides 13 form surfaces that coincide with the three-dimensional space projection surface 10, they give the user a visual reference for the operation image floating in the air (forming both ends of the operation image), so that the visibility of the operation image can be improved.
  • the side guides 13 also have the effect of preventing the image from becoming difficult to see due to external light from the side, and of preventing the input detection sensors 5 from malfunctioning.
  • before operation, the input operation buttons 33 to 37 are displayed as three-dimensional perspective images showing that they have a three-dimensional thickness in the front direction; in FIG. 16B, the pressed state of the input operation button 35 (handling D) is shown by displaying only the selected input operation button 35 as a plan-view image without the three-dimensional thickness, at a position moved by a distance corresponding to that thickness.
  • FIG. 18 is an external perspective view of the aerial image display input device 1D according to the sixth embodiment. Further, FIG. 19 is a side view of the aerial image display input device 1D.
  • the aerial image display input device 1D of the sixth embodiment arranges five reflected light distance sensors 54, 55, 56, 57, 58 as the input detection sensor 5. Further, as a configuration different from the aerial image display input device 1C of the fifth embodiment, the projection surface touch sensor 16 is arranged in the vicinity of the three-dimensional space projection surface 10.
  • the projection surface touch sensor 16 has infrared light emitting elements and light receiving elements.
  • the infrared light emitting elements emit infrared rays in the direction above the plane of the three-dimensional space projection surface 10, the light receiving elements are arranged on a line in the width direction of the three-dimensional space projection surface 10, and when an object blocks the light, the position on the three-dimensional space projection surface 10 is output as XY coordinates, with X in the width direction and Y in the direction perpendicular to it.
  • for example, a zForce (registered trademark) AIR touch sensor manufactured by Neonode can be used as the projection surface touch sensor 16.
  • in the sixth embodiment, the projection surface touch sensor 16 detects whether an object blocks the light at the position of each input operation button 38 on the three-dimensional space projection surface 10, and the five reflected light distance sensors 54 to 58 detect the distance by which the object detected by the projection surface touch sensor 16 moves in the pressing direction, whereby the pressing operation of each input operation button 38 can be detected.
  • according to the present embodiment, the types of operation selections can be increased compared with the fifth embodiment, and even as the number of selection types increases, since the projection surface touch sensor 16 ensures that press detection is performed only on the button actually touched when a pressing operation is performed, erroneous detection and malfunction can be prevented.
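The two-stage detection just described, where the touch sensor identifies which button is touched and the distance sensors confirm the pressing travel, can be sketched as follows (button layout, coordinates, and the press threshold are assumptions):

```python
# Illustrative two-stage press detection for the sixth embodiment:
# the projection surface touch sensor gives the touch position along the
# width X, and the reflected light distance sensors track motion in the
# pressing direction. Layout and threshold values are assumptions.

PRESS_THRESHOLD_MM = 10.0  # assumed travel needed to count as a press

def button_at(x_mm, buttons):
    """buttons: list of (label, x_min, x_max) spans along the width X."""
    for label, x_min, x_max in buttons:
        if x_min <= x_mm <= x_max:
            return label
    return None

def detect_press(touch_x_mm, travel_mm, buttons):
    """A press is reported only for the button actually touched, so
    nearby buttons cannot trigger a false detection."""
    label = button_at(touch_x_mm, buttons)
    if label is not None and travel_mm >= PRESS_THRESHOLD_MM:
        return label
    return None

buttons = [("A", 0, 24), ("B", 25, 49), ("C", 50, 74)]
assert detect_press(30.0, 12.0, buttons) == "B"   # touched and pushed through
assert detect_press(30.0, 3.0, buttons) is None   # touched, but not pushed
```

Requiring both signals is what keeps the button count scalable: the touch sensor disambiguates position, and the distance sensors disambiguate intent.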
  • in the above description, the projection surface touch sensor 16 is arranged in the vicinity of the three-dimensional space projection surface 10 and, when an object blocks the light, outputs the position on the three-dimensional space projection surface 10 as coordinates in the width direction X and the perpendicular direction Y.
  • however, the projection surface touch sensor 16 may instead be a type of sensor having light emitting elements on one side and light receiving elements on the other side of the three-dimensional space projection surface 10 in the width direction X, which outputs only the Y coordinate of the position where the object blocks the light. This is because the position in the X direction can be detected by using the reflected light distance sensors 54 to 58.
  • the present invention is not limited to the above-described embodiment, but includes various modifications.
  • the above-described embodiments have been described in detail in order to explain the present invention in an easy-to-understand manner, and the invention is not necessarily limited to one including all the described configurations.
  • it is possible to replace a part of the configuration of one embodiment with the configuration of another embodiment, and it is also possible to add the configuration of another embodiment to the configuration of one embodiment.
  • each of the above configurations, functions, processing units, processing means, and the like may be realized in hardware, for example by designing a part or all of them as an integrated circuit. Each of the above configurations, functions, and the like may also be realized in software, by a processor interpreting and executing a program that realizes each function. Information such as the programs, tables, and files that realize each function can be placed in a memory, a recording device such as a hard disk or an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
  • control lines and information lines are shown in the drawings to the extent deemed necessary for explanation, and not all control lines and information lines in the product are necessarily shown. In practice, almost all configurations may be considered to be interconnected.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Optics & Photonics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

This aerial image display input device 1 is provided with: an aerial image projection unit 2, which internally holds a rectangular display unit 3 that displays a prescribed image and externally holds a rectangular image transmission plate 4 for projecting the image displayed on the display unit 3 onto a three-dimensional space projection plane 10 visible to the user, and which forms the image displayed on the display unit 3 in the air as an input guidance screen (input operation buttons 30) for the user; an input detection sensor 5, which detects the user's aerial operations on the input guidance screen; and a control unit 7, which performs prescribed control.

Description

空中像表示入力装置及び空中像表示入力方法Aerial image display input device and aerial image display input method
 本発明は、空中像表示入力装置及び空中像表示入力方法に関し、空中に表示した画像(空中像)に対する利用者の操作を入力する空中像表示入力装置及び空中像表示入力方法に適用して好適なものである。 The present invention relates to an aerial image display input device and an aerial image display input method, and is suitable for application to an aerial image display input device and an aerial image display input method for inputting a user's operation on an image displayed in the air (aerial image). It is a thing.
 近年、仮想現実(Virtual Reality)あるいは拡張現実(Augmented Reality)という形で、実像を伴わないコミュニケーション手段が普及してきている。 In recent years, communication means that do not involve a real image have become widespread, in the forms of virtual reality and augmented reality.
 例えば、画像表示装置からの表示画像を空中に結像することによって空中に画像を表示する技術として、特許文献1には、画像表示装置、ハーフミラー、及び再帰性反射材を配して、空中像を表示する空中像表示装置が開示されている。 For example, as a technique for displaying an image in the air by forming an image displayed from an image display device in the air, Patent Document 1 arranges an image display device, a half mirror, and a retroreflective material in the air. An aerial image display device for displaying an image is disclosed.
 また例えば、特許文献2には、内部に設置した表示対象物を画像結合プレートによって、外部三次元空間に表示させ、使用するユーザの手を撮像するカメラと距離センサの情報から手の動きのジェスチャー操作を認識するジェスチャー操作装置が開示されている。 Further, for example, in Patent Document 2, a display object installed inside is displayed in an external three-dimensional space by an image coupling plate, and a gesture of hand movement is performed from information of a camera and a distance sensor that captures the user's hand. A gesture operation device that recognizes an operation is disclosed.
 また例えば、特許文献3には、画像等の光源と再帰反射部材と光分岐部材とを備えた空中像表示装置が空間中に表示した空中像に設定された複数の注目点に対して、注目点に物体が存在するか否かを検出する検出手段を備えた非接触操作検出装置が開示されている。さらに特許文献3では、上記検出手段は、画像等の光源に設けられた赤外光を発する検出用光源と、注目点からの戻り光である赤外光を検出するフォトダイオードと、を有することが示されている。 Further, for example, in Patent Document 3, attention is paid to a plurality of points of interest set in an aerial image displayed in space by an aerial image display device including a light source such as an image, a retroreflective member, and an optical branching member. A non-contact operation detecting device including a detecting means for detecting whether or not an object is present at a point is disclosed. Further, in Patent Document 3, the detection means includes a detection light source that emits infrared light provided in a light source such as an image, and a photodiode that detects infrared light that is return light from a point of interest. It is shown.
 また例えば、特許文献4には、空中結像させる画像を表示する表示部を有する結像機構部を備えた入出力装置に、内蔵した入力検知センサを配置する案や、操作者の顔もしくは視線を検知する検知装置を搭載する案が開示されている。 Further, for example, Patent Document 4 describes a proposal of arranging a built-in input detection sensor in an input / output device provided with an image forming mechanism unit having a display unit for displaying an image to be imaged in the air, and an operator's face or line of sight. A proposal to install a detection device for detecting is disclosed.
国際公開第2016/088683号International Publication No. 2016/088863 特開2017-062709号公報Japanese Unexamined Patent Publication No. 2017-062709 特開2020-067707号公報Japanese Unexamined Patent Publication No. 2020-067707 特開2020-067838号公報Japanese Unexamined Patent Publication No. 2020-676838
 上述したように、空中の画像を結像表示するための画像表示の技術手段や、表示された空中画像情報に対して利用者が入力操作するための入力検出の技術手段については、様々な案が提案されている。しかし、空中に表示された画像表示内容に対して利用者がどのように操作すると入力がなされるのか、という点については、特に空中という3次元空間における операция手段が普及していないこともあって、経験が乏しい利用者には入力操作が妥当であるか分からない等、操作の利便性における課題があった。 As described above, various proposals have been made for technical means of image display for forming and displaying an image in the air, and for technical means of input detection by which a user performs input operations on the displayed aerial image information. However, as to how a user should operate on the image content displayed in the air in order to make an input, partly because operation means in the three-dimensional space of the air are not yet widespread, there have been problems in operational convenience, such as inexperienced users not knowing whether their input operation is appropriate.
 また、カメラ等によって撮影した画像を用いた入力検出の場合、利用者のジェスチャーを所定の入力操作として認識するための画像認識処理を行うためには、処理能力の高い制御部が必要となり、処理時間や製品価格の面で課題があった。 Further, in the case of input detection using images taken by a camera or the like, a control unit with high processing capacity is required to perform the image recognition processing for recognizing a user's gesture as a predetermined input operation, which poses problems in terms of processing time and product price.
 本発明は以上の点を考慮してなされたもので、空中像に対して利用者が直感的に操作可能な利便性の高い入力装置を、簡素な装置構成で実現する空中像表示入力装置及び空中像表示入力方法を提案しようとするものである。 The present invention has been made in consideration of the above points, and proposes an aerial image display input device and an aerial image display input method that realize, with a simple device configuration, a highly convenient input device that the user can operate intuitively on an aerial image.
 かかる課題を解決するため本発明においては、下記[1]~[14]の空中像表示入力装置、及びこれらの空中像表示入力装置による空中像表示入力方法を提供する。
[1]
 所定の映像を表示する矩形の表示部を内部に持ち、前記表示部に表示される前記映像を利用者が視認可能な三次元空間投射面に投影するための矩形の映像透過プレートを外部に持ち、前記表示部に表示される前記映像を利用者の入力案内画面として空中に結像させる空中像投射ユニットと、
 前記利用者による前記入力案内画面への空中操作を検出する入力検出センサと、
 所定の制御を行う制御部と、
 を有する空中像表示入力装置において、
 前記三次元空間投射面は、前記映像透過プレートの矩形の一辺の近傍から、前記映像透過プレートと所定の角度をなして、利用者視線に対向する矩形投影面であって、
 前記入力検出センサは、前記映像透過プレートの矩形の前記一辺と対向する他の一辺の近傍に位置する、当該他の一辺に平行な直線上に配し、
 前記矩形投影面の辺のうち前記映像透過プレートと平行でない一組の辺の中点の近傍を結んだ線分と、前記入力検出センサの配置直線とで形成する面を入力検出領域として配置し、
 前記所定の映像を表示する表示部は、前記矩形投影面の前記中点を結んだ線分上の近傍に、入力案内画面の中の入力用操作ボタンを表示し、
 前記入力検出センサは、前記入力用操作ボタンの領域の物体有無を検出し、
 前記制御部は、前記入力検出センサの物体有の検出情報に基づき、前記入力用操作ボタンの押下を判定し、前記表示部の入力用操作ボタンを押下状態であることを示すように変更する
 ことを特徴とする空中像表示入力装置。
[2]
 前記入力検出センサは、反射型距離測定センサであって、直線状に2個以上のN個が配置され、
 前記表示部は、N個以下の入力用操作ボタンを表示し、
 前記制御部は、前記N個の反射型距離測定センサの出力信号の変化により、前記入力用操作ボタンの押下を判定する
 ことを特徴とする、上記[1]に記載の空中像表示入力装置。
[3]
 前記入力案内画面で入力可能な入力内容の変更を要求する前記空中操作として、所定の操作方向に物体を移動させる移動操作が設けられ、
 前記表示部が表示するN個以下の前記入力用操作ボタンの一つ以上は、前記移動操作の前記操作方向を示す表示変更要求ボタンであって、
 前記制御部は、前記N個の反射型距離測定センサの出力信号のうちで、前記表示変更要求ボタンを検出するセンサ出力信号と、前記表示変更要求ボタンに隣接する部分を検出するセンサ出力信号との変化情報に基づいて、前記表示変更要求ボタンの押下を判定する
 ことを特徴とする、上記[2]に記載の空中像表示入力装置。
[4]
 前記空中像投射ユニット及び前記入力検出センサは一体構造であって、前記映像透過プレートは略垂直で、前記三次元空間投射面は水平面に対して利用者の視線側に傾斜した面を形成するように配し、前記一体構造は角度を可変とする
 ことを特徴とする、上記[1]~[3]の何れかに記載の空中像表示入力装置。
[5]
 前記空中像表示入力装置は、利用者との間で媒体処理または所定の作業を実行する取扱処理装置と接続し、前記制御部は、前記取扱処理装置からの情報に基づき、前記表示部に表示する前記入力用操作ボタンの映像を切り替え、前記入力用操作ボタンの押下の判定情報を、前記取扱処理装置に出力する
 ことを特徴とする、上記[1]~[4]の何れかに記載の空中像表示入力装置。
[6]
 所定の映像を表示する矩形の表示部を内部に持ち、前記表示部に表示される前記映像を利用者が視認可能な三次元空間投射面に投影するための矩形の映像透過プレートを外部に持ち、前記表示部に表示される前記映像を利用者の入力案内画面として空中に結像させる空中像投射ユニットと、
 前記利用者による前記入力案内画面への空中操作を検出する入力検出センサと、
 所定の制御を行う制御部と、
 を有する空中像表示入力装置において、
 前記三次元空間投射面は、前記映像透過プレートの矩形の一辺の近傍から、前記映像透過プレートと所定の角度を持って、利用者視線に対向する矩形投影面であって、
 前記入力検出センサは、前記映像透過プレートの矩形の前記一辺と対向する他の一辺の近傍に位置する、当該他の一辺に平行な2本以上のM本の直線上に配し、
 前記矩形投影面の辺のうち前記映像透過プレートと平行でない一組の辺の(M+1)等分点の近傍を結んだ線分と、前記入力検出センサのM本の配置直線で形成するM個の面を入力検出領域として配置し、
 前記所定の映像を表示する表示部は、前記矩形投影面における前記(M+1)等分点の近傍を結んだ線分上の近傍に、入力案内画面の中の入力用操作ボタンを表示し、
 前記入力検出センサは、前記入力用操作ボタンの領域の物体有無を検出し、
 前記制御部は、前記入力検出センサの物体有の検出情報に基づき、前記入力用操作ボタンの押下を判定し、前記表示部の入力用操作ボタンを押下状態であることを示すように変更する
 ことを特徴とする空中像表示入力装置。
[7]
 所定の映像を表示する矩形の表示部を内部に持ち、前記表示部に表示される前記映像を利用者が視認可能な三次元空間投射面に投影するための矩形の映像透過プレートを外部に持ち、前記表示部に表示される前記映像を利用者の入力案内画面として空中に結像させる空中像投射ユニットと、
 前記利用者による前記入力案内画面への空中操作を検出する入力検出センサと、
 所定の制御を行う制御部と、
 を有する空中像表示入力装置において、
 前記三次元空間投射面は、前記映像透過プレートの矩形の一辺の近傍から、前記映像透過プレートと約75度の角度をなして、利用者視線に対向する矩形投影面であって、
 前記所定の映像を表示する表示部は、入力案内画面の中の入力用操作ボタンを表示し、
 前記入力検出センサは、前記三次元空間投射面に表示された前記入力用操作ボタンの領域内を略垂直方向に物体が移動することを検出し、
 前記制御部は、前記入力検出センサによる物体移動の検出情報に基づき、前記入力用操作ボタンの押下を判定し、前記表示部の入力用操作ボタンを押下状態であることを示すように変更する
 ことを特徴とする空中像表示入力装置。
[8]
 所定の映像を表示する矩形の表示部を内部に持ち、前記表示部に表示される前記映像を利用者が視認可能な三次元空間投射面に投影するための矩形の映像透過プレートを外部に持ち、前記表示部に表示される前記映像を利用者の入力案内画面として空中に結像させる空中像投射ユニットと、
 前記利用者による前記入力案内画面への空中操作を検出する入力検出センサと、
 所定の制御を行う制御部と、
 を有する空中像表示入力装置において、
 前記三次元空間投射面は、前記映像透過プレートの矩形の一辺の近傍から、前記映像透過プレートと所定の角度をなして、利用者視線に対向する矩形投影面であって、
 前記所定の映像を表示する表示部は、入力案内画面の中の入力用操作ボタンを、前記映像透過プレートの矩形の一辺と反対側に厚み方向の影像を持った立体斜視画像として表示し、
 前記入力検出センサは、前記三次元空間投射面に表示された前記入力用操作ボタンの領域内を略垂直方向に物体が移動することを検出し、
 前記制御部は、前記入力検出センサの物体移動の検出情報に基づき、前記入力用操作ボタンの押下を判定し、前記入力用操作ボタンが押下されたと判定した場合は、前記表示部が表示する当該入力用操作ボタンを、前記厚み方向の影像を持たない平面画像で、押下状態にないときの当該入力用操作ボタンの前記厚み方向の影像の長さの分だけ移動した位置に表示することにより、当該入力用操作ボタンが押下状態であることを示すようにする
 ことを特徴とする空中像表示入力装置。
[9]
 所定の映像を表示する矩形の表示部を内部に持ち、前記表示部に表示される前記映像を利用者が視認可能な三次元空間投射面に投影するための矩形の映像透過プレートを外部に持ち、前記表示部に表示される前記映像を利用者の入力案内画面として空中に結像させる空中像投射ユニットと、
 前記利用者による前記入力案内画面への空中操作を検出する入力検出センサと、
 所定の制御を行う制御部と、
 を有する空中像表示入力装置において、
 前記三次元空間投射面は、前記映像透過プレートの矩形の一辺の近傍から、前記映像透過プレートと所定の角度をなして、利用者視線に対向する矩形投影面であって、
 前記入力検出センサは、複数組の赤外線発光素子及び受光素子を有し、前記赤外線発光素子による発光の対象物による反射光を前記受光素子が受光する際の受光方向に基づいて、三角測量の原理で前記対象物までの距離を算出する反射光距離センサであって、前記映像透過プレートの矩形の前記一辺と対向する他の一辺の近傍に位置する、当該他の一辺にそれぞれ平行な2直線のうち、複数の前記赤外線発光素子が前記映像透過プレートに近い側の直線上に配置され、複数の前記受光素子が前記映像透過プレートに遠い側の直線上に配置され、かつ、各組の前記赤外線発光素子と前記受光素子とを結ぶ直線が前記平行な2直線と直角に交わるように配置され、
 前記所定の映像を表示する表示部は、入力案内画面の中の入力用操作ボタンを表示し、
 前記入力検出センサは、前記入力用操作ボタンの領域の物体有無を検出し、
 前記制御部は、前記入力検出センサの物体有の検出情報に基づき、前記入力用操作ボタンの押下を判定し、前記表示部の入力用操作ボタンを押下状態であることを示すように変更する
 ことを特徴とする空中像表示入力装置。
[10]
 所定の映像を表示する矩形の表示部を内部に持ち、前記表示部に表示される前記映像を利用者が視認可能な三次元空間投射面に投影するための矩形の映像透過プレートを外部に持ち、前記表示部に表示される前記映像を利用者の入力案内画面として空中に結像させる空中像投射ユニットと、
 前記利用者による前記入力案内画面への空中操作を検出する入力検出センサと、
 所定の制御を行う制御部と、
 を有する空中像表示入力装置において、
 前記三次元空間投射面は、前記映像透過プレートの矩形の一辺の近傍から、前記映像透過プレートと所定の角度をなして、利用者視線に対向する矩形投影面であって、
 前記入力検出センサは、
 前記三次元空間投射面と同一の面上で、前記利用者の空中操作に伴う物体の位置について、前記映像透過プレートの矩形の一辺方向をX方向、それと直角方向をY方向としたとき、少なくとも前記Y方向の位置を検出する投射面タッチセンサと、
 前記映像透過プレートの矩形の前記一辺と対向する他の一辺の近傍に位置する当該他の一辺に平行な直線上にそれぞれ配置され、対象物までの距離を算出する複数個の距離センサと、を配し、
 前記所定の映像を表示する表示部は、入力案内画面の中の入力用操作ボタンを表示し、
 前記入力検出センサは、前記投射面タッチセンサによって、前記入力用操作ボタンの領域の物体有無を検出するとともに、前記複数個の反射光距離センサによって、前記入力用操作ボタンの領域に対する前記物体の押す方向の移動量を検出し、
 前記制御部は、前記入力検出センサによって検出された物体有無の検出情報と、前記物体の押す方向の移動量の検出情報とに基づき、前記入力用操作ボタンの押下を判定し、前記入力用操作ボタンが押下されたと判定した場合は、前記表示部が表示する当該入力用操作ボタンを、押下状態であることを示すように変更する
 ことを特徴とする空中像表示入力装置。
[11]
 前記空中像投射ユニット及び前記入力検出センサは一体構造であって、前記映像透過プレートは略垂直で、前記三次元空間投射面は水平面に対して利用者の視線側に傾斜した面を形成するように配し、
 前記空中像投射ユニットの両側面に、前記入力検出センサ、前記映像透過プレート、及び前記三次元空間投射面を覆う三角形状の側面ガイドをさらに有する
 ことを特徴とする、上記[7]~[10]の何れかに記載の空中像表示入力装置。
[12]
 前記入力案内画面で入力可能な入力内容の変更を要求する前記空中操作として、所定の操作方向に物体を移動させる移動操作が設けられ、
 前記表示部が表示するN個以下の前記入力用操作ボタンの一つ以上は、前記移動操作の前記操作方向を表す表示変更要求ボタンであって、
 前記制御部は、前記N個の反射型距離測定センサの出力信号のうちで、前記表示変更要求ボタンを検出するセンサ出力信号と、前記表示変更要求ボタンに隣接する部分を検出するセンサ出力信号との変化情報に基づいて、前記表示変更要求ボタンの押下を判定する
 ことを特徴とする、上記[7]~[11]の何れかに記載の空中像表示入力装置。
[13]
 前記空中像投射ユニット及び前記入力検出センサは一体構造であって、前記映像透過プレートは略垂直で、前記三次元空間投射面は水平面に対して利用者の視線側に傾斜した面を形成するように配し、前記一体構造は角度を可変とする
 ことを特徴とする、上記[7]~[12]の何れかに記載の空中像表示入力装置。
[14]
 前記空中像表示入力装置は、利用者との間で媒体処理または所定の作業を実行する取扱処理装置と接続し、前記制御部は、前記取扱処理装置からの情報に基づき、前記表示部に表示する前記入力用操作ボタンの映像を切り替え、前記入力用操作ボタンの押下の判定情報を、前記取扱処理装置に出力する
 ことを特徴とする、上記[7]~[13]の何れかに記載の空中像表示入力装置。
In order to solve such a problem, the present invention provides the following aerial image display input devices [1] to [14] and an aerial image display input method using these aerial image display input devices.
[1]
An aerial image display input device comprising:
an aerial image projection unit that internally includes a rectangular display unit for displaying a predetermined image and externally includes a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, the aerial image projection unit forming the image displayed on the display unit in the air as the user's input guidance screen;
an input detection sensor that detects the user's aerial operation on the input guidance screen; and
a control unit that performs predetermined control,
wherein the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate and faces the user's line of sight,
the input detection sensor is arranged on a straight line that is parallel to, and located in the vicinity of, another side of the rectangle of the image transmission plate opposite said one side,
the surface formed between the arrangement straight line of the input detection sensor and the line segment connecting the vicinities of the midpoints of the pair of sides of the rectangular projection surface that are not parallel to the image transmission plate is set as an input detection area,
the display unit displaying the predetermined image displays an input operation button of the input guidance screen in the vicinity of the line segment connecting said midpoints on the rectangular projection surface,
the input detection sensor detects the presence or absence of an object in the area of the input operation button, and
the control unit determines, based on the object-presence detection information from the input detection sensor, that the input operation button has been pressed, and changes the input operation button on the display unit so as to indicate that it is in the pressed state.
[2]
The aerial image display input device according to [1] above, wherein
the input detection sensor comprises N (N being two or more) reflection-type distance measurement sensors arranged in a straight line,
the display unit displays N or fewer input operation buttons, and
the control unit determines whether an input operation button has been pressed based on changes in the output signals of the N reflection-type distance measurement sensors.
[3]
The aerial image display input device according to [2] above, wherein
a movement operation of moving an object in a predetermined operation direction is provided as the aerial operation for requesting a change of the input contents that can be entered on the input guidance screen,
one or more of the N or fewer input operation buttons displayed by the display unit are display change request buttons indicating the operation direction of the movement operation, and
the control unit determines that a display change request button has been pressed based on change information between, among the output signals of the N reflection-type distance measurement sensors, the sensor output signal covering the display change request button and the sensor output signal covering the portion adjacent to the display change request button.
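As a non-limiting sketch of the determination logic recited in [2] and [3], the control unit can be modeled as reading the N reflection-type distance measurement sensors and accepting a press only when a single button's sensor reports a nearby object. The threshold value and the one-sensor-per-button mapping below are illustrative assumptions, not details given in the claims.

```python
# Illustrative sketch only: PRESS_NEAR_MM and the one-sensor-per-button
# mapping are assumed values, not specified in the claims.

PRESS_NEAR_MM = 60  # assumed: readings closer than this count as "object present"

def detect_pressed_button(distances_mm, num_buttons):
    """Return the index of the single pressed button, or None.

    distances_mm: per-sensor distance readings (None = nothing detected).
    Assumes one sensor per button with num_buttons <= sensor count,
    matching [2], where N or fewer buttons are displayed.
    """
    candidates = [i for i in range(num_buttons)
                  if distances_mm[i] is not None and distances_mm[i] < PRESS_NEAR_MM]
    # A reading spread over two adjacent sensors (as exploited for the
    # display change request button in [3]) is not accepted as a press here.
    return candidates[0] if len(candidates) == 1 else None
```

A movement operation as in [3] would instead track how such candidate sets shift between a display change request button's sensor and the adjacent sensor over successive readings.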
[4]
The aerial image display input device according to any one of [1] to [3] above, wherein the aerial image projection unit and the input detection sensor form an integrated structure arranged such that the image transmission plate is substantially vertical and the three-dimensional space projection surface forms a surface inclined toward the user's line of sight with respect to the horizontal plane, and the angle of the integrated structure is variable.
[5]
The aerial image display input device according to any one of [1] to [4] above, wherein the aerial image display input device is connected to a handling processing device that performs medium processing or predetermined work with the user, and the control unit switches the image of the input operation buttons displayed on the display unit based on information from the handling processing device and outputs determination information on the pressing of an input operation button to the handling processing device.
[6]
An aerial image display input device comprising:
an aerial image projection unit that internally includes a rectangular display unit for displaying a predetermined image and externally includes a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, the aerial image projection unit forming the image displayed on the display unit in the air as the user's input guidance screen;
an input detection sensor that detects the user's aerial operation on the input guidance screen; and
a control unit that performs predetermined control,
wherein the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate and faces the user's line of sight,
the input detection sensor is arranged on M (M being two or more) straight lines that are parallel to, and located in the vicinity of, another side of the rectangle of the image transmission plate opposite said one side,
the M surfaces formed between the M arrangement straight lines of the input detection sensor and the line segments connecting the vicinities of the (M + 1) equal-division points of the pair of sides of the rectangular projection surface that are not parallel to the image transmission plate are set as input detection areas,
the display unit displaying the predetermined image displays input operation buttons of the input guidance screen in the vicinities of the line segments connecting the (M + 1) equal-division points on the rectangular projection surface,
the input detection sensor detects the presence or absence of an object in the area of each input operation button, and
the control unit determines, based on the object-presence detection information from the input detection sensor, that an input operation button has been pressed, and changes that input operation button on the display unit so as to indicate that it is in the pressed state.
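The (M + 1) equal-division geometry in [6] determines where the M detection planes meet the projection surface. A minimal sketch, normalizing the non-parallel sides to unit length (an assumed simplification):

```python
# Illustrative sketch: positions of the M interior (M + 1) equal-division
# points along a side of the projection surface, with the side length
# normalized to 1 (an assumed simplification).

def division_points(m):
    """Interior division points splitting a unit-length side into (m + 1) equal parts."""
    if m < 2:
        raise ValueError("claim [6] requires M of two or more")
    return [k / (m + 1) for k in range(1, m + 1)]
```

Each returned value marks, in normalized coordinates, the line segment near which one row of input operation buttons is displayed and one input detection area is formed.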
[7]
An aerial image display input device comprising:
an aerial image projection unit that internally includes a rectangular display unit for displaying a predetermined image and externally includes a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, the aerial image projection unit forming the image displayed on the display unit in the air as the user's input guidance screen;
an input detection sensor that detects the user's aerial operation on the input guidance screen; and
a control unit that performs predetermined control,
wherein the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate at an angle of about 75 degrees to the image transmission plate and faces the user's line of sight,
the display unit displaying the predetermined image displays an input operation button of the input guidance screen,
the input detection sensor detects an object moving in a substantially vertical direction within the area of the input operation button displayed on the three-dimensional space projection surface, and
the control unit determines, based on the object-movement detection information from the input detection sensor, that the input operation button has been pressed, and changes the input operation button on the display unit so as to indicate that it is in the pressed state.
[8]
An aerial image display input device comprising:
an aerial image projection unit that internally includes a rectangular display unit for displaying a predetermined image and externally includes a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, the aerial image projection unit forming the image displayed on the display unit in the air as the user's input guidance screen;
an input detection sensor that detects the user's aerial operation on the input guidance screen; and
a control unit that performs predetermined control,
wherein the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate and faces the user's line of sight,
the display unit displaying the predetermined image displays an input operation button of the input guidance screen as a stereoscopic perspective image having a thickness-direction image on the side opposite to said one side of the rectangle of the image transmission plate,
the input detection sensor detects an object moving in a substantially vertical direction within the area of the input operation button displayed on the three-dimensional space projection surface, and
the control unit determines, based on the object-movement detection information from the input detection sensor, that the input operation button has been pressed and, when it so determines, causes the display unit to display that input operation button as a flat image without the thickness-direction image, at a position shifted by the length of the thickness-direction image that the button has when not pressed, thereby indicating that the input operation button is in the pressed state.
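The pressed-state redraw described in [8] can be illustrated with a small coordinate sketch; the convention that the thickness-direction image extends below the button face by a fixed pixel length, and that y grows in the thickness direction, is an assumption for illustration only.

```python
# Illustrative sketch only: screen coordinates with y increasing in the
# thickness direction are an assumed convention, not given in the claim.

def button_draw_state(face_y, thickness_px, pressed):
    """Return (face_y_to_draw, thickness_to_draw) for one button.

    Unpressed: perspective image with a thickness-direction image of
    length thickness_px. Pressed: flat image (no thickness image),
    shifted by that same length, as recited in [8].
    """
    if pressed:
        return face_y + thickness_px, 0
    return face_y, thickness_px
```

The shift by exactly the unpressed thickness makes the button face appear to sink to where the bottom of the three-dimensional image was, giving the user visual feedback of the press.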
[9]
An aerial image display input device comprising:
an aerial image projection unit that internally includes a rectangular display unit for displaying a predetermined image and externally includes a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, the aerial image projection unit forming the image displayed on the display unit in the air as the user's input guidance screen;
an input detection sensor that detects the user's aerial operation on the input guidance screen; and
a control unit that performs predetermined control,
wherein the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate and faces the user's line of sight,
the input detection sensor has a plurality of pairs of an infrared light emitting element and a light receiving element and is a reflected light distance sensor that calculates the distance to an object by the principle of triangulation from the direction in which the light receiving element receives the light emitted by the infrared light emitting element and reflected by the object; of two straight lines that are parallel to, and located in the vicinity of, another side of the rectangle of the image transmission plate opposite said one side, the plurality of infrared light emitting elements are arranged on the straight line nearer to the image transmission plate and the plurality of light receiving elements are arranged on the straight line farther from the image transmission plate, such that the straight line connecting the infrared light emitting element and the light receiving element of each pair intersects the two parallel straight lines at right angles,
the display unit displaying the predetermined image displays an input operation button of the input guidance screen,
the input detection sensor detects the presence or absence of an object in the area of the input operation button, and
the control unit determines, based on the object-presence detection information from the input detection sensor, that the input operation button has been pressed, and changes the input operation button on the display unit so as to indicate that it is in the pressed state.
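In the idealized case of an emitter beam perpendicular to the emitter-receiver baseline, the triangulation in [9] reduces to d = b / tan(α), where b is the baseline between the paired elements and α is the angle at which the receiver sees the reflection. A sketch under that assumed geometry (the claim itself does not fix these details):

```python
import math

# Illustrative sketch only: assumes an emitter beam perpendicular to the
# emitter-receiver baseline and an ideal point reflection, details the
# claim leaves open.

def distance_by_triangulation(baseline_mm, receive_angle_rad):
    """Distance along the emitter axis to the reflecting object.

    baseline_mm: separation between emitter and receiver.
    receive_angle_rad: angle of the returning ray measured from the
    receiver's boresight (parallel to the emitter axis).
    """
    if receive_angle_rad <= 0:
        raise ValueError("no valid reflection angle (object at infinity?)")
    return baseline_mm / math.tan(receive_angle_rad)
```

Nearby objects produce a larger reception angle and distant objects a smaller one, which is why the angle measured by the light receiving element suffices to recover distance.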
[10]
An aerial image display input device comprising:
an aerial image projection unit that internally includes a rectangular display unit for displaying a predetermined image and externally includes a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, the aerial image projection unit forming the image displayed on the display unit in the air as the user's input guidance screen;
an input detection sensor that detects the user's aerial operation on the input guidance screen; and
a control unit that performs predetermined control,
wherein the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate and faces the user's line of sight,
the input detection sensor comprises:
a projection surface touch sensor that, on the same plane as the three-dimensional space projection surface, detects at least the Y-direction position of the object involved in the user's aerial operation, where the direction of said one side of the rectangle of the image transmission plate is the X direction and the direction perpendicular to it is the Y direction; and
a plurality of distance sensors, each calculating the distance to an object, arranged on a straight line that is parallel to, and located in the vicinity of, another side of the rectangle of the image transmission plate opposite said one side,
the display unit displaying the predetermined image displays an input operation button of the input guidance screen,
the input detection sensor detects the presence or absence of an object in the area of the input operation button with the projection surface touch sensor and detects the amount of movement of the object in the pushing direction relative to the area of the input operation button with the plurality of reflected light distance sensors, and
the control unit determines, based on the object presence/absence detection information and the pushing-direction movement amount detection information detected by the input detection sensor, that the input operation button has been pressed and, when it so determines, changes the input operation button displayed by the display unit so as to indicate that it is in the pressed state.
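The two-stage decision in [10] can be sketched as requiring both conditions, touch-sensor presence in a button's Y band and sufficient push-direction travel from the distance sensors, before a press is accepted. The band layout and travel threshold below are assumed for illustration.

```python
# Illustrative sketch only: the button Y bands and the push-travel
# threshold are assumed values, not details given in the claim.

PUSH_MOVE_MM = 15  # assumed minimum travel in the pushing direction

def locate_button(y_mm, button_bands):
    """Return the index of the button band (y_start, y_end) containing y_mm."""
    for i, (y0, y1) in enumerate(button_bands):
        if y0 <= y_mm < y1:
            return i
    return None

def judge_press(y_mm, push_travel_mm, button_bands):
    """Accept a press only when the touch sensor places the object on a
    button AND the distance sensors report enough push-direction travel."""
    i = locate_button(y_mm, button_bands)
    if i is not None and push_travel_mm >= PUSH_MOVE_MM:
        return i
    return None
```

Requiring both signals is what distinguishes a deliberate press from a hand that merely hovers over (or sweeps across) the projection surface.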
[11]
The aerial image display input device according to any one of [7] to [10] above, wherein the aerial image projection unit and the input detection sensor form an integrated structure arranged such that the image transmission plate is substantially vertical and the three-dimensional space projection surface forms a surface inclined toward the user's line of sight with respect to the horizontal plane,
the device further comprising, on both side surfaces of the aerial image projection unit, triangular side guides that cover the input detection sensor, the image transmission plate, and the three-dimensional space projection surface.
[12]
The aerial image display input device according to any one of [7] to [11] above, wherein
a movement operation of moving an object in a predetermined operation direction is provided as the aerial operation for requesting a change of the input contents that can be entered on the input guidance screen,
one or more of the N or fewer input operation buttons displayed by the display unit are display change request buttons indicating the operation direction of the movement operation, and
the control unit determines that a display change request button has been pressed based on change information between, among the output signals of the N reflection-type distance measurement sensors, the sensor output signal covering the display change request button and the sensor output signal covering the portion adjacent to the display change request button.
[13]
The aerial image display input device according to any one of [7] to [12] above, wherein the aerial image projection unit and the input detection sensor form an integrated structure arranged such that the image transmission plate is substantially vertical and the three-dimensional space projection surface forms a surface inclined toward the user's line of sight with respect to the horizontal plane, and the angle of the integrated structure is variable.
[14]
The aerial image display input device according to any one of [7] to [13] above, wherein the aerial image display input device is connected to a handling processing device that performs medium processing or predetermined work with the user, and the control unit switches the image of the input operation buttons displayed on the display unit based on information from the handling processing device and outputs determination information on the pressing of an input operation button to the handling processing device.
 本発明によれば、空中像に対して利用者が直感的に操作可能な利便性の高い入力装置を、簡素な装置構成で実現することができる。 According to the present invention, a highly convenient input device that lets the user operate an aerial image intuitively can be realized with a simple device configuration.
本発明の実施例1に係る空中像表示入力装置1の外観斜視図である。An external perspective view of an aerial image display input device 1 according to Embodiment 1 of the present invention.
空中像表示入力装置1の側面図である。A side view of the aerial image display input device 1.
空中像表示入力装置1の構成例を示すブロック図である。A block diagram showing a configuration example of the aerial image display input device 1.
角度変更前の空中像表示入力装置1の側面図である。A side view of the aerial image display input device 1 before an angle change.
角度変更時の空中像表示入力装置1の側面図である。A side view of the aerial image display input device 1 during an angle change.
空中像表示入力装置1の角度変更に伴う利用者8の視線角度の変化を示す図である。A diagram showing the change in the line-of-sight angle of a user 8 accompanying an angle change of the aerial image display input device 1.
角度変更時の空中像表示入力装置1と利用者8との位置関係を示す図である。A diagram showing the positional relationship between the aerial image display input device 1 and the user 8 during an angle change.
入力用操作ボタン30に対する押下動作による入力操作のイメージ図である。A conceptual diagram of an input operation performed by pressing an input operation button 30.
押下動作による入力操作時に反射光距離センサ51~53が検出する距離情報の変化を示すグラフである。A graph showing changes in the distance information detected by reflected light distance sensors 51 to 53 during an input operation performed by pressing.
入力判定処理の処理手順例を示すフローチャートである。A flowchart showing an example of the procedure of the input determination processing.
入力用操作ボタン30に対するスワイプ操作のイメージ図である。A conceptual diagram of a swipe operation on the input operation button 30.
スワイプ操作時に反射光距離センサ51~53が検出する距離情報の変化を示すグラフである。A graph showing changes in the distance information detected by the reflected light distance sensors 51 to 53 during a swipe operation.
第2の入力判定処理の処理手順例を示すフローチャートである。A flowchart showing an example of the procedure of the second input determination processing.
空中像表示入力装置1を病院向けの受付精算機94に接続したときのシステムの外観図である。An external view of a system in which the aerial image display input device 1 is connected to a hospital reception settlement machine 94.
図9に示したシステムにおける空中像表示入力装置1と受付精算機94の制御部間の処理を模式的に示す図である。A diagram schematically showing the processing between the control units of the aerial image display input device 1 and the reception settlement machine 94 in the system shown in FIG. 9.
本発明の実施例2に係る空中像表示入力装置1Aの外観斜視図である。An external perspective view of an aerial image display input device 1A according to Embodiment 2 of the present invention.
実施例3における入力用操作ボタン30の表示例を示す図である。A diagram showing a display example of the input operation button 30 in Embodiment 3.
実施例4に係る空中像表示入力装置1Bの外観斜視図である。An external perspective view of an aerial image display input device 1B according to Embodiment 4.
空中像表示入力装置1Bの側面図である。A side view of the aerial image display input device 1B.
実施例5に係る空中像表示入力装置1Cの外観斜視図である。An external perspective view of an aerial image display input device 1C according to Embodiment 5.
空中像表示入力装置1Cによる三次元空間投射面10の表示画像例を示す図(その1)である。A diagram (part 1) showing an example of an image displayed on a three-dimensional space projection surface 10 by the aerial image display input device 1C.
空中像表示入力装置1Cによる三次元空間投射面10の表示画像例を示す図(その2)である。A diagram (part 2) showing an example of an image displayed on the three-dimensional space projection surface 10 by the aerial image display input device 1C.
図16Bに示した表示画像と空中像表示入力装置1Cの反射光距離センサの配置との関係を示す図である。A diagram showing the relationship between the display image shown in FIG. 16B and the arrangement of the reflected light distance sensors of the aerial image display input device 1C.
実施例6に係る空中像表示入力装置1Dの外観斜視図である。An external perspective view of an aerial image display input device 1D according to Embodiment 6.
空中像表示入力装置1Dの側面図である。A side view of the aerial image display input device 1D.
 以下、図面を参照して、本発明の実施例について詳述する。 Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
 図1は、本発明の第1の実施例(実施例1)に係る空中像表示入力装置1の外観斜視図である。また図2は、空中像表示入力装置1の側面図である。なお、図2の空中像表示入力装置1は、側方から見た断面を表すことで、内部構造を分かりやすく示している。そして図3は、空中像表示入力装置1の構成例を示すブロック図である。 FIG. 1 is an external perspective view of the aerial image display input device 1 according to the first embodiment (Embodiment 1) of the present invention. FIG. 2 is a side view of the aerial image display input device 1; it shows the device in cross section viewed from the side so that the internal structure is easy to see. FIG. 3 is a block diagram showing a configuration example of the aerial image display input device 1.
 図1に示すように、空中像表示入力装置1は、空中像投射ユニット2、入力検出センサ5、及び制御部7を備えて構成され、これらは筐体台座12の上に搭載された筐体11の中に実装される。なお、図1には、三次元空間投射面10に、入力用操作ボタン30と利用者操作の手80が模式的に示されている。三次元空間投射面10は、空中像表示入力装置1によって空中像(入力用操作ボタン30)が表示される三次元空間の投射面であり、手80は、空中像表示入力装置1を利用する利用者8の部位の一例である。詳細は後述するが、本実施例では、利用者8は、手80を三次元空間投射面10に投射された入力用操作ボタン30の位置に置くことによって、空中像表示入力装置1に対する入力操作を行うことができる。 As shown in FIG. 1, the aerial image display input device 1 comprises an aerial image projection unit 2, an input detection sensor 5, and a control unit 7, which are mounted in a housing 11 placed on a housing pedestal 12. FIG. 1 also schematically shows an input operation button 30 and a user's operating hand 80 on the three-dimensional space projection surface 10. The three-dimensional space projection surface 10 is the surface in three-dimensional space on which the aerial image display input device 1 displays an aerial image (the input operation button 30), and the hand 80 is an example of a body part of a user 8 using the aerial image display input device 1. As described in detail later, in this embodiment the user 8 can perform an input operation on the aerial image display input device 1 by placing the hand 80 at the position of the input operation button 30 projected on the three-dimensional space projection surface 10.
 また、図3のブロック図を参照すると、空中像表示入力装置1は、空中像投射ユニット2、入力検出センサ5、及び制御部7を備えて構成される。空中像投射ユニット2は、表示部3及び映像透過プレート4(光分岐部材40,再帰反射部材41)を有する。入力検出センサ5は、反射型の距離測定センサである反射光距離センサ51,52,53を有する。制御部7は、スピーカ74を内蔵し、入力判定処理部71、入出力I/F処理部72、及び画面制御処理部73を有する。また、空中像表示入力装置1は、別装置である取扱処理装置9に接続可能であり、取扱処理装置9は、制御部91、表示部92、及び処理ユニット93を備えて構成される。 Further, referring to the block diagram of FIG. 3, the aerial image display input device 1 includes an aerial image projection unit 2, an input detection sensor 5, and a control unit 7. The aerial image projection unit 2 has a display unit 3 and an image transmission plate 4 (light branch member 40, retroreflection member 41). The input detection sensor 5 has reflected light distance sensors 51, 52, and 53, which are reflection type distance measurement sensors. The control unit 7 has a built-in speaker 74, and has an input determination processing unit 71, an input / output I / F processing unit 72, and a screen control processing unit 73. Further, the aerial image display input device 1 can be connected to a handling processing device 9 which is a separate device, and the handling processing device 9 includes a control unit 91, a display unit 92, and a processing unit 93.
 以下に、空中像表示入力装置1の各構成について詳しく説明する。 The configurations of the aerial image display input device 1 will be described in detail below.
 空中像投射ユニット2は、液晶等による表示部3の画像を、内部の光の反射や透過作用によって、映像透過プレート4を透過して三次元空間投射面10に画像を結像させるユニットである。本実施例では、空中像表示入力装置1の内部において、上部に高輝度液晶の表示部3を配し、映像透過プレート4にはハーフミラー等の光分岐部材40を表示部3に対して45度程度の角度をもって配し、さらに、光分岐部材40で反射した光を同一方向に再帰反射する再帰反射部材41を配する。このような構造とすることにより、反射光が、光分岐部材40を透過し、三次元空間投射面10に結像し空中像を形成する。なお、一般的には、表示部3と三次元空間投射面10の空中像とは、映像透過プレート4に対して線対称となる。 The aerial image projection unit 2 is a unit that forms the image of the display unit 3 (a liquid crystal display or the like) on the three-dimensional space projection surface 10 by passing it through the image transmission plate 4 using internal reflection and transmission of light. In this embodiment, a high-brightness liquid crystal display unit 3 is arranged at the top inside the aerial image display input device 1, an optical branching member 40 such as a half mirror is arranged in the image transmission plate 4 at an angle of about 45 degrees to the display unit 3, and a retroreflective member 41 that retroreflects the light reflected by the optical branching member 40 back in the same direction is further arranged. With this structure, the reflected light passes through the optical branching member 40 and is focused on the three-dimensional space projection surface 10 to form an aerial image. In general, the aerial image on the three-dimensional space projection surface 10 is line-symmetric to the display unit 3 with respect to the image transmission plate 4.
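The line symmetry noted at the end of the paragraph above can be checked numerically: in the side view, the aerial image of a display point is its mirror across the plane of the image transmission plate. A minimal 2-D sketch, modeling the plate as the vertical line x = 0 (an assumed simplification of the geometry of FIG. 2):

```python
# Illustrative 2-D sketch only: the image transmission plate is modeled
# as the vertical line x = plate_x, an assumed simplification of Fig. 2.

def mirror_across_vertical_plate(x, y, plate_x=0.0):
    """Mirror a display point (x, y) across the vertical plate x = plate_x.

    The aerial image point and the display point are line-symmetric with
    respect to the plate, so only the x coordinate flips about plate_x.
    """
    return (2.0 * plate_x - x, y)
```

A display point inside the housing (negative x here) is thus imaged at the corresponding positive x in front of the plate, which is where the three-dimensional space projection surface 10 lies.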
 上記のような構成による空中像投射技術は、前述した先行技術文献等で広く公開されており、本実施例以外の構成を採用するものであってもよい。例えば、本実施例では、ハーフミラー等の光分岐部材40を、映像透過プレート4に採用し、再帰反射部材41との組合せで空中像を形成しているが、別の実施方法として、光分岐部材40を用いない映像透過プレート4も考案されており、このような別の実施方法でも、本実施例と同様に空中像の形成を実現することができる。 Aerial image projection techniques with the above configuration are widely disclosed in the prior art documents mentioned above, and configurations other than that of this embodiment may be adopted. For example, this embodiment uses an optical branching member 40 such as a half mirror in the image transmission plate 4 and forms an aerial image in combination with the retroreflective member 41; as another implementation, image transmission plates 4 that do not use an optical branching member 40 have also been devised, and such alternative implementations can likewise form an aerial image as in this embodiment.
 また、本実施例では、表示部3には5インチサイズの小型高輝度液晶を配し、光分岐部材40を、図2に示したように垂直面を形成する矩形形状とすることにより、三次元空間投射面10に、表示部3の5インチ液晶の画面サイズとほぼ同一サイズの矩形の表示領域を形成し、光分岐部材40と三次元空間投射面10とがなす角度θ1(図2参照)が約75度となるように配している。なお、上記表示領域の矩形は、矩形の各頂点A1,A2,A3,A4を用いると(図1参照)、矩形A1-A2-A3-A4とも表され、このような表記方法は、後述する他の矩形でも同様である。 Further, in this embodiment, a small 5-inch high-brightness liquid crystal panel is used as the display unit 3, and the optical branching member 40 is given a rectangular shape forming a vertical surface as shown in FIG. 2. This forms, on the three-dimensional space projection surface 10, a rectangular display area of approximately the same size as the 5-inch liquid crystal screen of the display unit 3, arranged so that the angle θ1 between the optical branching member 40 and the three-dimensional space projection surface 10 (see FIG. 2) is about 75 degrees. Using the vertices A1, A2, A3, A4 of the display area (see FIG. 1), this rectangle is also written as rectangle A1-A2-A3-A4; the same notation is used for the other rectangles described later.
 入力検出センサ5は、本実施例では、3個の反射光距離センサ51,52,53を備える。それぞれの反射光距離センサ51,52,53は、発光素子(例えば赤外線発光素子)及び受光素子を有し、発光素子及び受光素子の光軸方向にある対象物までの距離を検出することができる距離センサである。図1に示すように、3個の反射光距離センサ51,52,53は、直線状(図1における線分B1-B2上)に配置される。さらに反射光距離センサ51,52,53は、図2に示すように、光分岐部材40の下端近傍に配置され、発光素子と受光素子の光軸方向によって決まる入力検出方向6を含む入力検出面61(すなわち、図1に示す矩形B1-B2-B3-B4)が三次元空間投射面10と交差するように配置される。また、入力検出方向6(入力検出面61)と光分岐部材40とが、角度θ2をなして配置される。入力検出面61と三次元空間投射面10との交線の線分B3,B4は、三次元空間投射面10の辺A1-A4及び辺A2-A3の中点を結ぶ線分とほぼ同一となる配置とされている。本実施例では、三次元空間投射面10と光分岐部材40とがなす角度θ2は約10度としている。また、反射光距離センサ51,52,53は、それぞれの発光素子及び受光素子に対向した方向(図1の矢印付き実線を参照)に物体が存在するときの距離情報を取得することができるので、三次元空間投射面10の近傍にある利用者の手80が、上記矢印方向の領域に存在するときに、手80までの距離情報を取得し、手80の動きを入力として検出する。 In this embodiment, the input detection sensor 5 comprises three reflected light distance sensors 51, 52, and 53. Each of the reflected light distance sensors 51, 52, and 53 has a light emitting element (for example, an infrared light emitting element) and a light receiving element, and is a distance sensor that can detect the distance to an object lying in the direction of the optical axes of those elements. As shown in FIG. 1, the three reflected light distance sensors 51, 52, and 53 are arranged in a straight line (on the line segment B1-B2 in FIG. 1). Further, as shown in FIG. 2, the reflected light distance sensors 51, 52, and 53 are arranged near the lower end of the optical branching member 40 so that the input detection surface 61 containing the input detection direction 6 determined by the optical axes of the light emitting and light receiving elements (that is, the rectangle B1-B2-B3-B4 shown in FIG. 1) intersects the three-dimensional space projection surface 10. The input detection direction 6 (input detection surface 61) and the optical branching member 40 form an angle θ2. The line segment B3-B4, where the input detection surface 61 intersects the three-dimensional space projection surface 10, is arranged so as to substantially coincide with the line segment connecting the midpoints of sides A1-A4 and A2-A3 of the three-dimensional space projection surface 10. In this embodiment, the angle θ2 is set to about 10 degrees. Since the reflected light distance sensors 51, 52, and 53 can each obtain distance information when an object is present in the direction facing its light emitting and light receiving elements (see the solid arrows in FIG. 1), when the user's hand 80 near the three-dimensional space projection surface 10 enters the region in the arrow direction, the sensors obtain the distance to the hand 80 and detect the movement of the hand 80 as an input.
As shown in the block diagram of FIG. 3, the control unit 7 has an input determination processing unit 71, an input/output I/F processing unit 72, and a screen control processing unit 73. The input determination processing unit 71 is connected to the input detection sensor 5 (the reflected-light distance sensors 51, 52, and 53) and to a built-in speaker 74, and performs input determination processing that handles the user's operations on the input operation buttons 30. The input/output I/F processing unit 72 is connected to a handling processing device 9, a separate device that performs media handling and other predetermined tasks with the user, and performs input/output I/F processing that exchanges screen information and input information. The screen control processing unit 73 is connected to the display unit 3 and performs screen control processing that controls what the display unit 3 displays.
The handling processing device 9 is a device that receives user operations (input operations) made through the aerial image display input device 1 and executes the handling the user requests. As shown in the block diagram of FIG. 3, it includes a control unit 91, a display unit 92, and a processing unit 93. The control unit 91 controls the display unit 92, which has a touch panel, and the processing unit 93, which carries out multiple kinds of handling.
For the aerial image display input device 1 according to this embodiment configured as described above, the following describes the operations performed by a user 8 on the aerial image display input device 1 and the internal processing of the device.
To execute a desired handling on the handling processing device 9, the user 8 begins by selecting one of the available handlings; for example, when two kinds of handling (handling A and handling B) are provided, the user selects either handling A or handling B. At this point, the control unit 91 of the handling processing device 9 notifies the control unit 7 that a selection screen for handling A and handling B is to be displayed and one of the two selections is requested. In the control unit 7, the screen control processing unit 73 displays two input operation buttons representing handling A and handling B on the display unit 3. As a result, as shown in FIG. 1, the aerial image projection unit 2 displays on the three-dimensional space projection surface 10 two input operation buttons 30 representing the choice between handling A and handling B.
Next, the user 8 recognizes the input operation buttons 30 displayed on the three-dimensional space projection surface 10, holds the hand 80 over the input operation button 30 corresponding to the desired handling (here, handling A), and makes a motion as if pressing the button (a button-press motion). Among the reflected-light distance sensors 51, 52, and 53, the sensor covering the selected input operation button 30 (here, the reflected-light distance sensor 51) detects the user's hand 80 performing this button-press motion. Based on this detection by the reflected-light distance sensor 51, the input determination processing unit 71 of the control unit 7 detects the movement along the input detection direction 6 and determines that the handling A button has been pressed. The screen control processing unit 73 of the control unit 7 then informs the user that the displayed button has been pressed by switching the display unit 3 to a screen in which the color and shape of the button have changed, and a predetermined sound is output from the speaker 74. Furthermore, the input/output I/F processing unit 72 of the control unit 7 transmits the selection of handling A to the handling processing device 9.
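The sequence above — press detected, visual and audio feedback given, selection forwarded — can be sketched as simple event-driven logic. The following is a minimal illustration only; all function and class names are hypothetical and not taken from the patent.

```python
# Hypothetical sketch of the control unit 7 flow after a button press is
# detected: give visual/audio feedback, then report the selection to the
# handling processing device 9. Names are illustrative, not from the patent.

def handle_button_press(button, display, speaker, handling_device):
    """Process one detected press of an input operation button."""
    # Screen control processing unit 73: change the button's color/shape
    display.show_pressed_state(button)
    # Audible feedback from the built-in speaker 74
    speaker.play("press_confirmed")
    # Input/output I/F processing unit 72: forward the selection
    handling_device.send_selection(button)

class Recorder:
    """Test double that records every call made to it, in order."""
    def __init__(self):
        self.calls = []
    def show_pressed_state(self, b):
        self.calls.append(("display", b))
    def play(self, s):
        self.calls.append(("speaker", s))
    def send_selection(self, b):
        self.calls.append(("send", b))

rec = Recorder()
handle_button_press("A", rec, rec, rec)
print(rec.calls)  # feedback happens before the selection is sent
```

The ordering (feedback before transmission) matches the order in which the patent describes the steps, though the patent does not mandate a particular ordering.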
According to this embodiment, the user 8 can make the handling selection for the handling processing device 9 touchlessly, without touching the display unit 92 (touch panel) provided on the handling processing device 9, so the handling processing device 9 and the aerial image display input device 1 can be kept hygienic. In addition, even when operating a touch panel is difficult, for example for a user wearing gloves or a user with a hand impairment, the touchless operation remains easy, so a highly convenient input device can be provided.
The aerial image display input device 1 according to this embodiment is not limited to the above effects of enabling general touchless operation; it also provides the following effects.
First, because the three-dimensional space projection surface 10 on which the aerial image is formed is arranged at an angle θ1 (about 75 degrees) to the optical branching member 40, it faces the user's line of sight from below the viewpoint of the user 8 as shown in FIG. 2. It is therefore less affected by external incident or reflected light, and the aerial image has good visibility.
In addition, the three-dimensional space projection surface 10 is near the user's hand 80 and, as shown in FIG. 2, close to the nearly horizontal orientation in which the user 8 naturally holds out the palm. Because the palm (hand 80) and the three-dimensional space projection surface 10 are then nearly parallel, the user can easily hold the hand 80 over the position of an input operation button 30, and naturally lowering the palm vertically is an easy motion, which further improves operability.
Furthermore, as shown in FIG. 2, the input detection direction 6 forms an angle θ3 of about 90 degrees with the nearly horizontal orientation in which the user 8 naturally holds out the palm, and the input detection surface 61 is parallel to the direction in which the user 8 moves the palm (hand 80) in the button-press motion. This raises the detection sensitivity (accuracy) for the user's "push" operation.
In this embodiment, the input detection sensor 5 uses the reflected-light distance sensors 51, 52, and 53 to detect the movement of the palm of the user 8, but a device with equally good operability can be built using other sensor means such as a camera. In that case, however, the placement and detection method of the sensor means, such as a camera, differ from those used with the reflected-light distance sensors 51, 52, and 53.
Also, because the color and shape of the button display change and a sound is output after the button-press motion, the user 8 can recognize visually or audibly, instead of by touch, that the operation has been recognized by the device.
By providing the various effects described above, the aerial image display input device 1 according to this embodiment offers, with a simple device configuration, an input device that the user can operate intuitively on an aerial image, that is highly convenient, and that responds well (good operation sensitivity).
In this embodiment, as described above, a relatively small 5-inch liquid crystal panel is used for the display unit 3, and two input operation buttons 30 are displayed on the horizontal line segment B3-B4 on the three-dimensional space projection surface 10; this configuration improves the user's feel of operation. Specifically, if the three-dimensional space projection surface 10 is too large, the aerial image sits too far in front of the optical branching member 40 in the background, and some users find it hard to judge the distance to the aerial image. If the three-dimensional space projection surface 10 is too small, the input operation buttons 30 are too close to the optical branching member 40, and the hand 80 may touch the panel of the image transmission plate 4 during the button operation. Moreover, if input operation buttons 30 on the three-dimensional space projection surface 10 were arranged in the front-rear direction, the hand 80 would have to be moved diagonally to allow for the front-rear tilt of the three-dimensional space projection surface 10 (angle θ1 in FIG. 2), making operation difficult. The configuration of this embodiment eliminates these degradations of the user's feel of operation. As a result, the user 8 can operate with good visibility, using the optical branching member 40 in the background as a reference, and the hand movement required for operation becomes a small, simple pressing motion that need not account for depth or tilt. In other words, the aerial image display input device 1 according to this embodiment achieves touchless operation with high operability.
In the following, the detailed configuration and processing methods of the aerial image display input device 1 according to this embodiment are described with reference to FIGS. 4 to 10.
FIGS. 4A to 4D illustrate examples of angle changes in the aerial image display input device 1. Specifically, FIG. 4A is a side view of the aerial image display input device 1 before the angle change, and FIG. 4B is a side view of the device during the angle change. FIG. 4C shows the change in the line-of-sight angle of the user 8 that accompanies the angle change of the aerial image display input device 1, and FIG. 4D shows the positional relationship between the aerial image display input device 1 and the user 8 during the angle change. As shown in FIG. 4A, the aerial image display input device 1 is configured by mounting a housing 11 on a housing pedestal 12, and as shown in FIG. 4B, its angle can be changed along the arc of the housing 11's outer shape. In this embodiment, the optical branching member 40 can be rotated clockwise from the vertical through a range of up to an angle θ4 (for example, about 10 degrees). Although a detailed illustration is omitted, the aerial image display input device 1 of this embodiment may also be configured so that it can be rotated along the arc of the housing 11's outer shape in the direction opposite to FIG. 4B (counterclockwise) through a similar range of up to θ4 (for example, about 10 degrees). Making the angle of the housing 11 adjustable in this way allows the visibility and operability to be optimized when the aerial image display input device 1 is to be placed further below or above the line of sight of the user 8 shown in FIG. 2. More specifically, when the aerial image display input device 1 is to be placed below the position of FIG. 2, rotating it clockwise gives favorable visibility and operability; when it is to be placed above the position of FIG. 2, rotating it counterclockwise gives favorable visibility and operability. This is described in detail below with reference to FIGS. 4C and 4D.
FIG. 4C shows the line-of-sight angle θ5 (θ5A, θ5B) of the user 8 with respect to the three-dimensional space projection surface 10 (10A, 10B) when the placement and angle of the aerial image display input device 1 are changed in combination, and FIG. 4D schematically shows the positional relationship between the user 8 and the aerial image display input device 1 for those combined changes. When the aerial image display input device 1 of FIG. 2 is to be placed below the position shown in FIG. 2, leaving the angle unchanged simply moves the three-dimensional space projection surface 10 downward; the line-of-sight angle θ5 from the user 8 to the three-dimensional space projection surface 10 then changes, and the aerial image is expected to become harder for the user 8 to see. If instead the aerial image display input device 1 is rotated clockwise, the three-dimensional space projection surface 10A after the rotation is tilted clockwise relative to the three-dimensional space projection surface 10 before the rotation, so the line-of-sight angle θ5A from the user 8 can be kept close to the line-of-sight angle θ5 of the initial placement (the case of FIG. 2), keeping the aerial image easy for the user 8 to see (see FIG. 4C). Furthermore, if the aerial image display input device 1 is simply placed lower, the distance from the user 8 to the aerial image (three-dimensional space projection surface 10A) becomes greater than in the state of FIG. 2, and the user 8 may find operating on the aerial image awkward. For this reason, when placing the aerial image display input device 1 lower, shifting it toward the user 8 is preferable from the standpoint of operability. Taking the above into account, as also shown in FIG. 4D, when the aerial image display input device 1 is to be placed below the position of FIG. 2, rotating it clockwise and placing it on the near side (front side) displays the aerial image at an easy-to-view angle in a position where the user 8 can operate it comfortably at hand, so visibility and operability can be optimized. Conversely, when the aerial image display input device 1 of FIG. 2 is to be placed above the position shown in FIG. 2, leaving the angle unchanged simply moves the three-dimensional space projection surface 10 upward; the line-of-sight angle θ5 from the user 8 then changes, and the aerial image is expected to become harder for the user 8 to see. If instead the aerial image display input device 1 is rotated counterclockwise, the three-dimensional space projection surface 10B after the rotation is tilted counterclockwise relative to the three-dimensional space projection surface 10 before the rotation, so the line-of-sight angle θ5B from the user 8 can be kept close to the line-of-sight angle θ5 of the initial placement (the case of FIG. 2), keeping the aerial image easy for the user 8 to see (see FIG. 4C). Furthermore, if the aerial image display input device 1 is simply placed higher, the distance from the user 8 to the aerial image (three-dimensional space projection surface 10B) becomes too short compared with the state of FIG. 2, and the user 8 may find operating on the aerial image awkward. For this reason, when placing the aerial image display input device 1 higher, shifting it away from the user 8 is preferable from the standpoint of operability. Taking the above into account, as also shown in FIG. 4D, when the aerial image display input device 1 is to be placed above the position of FIG. 2, rotating it counterclockwise and placing it on the far side (rear side) displays the aerial image at an easy-to-view angle in a position where the user 8 can operate it comfortably with the hand 80 extended forward, so visibility and operability can be optimized.
FIGS. 5A and 5B illustrate an example of the placement and operation of the reflected-light distance sensors 51, 52, and 53. Specifically, FIG. 5A is a conceptual view of an input operation performed by pressing an input operation button 30, and FIG. 5B is a graph of the change in the distance information detected by the reflected-light distance sensors 51 to 53 during that press input. FIG. 5A schematically shows an example placement of the two input operation buttons 30 and the three reflected-light distance sensors 51, 52, and 53, together with an image of the hand 80 of the user 8 performing a press input on the left input operation button 30 (the "left" button). FIG. 5B plots the distance information detected by the reflected-light distance sensors 51, 52, and 53 in the situation of FIG. 5A, that is, when the user 8 performs a press input on the left button; the horizontal axis of the graph is time and the vertical axis is distance information (sensor output, distance).
In the description of the graph below, the distance information detected by the reflected-light distance sensors 51, 52, and 53 is called distance information L, C, and R, respectively. In the graph of FIG. 5B, distance information L is drawn as a broken line, distance information C as a dash-dot line, and distance information R as a solid line. In this graph, the sensor output represents the distance from each of the reflected-light distance sensors 51, 52, and 53; specifically, the distance corresponding to the sensor output "H2" is the distance to the aerial image (three-dimensional space projection surface 10). Therefore, when a sensor output falls below H2, it can be judged that a press on the aerial image has been performed.
According to FIG. 5B, distance information L starts changing at time 0.2 s and reaches the press position (the distance of the aerial image) over the following 0.2 s or so; by acquiring such distance information L, the reflected-light distance sensor 51 can detect a press on the left input operation button 30. At this time, the adjacent distance information C also changes together with distance information L, but it does not reach the press position, and distance information R does not change at all.
The above changes in distance information L, C, and R are only one example; the ways in which the user 8 presses the input operation buttons 30 vary widely, and so do the resulting changes in distance information L, C, and R. For these various changes in distance information L, C, and R, the input determination processing unit 71 of the control unit 7 determines the input to the input operation buttons 30 by performing the input determination processing described below.
In this example, to simplify the input determination processing, four determination levels (thresholds) H0, H1, H2, and H3 are set on the distance information (sensor output), as shown on the vertical axis of FIG. 5B, and the state is classified according to which pair of levels the distance information lies between. Specifically, distance information (sensor output) between H1 and H0 is classified as "nothing detected", between H2 and H1 as "detected above the aerial image", and between H3 and H2 as "detected from near the aerial image downward".
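The threshold classification above can be expressed as a small lookup function. The sketch below is illustrative only: the numeric values of H0 through H3 are invented for the example (larger output meaning farther from the sensor, consistent with H0 > H1 > H2 > H3 on the graph's axis) and do not appear in the patent.

```python
# Hypothetical sketch of the H0-H3 state classification described above.
# The concrete threshold values are illustrative, not from the patent;
# only their ordering H0 > H1 > H2 > H3 reflects the text.
H0, H1, H2, H3 = 400, 300, 200, 100  # example sensor-output levels

def classify(output):
    """Map one distance-sensor output to the three states in the text."""
    if H1 <= output <= H0:
        return "no detection"
    if H2 <= output < H1:
        return "above aerial image"
    if H3 <= output < H2:
        return "at or below aerial image"
    return "out of range"

print(classify(350))  # no detection
print(classify(250))  # above aerial image
print(classify(150))  # at or below aerial image
```

Classifying each raw reading into one of a few discrete states, as here, is what lets the subsequent press-judgment flow (FIG. 6) use simple logic rather than analyzing raw distance curves.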
FIG. 6 is a flowchart showing an example procedure of the input determination processing. By executing the processing along the flowchart of FIG. 6, the input determination processing unit 71 uses the determination levels H0, H1, H2, and H3 to determine, from distance information L, C, and R, the input to the input operation buttons 30.
In the flowchart of FIG. 6, the group of steps S101 to S104 and S111 to S113 is denoted step S10, and the group of steps S201 to S208 and S211 to S218 (excluding S204 and S214) is denoted step S20. Step S10 is the processing for button detection; in detail, it detects that the hand 80 or the like of the user 8 has begun an operation on one of the input operation buttons 30, which are aerial images. Step S20 is the processing for detecting the pressed state of a button; in detail, it detects that a press has been performed on the input operation button 30 detected in step S10.
The outline of step S10 is as follows. The input determination processing unit 71 determines whether distance information L or distance information R is detected (step S101). When distance information L is detected in step S101, it detects the start of an operation on the input operation button 30 corresponding to distance information L (the "left" button) by judging whether distance information L has a value between H2 and H1 and is smaller than distance information C (steps S102 to S104). When distance information R is detected in step S101, it detects the start of an operation on the input operation button 30 corresponding to distance information R (the "right" button) by performing the same judgment on distance information R as described above for distance information L (steps S111 to S113).
Next, the input determination processing unit 71 executes step S20 based on the result of step S10 (steps S104 and S113). The outline of step S20 is as follows.
When the start of an operation on the input operation button 30 corresponding to distance information L (the "left" button) is detected in step S10, the input determination processing unit 71 uses a timer for judging the press (a 100 ms timer in this example) and detects a press of the "left" button by judging whether distance information L stays below H2 (that is, the distance indicated by distance information L is closer than the distance to the aerial image) for 100 ms or longer (steps S201 to S203 and S205 to S208). When a press is detected in this way, the input determination processing unit 71 outputs the input determination result that the "left" button has been pressed (step S204).
When, on the other hand, the start of an operation on the input operation button 30 corresponding to distance information R (the "right" button) is detected in step S10, the input determination processing unit 71 detects a press of the "right" button by performing the same judgment on distance information R as described above for distance information L (steps S211 to S213 and S215 to S218). When a press is detected in this way, the input determination processing unit 71 outputs the input determination result that the "right" button has been pressed (step S214).
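The two-stage judgment of FIG. 6 (operation start in S10, press confirmation with the 100 ms timer in S20) can be sketched as follows for the "left" button alone. This is a minimal illustration under stated assumptions: the threshold values, the sample format, and the function name are invented for the example and are not from the patent.

```python
# Hypothetical sketch of the FIG. 6 flow (steps S10/S20) for the "left"
# button. It watches distance information L and C over time:
#   S10: operation start when L is between H2 and H1 and smaller than C;
#   S20: press confirmed when L stays below H2 for >= 100 ms.
# Threshold values and trace data are illustrative, not from the patent.
H0, H1, H2, H3 = 400, 300, 200, 100   # example thresholds (sensor output)
DEBOUNCE_MS = 100                     # press-confirmation timer (100 ms)

def detect_left_press(samples):
    """samples: list of (time_ms, L, C); True if a left press is confirmed."""
    started = False
    press_since = None
    for t, L, C in samples:
        if not started:
            # S10: hand has entered the region above the left button
            started = H2 <= L < H1 and L < C
            continue
        # S20: L below the aerial-image level H2 counts toward the press
        if L < H2:
            if press_since is None:
                press_since = t
            elif t - press_since >= DEBOUNCE_MS:
                return True  # S204: output "left button pressed"
        else:
            press_since = None  # hand lifted; restart the timer
    return False

# A trace shaped like FIG. 5B: the hand approaches, then dwells below H2.
trace = [(0, 350, 380), (200, 250, 300), (300, 180, 260),
         (350, 170, 250), (450, 160, 240)]
print(detect_left_press(trace))
```

Requiring L to be smaller than C in the start condition, and requiring the dwell below H2 to persist for the full timer interval, mirrors how the flowchart rejects brief or off-button motions.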
 In the aerial image display input device 1 according to this embodiment, by performing the input determination processing described above, when the pressing actions of the user 8 on the two aerial-image buttons (input operation buttons 30) are detected using the three distance sensors (reflected-light distance sensors 51 to 53), a press of either button can be detected from a single output of the sensor corresponding to the left (L) side (reflected-light distance sensor 51) or the right (R) side (reflected-light distance sensor 53). This detection method is extremely accurate and can be judged with simple logic, so processing can be performed with high sensitivity and at high speed. In addition, no complicated configuration is required of the distance sensors or the control unit 7, so a compact device can be realized with a simple configuration.
 Furthermore, in the detection method based on the above input determination processing, the output of the sensor for distance information C (reflected-light distance sensor 52) is monitored to detect movement of the hand 80 or the like over areas other than the buttons, which prevents false detection caused by erroneous actions other than button operations.
 Note that since the infrared reflected-light distance sensors 51 to 53 have limited detection areas, if a displayed button (input operation button 30) is larger than the detection area, there may be regions where an input operation (button press) cannot be detected, and the detectable area may shrink when the user 8 operates with a thin fingertip or the like. In consideration of these situations, it is preferable that the button images displayed on the display unit 3 be buttons whose position and size substantially match the detection areas of the sensors. Configuring the button images in this way realizes an input device with excellent operability for the aerial image (input operation buttons 30) projected from the display unit 3 onto the three-dimensional space projection surface 10.
 So far, this embodiment has described input operations in which the user 8 makes a selection by pressing a button as the transaction input information. However, the input operations supported by the aerial image display input device 1 of this embodiment are not limited to button presses: for example, when there are multiple selectable items such as "Transaction A", "Transaction B", "Transaction C", and "Transaction D", the device can also support an operation that brings transactions not currently shown on the screen into view (hereinafter referred to as a "turn" operation).
 FIGS. 7A and 7B illustrate another example of the arrangement and operation of the reflected-light distance sensors 51, 52, and 53. Specifically, FIG. 7A is a conceptual diagram of a swipe operation on the input operation buttons 30, and FIG. 7B is a graph showing the changes in distance information detected by the reflected-light distance sensors 51 to 53 during the swipe operation. As in FIG. 5A, FIG. 7A shows an example arrangement of the three reflected-light distance sensors 51, 52, and 53. However, whereas FIG. 5A displayed two circular buttons ("Left" and "Right") as the input operation buttons 30, in FIG. 7A a circular "Left" button is displayed on the left side and an arrow-shaped "Turn" button is displayed on the right side. When the user 8 performs a predetermined "turn" operation, the "Turn" button cycles the transaction shown on the "Left" button in a predetermined order, allowing the user 8 to bring the desired transaction onto the "Left" button. For example, in FIG. 7A, assume that "Transaction A" is shown on the "Left" button. The detailed processing procedure is described later with reference to FIG. 8, but in this example, either pressing the "Turn" button (press operation) or moving the hand sideways from the vicinity of the "Turn" button in the direction of the arrow (swipe operation) can be accepted as the predetermined "turn" operation on the "Turn" button. FIG. 7B is a graph of the situation shown in FIG. 7A, that is, of the changes in distance information detected by the reflected-light distance sensors 51, 52, and 53 when the user 8 performs a swipe operation on the "Turn" button; the horizontal axis represents time and the vertical axis represents distance information (sensor output, distance).
 According to FIG. 7B, distance information L begins to change at time 0.2 s, distance information C begins to change about 0.2 s later, and distance information R begins to change about another 0.2 s after that. Each sensor (reflected-light distance sensors 51 to 53) acquires this information independently.
 In response to the changes in distance information L, C, and R, the input determination processing unit 71 of the control unit 7 determines the input to the input operation buttons 30 by performing the following input determination processing (referred to as the second input determination processing when distinguishing it from the input determination processing described with reference to FIGS. 5 and 6). In the second input determination processing as well, to simplify the processing, the four judgment levels (thresholds) H0, H1, H2, and H3 are provided, as in the input determination processing of FIGS. 5 and 6, and the distance information is classified into three states.
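The three-state classification mentioned above could look like the following minimal sketch. The four levels H0 to H3 appear in the text, but their numeric values, their exact roles (the patent does not state which levels bound which state, and in practice the paired levels may serve as hysteresis bands), and the state names used here are all assumptions.

```python
# Assumed threshold values (mm), ordered far -> near; the patent names the
# levels H0, H1, H2, H3 but gives neither values nor exact roles.
H0, H1, H2, H3 = 120, 90, 60, 30

def classify(distance_mm):
    """Map a raw distance reading onto one of three coarse states.

    A real device would likely use the paired levels (H0/H1, H2/H3) as
    hysteresis bands so the state does not chatter near a boundary; this
    sketch uses single boundaries for brevity.
    """
    if distance_mm >= H1:
        return "NO_OBJECT"    # nothing in front of the button
    if distance_mm >= H2:
        return "APPROACHING"  # hand near the aerial-image plane
    return "PRESSED"          # hand has crossed the plane

print(classify(150), classify(70), classify(20))  # NO_OBJECT APPROACHING PRESSED
```

Reducing each raw reading to one of three states is what allows the subsequent flowchart logic to remain a handful of simple comparisons.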
 FIG. 8 is a flowchart showing an example of the processing procedure of the second input determination processing. The procedure shown in FIG. 8 has much in common with the input determination procedure shown in FIG. 6, and description of the common parts is basically omitted.
 In the flowchart of FIG. 8, the group of steps S101 to S104, S111, and S113 is denoted step S30, and the group of steps S201 to S208, S211 to S213, and S402 to S407 (excluding steps S204 and S403) is denoted step S40. Step S30 is processing for detecting that the hand 80 or the like of the user 8 has started an operation on one of the aerial-image input operation buttons 30, and step S40 is processing for detecting that a predetermined action (a press in the case of the "Left" button; a press or a swipe in the case of the "Turn" button) has been performed on the input operation button 30 detected in step S30.
 The detailed procedure of step S30 is the procedure of step S10 in FIG. 6 with the processing of step S112 removed. Step S112 is a preliminary judgment for press detection: when distance information R corresponding to the right button is detected, it judges whether distance information R indicates a distance farther than the aerial image (button). In step S30, however, the right button is the "Turn" button, which does not require a press judgment (it is sufficient that the detected object (hand 80) be in the vicinity of the button), so step S112 is removed. The detailed procedure is omitted here to avoid repetition, but the processing of step S30 detects the start of an operation on the input operation button 30 corresponding to distance information L or distance information R.
 Next, based on the processing results of step S30 (steps S104 and S113), the input determination processing unit 71 executes the processing of step S40. An overview of the processing in step S40 is as follows.
 When step S30 detects the start of an operation on the input operation button 30 ("Left" button) corresponding to distance information L, the input determination processing unit 71 detects a press of the "Left" button by performing the same processing as in step S20 of FIG. 6 (steps S201 to S203, S205 to S208). When a press is detected, the input determination processing unit 71 outputs an input determination result indicating that the "Left" button was pressed (step S204).
 On the other hand, when step S30 detects the start of an operation on the input operation button 30 ("Turn" button) corresponding to distance information R, the input determination processing unit 71, in order to detect whether the predetermined "turn" operation has been performed on the "Turn" button, executes either processing that detects a press of the "Turn" button based on changes in distance information R (steps S211 to S213) or processing that detects a swipe over the "Turn" button based on changes in distance information C (steps S402, S404 to S407). When a press is detected, the input determination processing unit 71 outputs an input determination result indicating a "Turn" press (step S401); when a swipe is detected, it outputs an input determination result indicating a "Turn" swipe (step S403). When the result of a "turn" operation is output in step S401 or S403 in this way, the screen control processing unit 73 of the control unit 7 switches the display image of the "Left" button on the display unit 3 to the next transaction according to a predetermined transition order; specifically, for example, the display image is switched from "Transaction A" to "Transaction B".
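The branch described above, a press versus a swipe on the "Turn" button, can be sketched roughly as follows. Note one simplification: the patent detects the swipe from changes in distance information C, whereas this sketch classifies using the staggered L, C, R onsets shown in FIG. 7B; all thresholds, timings, and names here are illustrative assumptions.

```python
H2 = 60         # assumed "object present" threshold (mm)
MAX_SPAN = 1.0  # assumed max total duration (s) of the L -> C -> R sweep

def first_onset(trace):
    """Return the time of the first reading below H2, or None.

    trace: list of (time_s, distance_mm) samples from one sensor.
    """
    for t, d in trace:
        if d < H2:
            return t
    return None

def detect_turn_gesture(left, center, right):
    """Classify the three sensor traces as SWIPE, PRESS, or NONE."""
    tl, tc, tr = first_onset(left), first_onset(center), first_onset(right)
    if None not in (tl, tc, tr) and tl < tc < tr and tr - tl <= MAX_SPAN:
        return "SWIPE"  # staggered onsets across L -> C -> R, as in FIG. 7B
    if tr is not None and tl is None and tc is None:
        return "PRESS"  # only the right ("Turn") sensor sees the hand
    return "NONE"

result = detect_turn_gesture(
    [(0.0, 100), (0.2, 50)],  # L starts to drop at 0.2 s
    [(0.0, 100), (0.4, 50)],  # C about 0.2 s later
    [(0.0, 100), (0.6, 50)],  # R another 0.2 s later
)
print(result)  # SWIPE
```

The ordering test `tl < tc < tr` is what makes the classification direction-sensitive: a hand sweeping the other way, or an object covering all sensors at once, does not register as a swipe.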
 In the aerial image display input device 1 according to this embodiment, by performing the second input determination processing described above, even when selecting from three or more transactions, only two aerial-image buttons (input operation buttons 30) are needed: by making one of them the "Turn" button, the user 8 can select the desired transaction simply by repeating two intuitive and simple actions, a horizontal swipe and a vertical press. Therefore, users accustomed to swipe operations on smartphones and tablets, as well as users unfamiliar with swiping who are only used to pressing, can easily perform touchless operation.
 Next, an example of how the aerial image display input device 1 according to this embodiment can be used will be described.
 FIG. 9 is an external view of a system in which the aerial image display input device 1 is connected to a hospital reception and payment machine 94.
 The hospital reception and payment machine 94 is an example of the transaction processing device 9: before a consultation, a patient (the user) presents a patient registration card and selects the clinical department to visit, completing reception processing; after the consultation, to settle the consultation fee, the patient confirms the amount due, inserts the corresponding cash, and completes payment processing.
 In the case of FIG. 9, the aerial image display input device 1 is placed to the right of the cash slot of the reception and payment machine 94, so that it faces the operating user and is positioned below the user's line of sight as shown in FIG. 2. As shown in FIG. 9, the user can operate either the "Reception" and "Payment" input operation buttons 96 displayed on the display unit 95 of the reception and payment machine 94 or the "Reception" and "Payment" input operation buttons 30 displayed as an aerial image by the aerial image display input device 1. The user can therefore select a transaction by operating either set of buttons.
 FIG. 10 schematically shows the processing between the control units of the aerial image display input device 1 and the reception and payment machine 94 in the system shown in FIG. 9, for the case where the user 8 selects the "Reception" transaction.
 In this example, three display screens (screens A1 to A3) that the aerial image display input device 1 can display are defined in advance between the aerial image display input device 1 and the control unit of the reception and payment machine 94. The reception and payment machine 94 also defines in advance four display screens of its own (screens B1 to B4).
 As shown in FIG. 10, when user operation begins, the aerial image display input device 1 receives from the reception and payment machine 94 an instruction to display screen A1, the selection screen for "Reception" and "Payment", and displays screen A1 on the display unit 3. When an input operation is made on screen A1, the aerial image display input device 1 transmits the input key information to the reception and payment machine 94. The reception and payment machine 94 also displays screen B1 on its own display unit 95.
 When "Reception" is selected on screen A1 or screen B1, the reception and payment machine 94 displays screen B2, a guide for reading the patient registration card, on the display unit 95, and transmits to the aerial image display input device 1 an instruction to display screen A2, a screen on which "Cancel" can be selected. Upon receiving this instruction, the aerial image display input device 1 displays screen A2 on the display unit 3.
 Thereafter, when the reception and payment machine 94 finishes reading the patient registration card, it displays screen B3, the clinical department selection screen, on the display unit 95, and transmits to the aerial image display input device 1 an instruction to display screen A3, a screen on which a clinical department can be selected. Upon receiving this instruction, the aerial image display input device 1 displays screen A3 on the display unit 3. Screen A3 displayed by the aerial image display input device 1 is preferably a display screen that uses the "Turn" button described with reference to FIGS. 7 and 8; as also shown in FIG. 10, when the "Turn" button on the right side is operated, the transaction (clinical department) shown on the left button transitions accordingly. When an input operation (an operation selecting a clinical department) is made on screen A3, the aerial image display input device 1 transmits the input key information to the reception and payment machine 94.
 Then, when a clinical department is selected on screen B3 shown on its own display unit 95, or when the reception and payment machine 94 receives from the aerial image display input device 1 the input key information for screen A3 (the input information of the selected clinical department), it internally executes predetermined reception processing, and upon completion displays screen B4, which announces that reception is complete, on the display unit 95. This completes the reception processing for one user (patient).
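The screen handshake of FIG. 10 can be summarized in a loose sketch like the one below. The dispatch function and the "Cancel" transition are hypothetical; the patent only defines screens A1 to A3 and B1 to B4 and the exchange of display instructions one way and input key information the other.

```python
def next_machine_screen(aerial_screen, key):
    """Machine-side reaction to key info forwarded by the input device.

    Returns the machine screen (B1-B4) to show next, or None if the key
    causes no transition. The "cancel" branch is an assumption.
    """
    if aerial_screen == "A1" and key == "reception":
        return "B2"  # show the card-reading guide; device is told to show A2
    if aerial_screen == "A2" and key == "cancel":
        return "B1"  # assumed: cancelling returns to the top screen
    if aerial_screen == "A3":
        return "B4"  # a clinical department was chosen -> reception complete
    return None

print(next_machine_screen("A1", "reception"))  # B2
print(next_machine_screen("A3", "internal medicine"))  # B4
```

The key point of the design is that the input device never interprets the transaction itself; it only forwards key information, and the machine drives all screen transitions on both displays.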
 As described above, the aerial image display input device 1 according to this embodiment not only provides, in a compact device of simple configuration, an aerial-image input device that is intuitive, responsive, and easy for the user to use, but also, by being connected to an existing transaction processing device, realizes an easy touchless input device.
 Moreover, while connected to the transaction processing device 9 (reception and payment machine 94), the aerial image display input device 1 according to this embodiment allows the user to operate either the aerial image display input device 1 or the transaction processing device 9. Even if a user finds aerial-image operation unsuitable, the user can simply operate the transaction processing device 9, so a highly convenient input system can be provided.
 Furthermore, in this embodiment, configuring the aerial image display input device 1 separately from the transaction processing device 9 makes it easy to add the aerial image display input device 1 to various existing transaction processing devices. In doing so, not only on the hardware side but also on the software side, only the logic for processing input information from the aerial image display input device 1 needs to be added, without changing the screens or operating specifications of the application software; this makes software changes easy, keeps deployment simple, and offers cost advantages. In addition, since the aerial image display input device 1 is a compact device that merely displays input buttons, it can easily be placed near the operation units of various types of processing devices, securing operability while limiting the increase in overall installation space.
 FIG. 11 is an external perspective view of an aerial image display input device 1A according to a second embodiment (Embodiment 2) of the present invention. The aerial image display input device 1A of Embodiment 2 changes the arrangement relationship between the input detection sensors and the input operation buttons from Embodiment 1. Specifically, it differs from the aerial image display input device 1 of Embodiment 1, which displays two input operation buttons 30, in that three input operation buttons 31 are arranged facing the three reflected-light distance sensors 51, 52, and 53, respectively.
 As shown in FIG. 11, in the aerial image display input device 1A the distance between the input operation buttons 31 becomes smaller, but increasing the number of selectable transactions from two to three has the advantage of broadening the range of applications.
 In Embodiment 1 described above, a total of three input detection sensors 5 (reflected-light distance sensors 51, 52, 53) are arranged in a straight line on both sides of, and between, the two input operation buttons 30. The sensors on the two sides (reflected-light distance sensors 51 and 53) detect presses of the respective input operation buttons 30, while the remaining sensor (reflected-light distance sensor 52) detects movement of the hand 80 (swipe operation) between the two input operation buttons 30. This allows the simple logical input determination processing shown in FIG. 6 to prevent erroneous input caused by the movement of objects larger than the hand 80, for example the baggage of the user 8 or the movement of body parts other than the user's hand 80. In Embodiment 2, on the other hand, the three input detection sensors 5 (reflected-light distance sensors 51, 52, 53) are arranged in a straight line facing the three input operation buttons 31, and because the buttons are closer together, inappropriate input determination may occur for the kinds of erroneous operations described above. To prevent this, it is preferable to prepare detection means and input determination processing different from those of Embodiment 1.
 In the aerial image display input device 1 (1A) according to the present invention, essentially the same relationship between the number of buttons and the number of sensors holds even as the number of input operation buttons 30 (31) increases. That is, the most efficient configuration in terms of device performance and component cost is to arrange N input detection sensors 5 (where N is two or more) in a straight line and display fewer than N (that is, N-1 or fewer) input operation buttons 30 (31). In principle, however, it is also possible to adopt a configuration that arranges N input detection sensors 5 and displays the same number N of input operation buttons 30 (31).
 In the present invention, when the number of input operation buttons is increased further to three or more, the number of reflected-light distance sensors constituting the input detection sensor may be increased accordingly. In that case, an integrated line-type detection sensor, in which the elements are arranged along a line, may be adopted as the reflected-light distance sensor.
 A third embodiment (Embodiment 3) of the present invention assumes a configuration in which the aerial image display input device 1 described in Embodiment 1 is connected to the call buttons installed on each floor for a building's elevator.
 FIG. 12 shows a display example of the input operation buttons 30 in Embodiment 3. In Embodiment 3, the aerial image display input device 1 connected to the call buttons displays call-button images on the display unit 3, thereby displaying the input operation buttons 30 on the three-dimensional space projection surface 10. More specifically, as shown in FIG. 12, an up call button 30A for calling an elevator car for upward travel and a down call button 30B for calling an elevator car for downward travel are displayed on the three-dimensional space projection surface 10. When the user 8 performs a predetermined operation on the up call button 30A or the down call button 30B (a press as the basic operation, though a swipe may also be accepted), the aerial image display input device 1 is configured to output the input based on that operation directly to the elevator control device as the button information of the elevator call button, so that it can output button information as a substitute for the elevator call button. As a result, the aerial image display input device 1 according to Embodiment 3 can convert the call push buttons installed on each floor of an existing elevator to touchless operation.
 A fourth embodiment (Embodiment 4) of the present invention describes an aerial image display input device 1B for the case where the number of input operation buttons is increased beyond those of the embodiments described above.
 FIG. 13 is an external perspective view of the aerial image display input device 1B according to Embodiment 4, and FIG. 14 is a side view of the aerial image display input device 1B. In FIGS. 13 and 14, components shared with FIGS. 1 and 2 are given the same reference numerals, and their description is omitted.
 Compared with the aerial image display input device 1 shown in FIGS. 1 and 2, in the aerial image display input device 1B shown in FIGS. 13 and 14 the input detection sensor 5 becomes an input detection sensor 5B, and the single input detection surface 61 becomes two input detection surfaces 62 and 63. The input detection sensor 5B is configured by arranging five sensors aligned on one axis (distance sensors each consisting of a light-emitting element and a light-receiving element) along each of two axes. More specifically, five sensors are arranged on each of line segments B1-B2 and C1-C2, forming a two-plane input detection region consisting of input detection surface 62 (B1-B2-B3-B4) and input detection surface 63 (C1-C2-C3-C4). The three-dimensional space projection surface 10 is the plane containing line segments B1-B2 and C1-C2; for example, ten input operation buttons 32 are displayed, and by assigning the digits "0" to "9" to the buttons, a single digit can be selected and input.
 The aerial image display input device 1B described above can be realized with the same configuration as the aerial image display input device 1 of Embodiment 1, except that the number of sensors constituting the input detection sensor 5B is increased. However, because there are now two input detection surfaces 62 and 63, when the user 8 tries to operate an input operation button 32 displayed on line segment B3-B4 of the rear input detection surface 62, the hand 80 of the user 8 also blocks the front input detection surface 63, so the outputs of the distance sensors on both axes change. This problem can be solved easily in technical terms by adding, to the input determination processing performed by the input determination processing unit 71, logic that determines which button is the target of the operation. Therefore, according to the aerial image display input device 1B of Embodiment 4, in addition to the effects obtained in the other embodiments, the number of input choices can be increased and the range of application expanded.
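The disambiguation logic mentioned above can be sketched as follows. This is only an illustrative outline, not the actual processing of the input determination processing unit 71, and all names are hypothetical. The idea is that a hand reaching a button on the rear input detection surface 62 necessarily also shades the front input detection surface 63, so a column blocked on both axes is attributed to the rear row, while a column blocked only on the front axis is attributed to the front row.

```python
# Hypothetical sketch of the two-surface disambiguation logic.
# front_hits / rear_hits: one boolean per sensor column on each axis,
# True where that column's distance sensor currently detects an object.

def resolve_button(front_hits, rear_hits):
    """Return ('rear', col), ('front', col), or None.

    A hand reaching a rear-surface button also shades the front surface,
    so a column blocked on both axes is treated as a rear-button press;
    a column blocked only on the front axis is a front-button press.
    """
    for col, (f, r) in enumerate(zip(front_hits, rear_hits)):
        if r and f:
            return ("rear", col)   # both surfaces blocked -> rear row intended
    for col, f in enumerate(front_hits):
        if f:
            return ("front", col)  # front surface only -> front row intended
    return None
```

With ten buttons arranged as two rows of five, `col` would then select the digit within the resolved row.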
 In the fifth embodiment (Embodiment 5) of the present invention, an aerial image display input device 1C in which five input operation buttons are arranged will be described.
 FIG. 15 is an external perspective view of the aerial image display input device 1C according to Embodiment 5. FIGS. 16A and 16B (No. 1 and No. 2) show examples of images displayed on the three-dimensional space projection surface 10 by the aerial image display input device 1C, and FIG. 17 shows the relationship between the display image of FIG. 16B and the arrangement of the reflected-light distance sensors of the aerial image display input device 1C.
 The aerial image display input device 1C of Embodiment 5 changes the arrangement of the input detection sensors (reflected-light distance sensors) and the input operation buttons relative to the aerial image display input device 1 of Embodiment 1. Specifically, five input operation buttons 33 to 37 are arranged facing five reflected-light distance sensors 54 to 58. Of these, the three input operation buttons 33, 34, and 35 facing the three reflected-light distance sensors 55, 56, and 57 are buttons for press-type input (selection) operations, while the two input operation buttons 36 and 37 facing the two reflected-light distance sensors 54 and 58 are buttons for the "page-turning" operation.
 The side guides 13 shown in FIGS. 15 and 17 are, for example, plate-shaped members arranged on both sides of the housing 11. Each side guide has a projection surface portion 14 located beside the three-dimensional space projection surface 10 and a sensor surface portion 15 located beside the input detection sensor 5 (reflected-light distance sensors 54 to 58).
 The display image (three-dimensional space projection surface 10) of FIG. 16A shows the screen before any of the input operation buttons 33 to 37 is operated, and the display image (three-dimensional space projection surface 10) of FIG. 16B shows the screen after the input operation button 35 has been pressed.
 As shown in FIG. 15, in the aerial image display input device 1C the distance between the input operation buttons 33 to 37 is smaller than in Embodiment 1, but providing the input operation buttons 36 and 37 for the "page-turning" operation increases the number of selectable transaction types from two (Handling A, Handling B) to five or more (for example, Handling A to Handling E), which widens the range of use.
 In Embodiment 5, the five input detection sensors 5 (reflected-light distance sensors 54 to 58) are arranged at intervals of, for example, about 25 mm. Each of the reflected-light distance sensors 54 to 58 incorporates, on its light-emitting side, an infrared LED (a type of infrared light-emitting element) and, on its light-receiving side, a position sensitive detector (PSD: Position Sensitive Detector). Based on the direction from which the PSD receives the light emitted by the infrared LED and reflected by a target object, the sensor outputs a signal corresponding to the distance to the object by triangulation. As shown in FIG. 15, each sensor is arranged with its light-emitting side (light-emitting element) closer to the image transmission plate 4 and its light-receiving side (light-receiving element) farther from it.
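The triangulation relation underlying such a sensor can be illustrated with a short sketch. This is a generic similar-triangles relation with hypothetical parameter names, not the characteristic of any specific sensor; real PSD-based sensors typically output an analog signal already mapped to distance.

```python
# Illustrative triangulation relation for a reflected-light distance sensor.
# baseline_mm: emitter-to-PSD separation; focal_mm: receiver lens focal length;
# spot_offset_mm: displacement of the reflected spot on the PSD.
# All values here are hypothetical.

def triangulation_distance(baseline_mm, focal_mm, spot_offset_mm):
    """Distance to the target, by similar triangles:
    distance = baseline * focal / spot_offset.
    The farther the object, the smaller the spot displacement on the PSD.
    """
    if spot_offset_mm <= 0:
        raise ValueError("no reflection detected")
    return baseline_mm * focal_mm / spot_offset_mm
```

For example, halving the spot displacement doubles the computed distance, which is why PSD sensor resolution degrades with range.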
 Compared with the three input detection sensors 5 (reflected-light distance sensors 51 to 53) of Embodiment 1, the five input detection sensors 5 (reflected-light distance sensors 54 to 58) of Embodiment 5 have their light-emitting and light-receiving sides rotated by 90 degrees. The object detection region of a reflected-light distance sensor is wide in the direction of the emitter-receiver axis and narrow in the direction perpendicular to it. The reflected-light distance sensor adopted in this embodiment has, at a distance of about 100 mm from the element, a detection region of about 40 mm along the emitter-receiver axis and about 20 mm perpendicular to it. To detect which of a plurality of buttons on the image is being operated, as in this embodiment, the detection regions of adjacent sensors should not overlap. Therefore, the distance between adjacent sensor pairs (each pair consisting of a light-emitting element and a light-receiving element) is set to about 25 mm, and the emitter-receiver axis of each pair is oriented perpendicular to the direction in which the elements of the same type (light-emitting elements, light-receiving elements) are lined up. As can also be seen from FIGS. 15 and 17, the elements of the same type are arranged near the bottom side of the image transmission plate 4, on straight lines parallel to that bottom side; in this embodiment, therefore, it can also be said that the emitter-receiver axis of each pair is perpendicular to the direction of the bottom side of the image transmission plate 4.
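The non-overlap argument above can be checked numerically. The sketch below uses only the figures stated in the text (25 mm pitch; detection region about 40 mm by 20 mm at about 100 mm); the function name is illustrative.

```python
# Check of the detection-region overlap argument: with each sensor pair's
# emitter-receiver axis rotated perpendicular to the row, the narrow (~20 mm)
# side of the detection region faces the neighbouring sensor 25 mm away.

def regions_overlap(pitch_mm, region_width_mm):
    """True if adjacent sensors' detection regions overlap along the row."""
    return region_width_mm > pitch_mm

# Narrow axis along the row (the arrangement adopted in this embodiment):
assert not regions_overlap(25, 20)
# If the wide (~40 mm) axis faced the neighbour instead, regions would overlap:
assert regions_overlap(25, 40)
```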
 Also, when the input detection sensor 5 consists of five reflected-light distance sensors 54 to 58 as in this embodiment, the same input determination processing as described in Embodiment 1 can be executed by assigning, at the position facing each sensor, either an operation button for a press-type input (selection) operation or an operation button for the "page-turning" operation, as in Embodiment 1.
 In addition, because the projection surface portion 14 of each side guide 13 forms a surface flush with the three-dimensional space projection surface 10, it serves the user as a visual reference for the operation image floating in the air (forming both ends of the operation image), as shown in FIG. 17, thereby improving the visibility of the operation image. Furthermore, by forming the sensor surface portion 15, the side guide 13 prevents external light from the side from making the image hard to see and prevents the input detection sensor 5 from malfunctioning.
 In the display images shown in FIGS. 16A and 16B, the input operation buttons 33 to 37 are displayed as stereoscopic figures that appear to have three-dimensional thickness toward the viewer. In the pressed state of the input operation button 35 (Handling D) shown in FIG. 16B, only the selected input operation button 35 is displayed as a flat figure without three-dimensional thickness, moved toward the viewer by a distance corresponding to that thickness. By displaying this on the aerial image display input device 1C as shown in FIG. 17, the user can visually understand that the displayed button "D" has been pressed.
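The press feedback described above can be sketched as a simple rendering rule; the function, its coordinate convention, and the numeric values are illustrative assumptions, not part of the embodiment.

```python
# Sketch of the pressed-state rendering rule: an unpressed button is drawn as a
# 3-D figure with a thickness offset; a pressed button is drawn flat, shifted
# toward the viewer by that same thickness. Units and names are hypothetical.

def button_draw_params(y, thickness, pressed):
    """Return (draw_y, draw_thickness) for a button at baseline position y."""
    if pressed:
        # flat figure, moved forward by the thickness it had when unpressed
        return (y + thickness, 0)
    return (y, thickness)

assert button_draw_params(100, 8, pressed=False) == (100, 8)
assert button_draw_params(100, 8, pressed=True) == (108, 0)
```

Because the pressed button both flattens and shifts by exactly its former thickness, its front face appears to stay in place while the body sinks, mimicking a physical key.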
 In the sixth embodiment (Embodiment 6) of the present invention, an aerial image display input device 1D in which ten input operation buttons are arranged will be described.
 FIG. 18 is an external perspective view of the aerial image display input device 1D according to Embodiment 6, and FIG. 19 is a side view of the aerial image display input device 1D.
 Like the aerial image display input device 1C of Embodiment 5, the aerial image display input device 1D of Embodiment 6 has five reflected-light distance sensors 54, 55, 56, 57, and 58 as the input detection sensor 5. In addition, as a configuration differing from the aerial image display input device 1C of Embodiment 5, a projection surface touch sensor 16 is arranged in the vicinity of the three-dimensional space projection surface 10.
 The projection surface touch sensor 16 has infrared light-emitting elements and light-receiving elements. The infrared light-emitting elements emit infrared light along the surface of the three-dimensional space projection surface 10, and the light-receiving elements are arranged in a line across the width of the three-dimensional space projection surface 10. When an object interrupts the light, the sensor outputs the position of the object on the three-dimensional space projection surface 10 as XY coordinates, with X in the width direction and Y perpendicular to it. As a specific example, a zForce (registered trademark) AIR touch sensor from Neonode can be used as the projection surface touch sensor 16.
 In Embodiment 6, when ten input operation buttons 38 are displayed as shown in FIG. 18, the projection surface touch sensor 16 detects whether an object is interrupting the light at the position of each input operation button 38 on the three-dimensional space projection surface 10, and the five reflected-light distance sensors 54 to 58 detect the distance by which the object detected by the projection surface touch sensor 16 moves in the pressing direction. In this way, the press of each input operation button 38 can be detected.
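The two-stage detection described above can be outlined in code. This is a hypothetical sketch: the press-travel threshold, the rectangle representation of button regions, and all names are assumptions not taken from the embodiment.

```python
# Sketch of the two-stage press detection for ten buttons: the projection-surface
# touch sensor reports an (x, y) position, which selects a button; the press is
# confirmed only when the reflected-light distance sensors see the object travel
# past a press threshold in the pressing direction.

def detect_press(touch_xy, button_rects, travel_mm, press_threshold_mm=10):
    """Return the index of the pressed button, or None.

    touch_xy: (x, y) from the touch sensor, or None if nothing is detected.
    button_rects: per-button (x0, y0, x1, y1) regions on the projection surface.
    travel_mm: press-direction travel measured by the distance sensors.
    """
    if touch_xy is None:
        return None
    x, y = touch_xy
    for i, (x0, y0, x1, y1) in enumerate(button_rects):
        if x0 <= x <= x1 and y0 <= y <= y1:
            # button located; require actual press travel to avoid false triggers
            if travel_mm >= press_threshold_mm:
                return i
            return None
    return None
```

Requiring both conditions is what gives the embodiment its robustness: merely hovering over a button (touch hit, no travel) does not register as a press.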
 According to this embodiment, more operation choices can be provided than in Embodiment 5. At the same time, even as the number of choices grows, providing the projection surface touch sensor 16 means that a press is detected only when an actual pressing motion is performed, which prevents malfunctions and erroneous operations.
 In this embodiment, the projection surface touch sensor 16 is arranged near the three-dimensional space projection surface 10 and is of a type that outputs the position of an interrupting object on the three-dimensional space projection surface 10 as XY coordinates, with X in the width direction and Y perpendicular to it. As an alternative, the projection surface touch sensor 16 may be of a type that has light-emitting elements on one side and light-receiving elements on the other side across the width direction X of the three-dimensional space projection surface 10, and outputs only the Y coordinate, in the direction Y perpendicular to the width direction X, of the position where the object interrupts the light. This is because the position in the X direction can be detected using the reflected-light distance sensors 54 to 58.
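The alternative configuration can be sketched as follows. The pitch value echoes the approximately 25 mm sensor spacing mentioned in Embodiment 5, and the mapping of sensor index to X position is an illustrative assumption.

```python
# Sketch of the alternative: a beam-break sensor across the width yields only
# the Y coordinate; the X position is recovered from which of the five
# reflected-light distance sensors (54 to 58) detects the object.

SENSOR_PITCH_MM = 25  # assumed spacing, per Embodiment 5 above

def object_position(y_mm, sensor_hits):
    """Return (x_mm, y_mm), or None if no distance sensor detects the object.

    x is taken as the position of the first active sensor along the row.
    """
    for i, hit in enumerate(sensor_hits):
        if hit:
            return (i * SENSOR_PITCH_MM, y_mm)
    return None
```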
 The present invention is not limited to the embodiments described above and includes various modifications. For example, the embodiments above have been described in detail to explain the present invention clearly, and the invention is not necessarily limited to configurations having all of the described elements. Part of the configuration of one embodiment may be replaced with the configuration of another embodiment, the configuration of one embodiment may be added to that of another, and part of the configuration of each embodiment may have other configurations added, deleted, or substituted.
 Each of the above configurations, functions, processing units, processing means, and the like may be realized partly or wholly in hardware, for example by designing them as integrated circuits. Each of the above configurations, functions, and the like may also be realized in software by a processor interpreting and executing a program that implements the corresponding function. Information such as the programs, tables, and files implementing the functions can be stored in a memory, in a recording device such as a hard disk or SSD (Solid State Drive), or on a recording medium such as an IC card, SD card, or DVD.
 The control lines and information lines shown in the drawings are those considered necessary for the explanation, and not all control lines and information lines of a product are necessarily shown. In practice, almost all configurations may be considered to be interconnected.
 1, 1A, 1B, 1C, 1D  Aerial image display input device
 2  Aerial image projection unit
 3  Display unit
 4  Image transmission plate
 5, 5B  Input detection sensor
 6  Input detection direction
 7  Control unit
 8  User
 9  Handling processing device
 10  Three-dimensional space projection surface
 11  Housing
 12  Housing pedestal
 13  Side guide
 14  Projection surface portion
 15  Sensor surface portion
 16  Projection surface touch sensor
 30 to 38, 30A, 30B  Input operation button
 40  Light branching member
 41  Retroreflective member
 51 to 58  Reflected-light distance sensor
 61, 62, 63  Input detection surface
 71  Input determination processing unit
 72  Input/output I/F processing unit
 73  Screen control processing unit
 74  Speaker
 80  Hand
 91  Control unit
 92  Display unit
 93  Processing unit
 94  Reception settlement machine
 95  Display unit
 96  Input operation button

Claims (15)

  1.  An aerial image display input device comprising:
     an aerial image projection unit which has inside a rectangular display unit that displays a predetermined image, has outside a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, and forms the image displayed on the display unit in the air as an input guidance screen for the user;
     an input detection sensor that detects an aerial operation performed by the user on the input guidance screen; and
     a control unit that performs predetermined control, wherein:
     the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate and faces the user's line of sight;
     the input detection sensor is arranged on a straight line parallel to another side of the rectangle of the image transmission plate that is opposite to the one side, in the vicinity of that other side;
     a surface formed by a line segment connecting the vicinities of the midpoints of a pair of sides of the rectangular projection surface that are not parallel to the image transmission plate and by the straight line on which the input detection sensor is arranged is provided as an input detection region;
     the display unit that displays the predetermined image displays an input operation button of the input guidance screen in the vicinity of the line segment connecting the midpoints on the rectangular projection surface;
     the input detection sensor detects the presence or absence of an object in the region of the input operation button; and
     the control unit determines, based on detection information of the input detection sensor indicating the presence of an object, that the input operation button has been pressed, and changes the input operation button on the display unit so as to indicate that it is in a pressed state.
  2.  The aerial image display input device according to claim 1, wherein:
     the input detection sensor is composed of N reflection-type distance measurement sensors, N being two or more, arranged on a straight line;
     the display unit displays N or fewer input operation buttons; and
     the control unit determines that an input operation button has been pressed based on changes in the output signals of the N reflection-type distance measurement sensors.
  3.  The aerial image display input device according to claim 2, wherein:
     a movement operation that moves an object in a predetermined operation direction is provided as the aerial operation for requesting a change of the input contents that can be entered on the input guidance screen;
     one or more of the N or fewer input operation buttons displayed by the display unit are display change request buttons indicating the operation direction of the movement operation; and
     the control unit determines that a display change request button has been pressed based on change information of, among the output signals of the N reflection-type distance measurement sensors, the sensor output signal covering the display change request button and the sensor output signal covering a portion adjacent to the display change request button.
  4.  The aerial image display input device according to any one of claims 1 to 3, wherein the aerial image projection unit and the input detection sensor form an integrated structure arranged such that the image transmission plate is substantially vertical and the three-dimensional space projection surface forms a surface inclined toward the user's line of sight with respect to the horizontal plane, and the angle of the integrated structure is variable.
  5.  The aerial image display input device according to any one of claims 1 to 4, wherein the aerial image display input device is connected to a handling processing device that performs medium processing or predetermined work with the user, and the control unit switches the image of the input operation buttons displayed on the display unit based on information from the handling processing device and outputs determination information on presses of the input operation buttons to the handling processing device.
  6.  An aerial image display input device comprising:
     an aerial image projection unit which has inside a rectangular display unit that displays a predetermined image, has outside a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, and forms the image displayed on the display unit in the air as an input guidance screen for the user;
     an input detection sensor that detects an aerial operation performed by the user on the input guidance screen; and
     a control unit that performs predetermined control, wherein:
     the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate and faces the user's line of sight;
     the input detection sensor is arranged on M straight lines, M being two or more, parallel to another side of the rectangle of the image transmission plate that is opposite to the one side, in the vicinity of that other side;
     M surfaces, each formed by a line segment connecting the vicinities of corresponding points dividing into (M+1) equal parts a pair of sides of the rectangular projection surface that are not parallel to the image transmission plate and by one of the M straight lines on which the input detection sensor is arranged, are provided as input detection regions;
     the display unit that displays the predetermined image displays input operation buttons of the input guidance screen in the vicinity of the line segments connecting the (M+1)-section points on the rectangular projection surface;
     the input detection sensor detects the presence or absence of an object in the region of each input operation button; and
     the control unit determines, based on detection information of the input detection sensor indicating the presence of an object, that an input operation button has been pressed, and changes the input operation button on the display unit so as to indicate that it is in a pressed state.
  7.  An aerial image display input device comprising:
     an aerial image projection unit which has inside a rectangular display unit that displays a predetermined image, has outside a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, and forms the image displayed on the display unit in the air as an input guidance screen for the user;
     an input detection sensor that detects an aerial operation performed by the user on the input guidance screen; and
     a control unit that performs predetermined control, wherein:
     the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate at an angle of about 75 degrees to the image transmission plate and faces the user's line of sight;
     the display unit that displays the predetermined image displays an input operation button of the input guidance screen;
     the input detection sensor detects an object moving in a substantially vertical direction within the region of the input operation button displayed on the three-dimensional space projection surface; and
     the control unit determines, based on detection information of the object movement from the input detection sensor, that the input operation button has been pressed, and changes the input operation button on the display unit so as to indicate that it is in a pressed state.
  8.  所定の映像を表示する矩形の表示部を内部に持ち、前記表示部に表示される前記映像を利用者が視認可能な三次元空間投射面に投影するための矩形の映像透過プレートを外部に持ち、前記表示部に表示される前記映像を利用者の入力案内画面として空中に結像させる空中像投射ユニットと、
     an input detection sensor that detects an aerial operation performed by the user on the input guidance screen; and
     a control unit that performs predetermined control,
     the aerial image display input device being characterized in that:
     the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate, forms a predetermined angle with the image transmission plate, and faces the user's line of sight;
     the display unit that displays the predetermined image displays an input operation button in the input guidance screen as a stereoscopic perspective image having an image in the thickness direction on the side opposite to the one side of the rectangle of the image transmission plate;
     the input detection sensor detects an object moving in a substantially perpendicular direction within the region of the input operation button displayed on the three-dimensional space projection surface; and
     the control unit determines, based on the object-movement detection information from the input detection sensor, whether the input operation button has been pressed, and, when it determines that the input operation button has been pressed, indicates the pressed state by causing the display unit to display the input operation button as a flat image without the thickness-direction image, at a position shifted by the length of the thickness-direction image that the button has when not in the pressed state.
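The pressed-state rendering described above (a 3D-look button collapsing to a flat image shifted by the length of its thickness-direction image) can be sketched as follows. This is an illustrative sketch only, not part of the claims; the names (`Button`, `render_position`) and pixel units are hypothetical.

```python
from dataclasses import dataclass

@dataclass
class Button:
    x: int          # top-left position of the button face on the display (pixels)
    y: int
    thickness: int  # length of the thickness-direction image (pixels)
    pressed: bool = False

def render_position(btn: Button) -> tuple[int, int, bool]:
    """Return (x, y, draw_thickness): where to draw the button face, and
    whether to draw the thickness-direction image that gives it a 3D look."""
    if btn.pressed:
        # Pressed: flat image, shifted by the thickness the button had when unpressed.
        return (btn.x, btn.y + btn.thickness, False)
    return (btn.x, btn.y, True)
```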
9.  An aerial image display input device comprising:
     an aerial image projection unit that has, inside, a rectangular display unit that displays a predetermined image and, outside, a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, and that forms the image displayed on the display unit in the air as the user's input guidance screen;
     an input detection sensor that detects an aerial operation performed by the user on the input guidance screen; and
     a control unit that performs predetermined control,
     the aerial image display input device being characterized in that:
     the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate, forms a predetermined angle with the image transmission plate, and faces the user's line of sight;
     the input detection sensor is a reflected-light distance sensor that has a plurality of pairs of infrared light-emitting elements and light-receiving elements and that calculates the distance to an object by the principle of triangulation from the direction in which each light-receiving element receives light that was emitted by the corresponding infrared light-emitting element and reflected by the object, wherein, of two straight lines located in the vicinity of another side of the rectangle of the image transmission plate opposite to the one side and each parallel to that other side, the plurality of infrared light-emitting elements are arranged on the straight line nearer to the image transmission plate, the plurality of light-receiving elements are arranged on the straight line farther from the image transmission plate, and each pair is arranged so that the straight line connecting its infrared light-emitting element and light-receiving element intersects the two parallel straight lines at right angles;
     the display unit that displays the predetermined image displays an input operation button in the input guidance screen;
     the input detection sensor detects the presence or absence of an object in the region of the input operation button; and
     the control unit determines, based on the object-presence detection information from the input detection sensor, whether the input operation button has been pressed, and changes the input operation button on the display unit so as to indicate that it is in the pressed state.
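The triangulation principle named in this claim follows from similar triangles: an emitter and receiver separated by a baseline, with the reflected light focused onto the detector at an offset from the optical axis that shrinks as the object moves away. A minimal sketch, not part of the claims; the function name, parameters, and lens model are assumptions for illustration.

```python
def distance_by_triangulation(baseline_m: float, focal_m: float,
                              spot_offset_m: float) -> float:
    """Distance to a reflecting object for a PSD-style reflective sensor.

    `baseline_m`    : emitter-to-receiver separation
    `focal_m`       : focal length of the receiver lens
    `spot_offset_m` : position of the reflected-light spot on the detector,
                      measured from the optical axis
    Similar triangles give  d = baseline * focal / offset.
    """
    if spot_offset_m <= 0:
        raise ValueError("no reflection detected")
    return baseline_m * focal_m / spot_offset_m
```

For example, a 20 mm baseline and 10 mm focal length with a 1 mm spot offset correspond to an object 0.2 m away; as the object approaches, the spot offset grows and the computed distance falls.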
10.  An aerial image display input device comprising:
     an aerial image projection unit that has, inside, a rectangular display unit that displays a predetermined image and, outside, a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, and that forms the image displayed on the display unit in the air as the user's input guidance screen;
     an input detection sensor that detects an aerial operation performed by the user on the input guidance screen; and
     a control unit that performs predetermined control,
     the aerial image display input device being characterized in that:
     the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate, forms a predetermined angle with the image transmission plate, and faces the user's line of sight;
     the input detection sensor comprises:
     a projection surface touch sensor that detects, on the same plane as the three-dimensional space projection surface, at least the Y-direction position of an object involved in the user's aerial operation, where the X direction is the direction of the one side of the rectangle of the image transmission plate and the Y direction is the direction perpendicular thereto; and
     a plurality of distance sensors, each calculating the distance to an object, arranged on a straight line located in the vicinity of another side of the rectangle of the image transmission plate opposite to the one side and parallel to that other side;
     the display unit that displays the predetermined image displays an input operation button in the input guidance screen;
     the input detection sensor detects the presence or absence of an object in the region of the input operation button with the projection surface touch sensor, and detects the amount of movement of the object in the pressing direction with respect to the region of the input operation button with the plurality of reflected-light distance sensors; and
     the control unit determines, based on the object presence/absence detection information and the pressing-direction movement amount detection information detected by the input detection sensor, whether the input operation button has been pressed, and, when it determines that the input operation button has been pressed, changes the input operation button displayed by the display unit so as to indicate that it is in the pressed state.
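The two-condition press judgment of this claim (touch sensor confirms an object inside the button region AND the distance sensors confirm sufficient travel in the pressing direction) can be sketched as a simple predicate. Illustrative only, not part of the claims; the threshold value and units are hypothetical.

```python
def is_button_pressed(object_in_button_area: bool,
                      push_travel_mm: float,
                      travel_threshold_mm: float = 10.0) -> bool:
    """Press is judged only when BOTH detections agree:
    - the projection surface touch sensor reports an object in the button region,
    - the reflected-light distance sensors report enough pressing-direction travel.
    Requiring both suppresses false presses from a hand merely hovering over a button."""
    return object_in_button_area and push_travel_mm >= travel_threshold_mm
```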
11.  The aerial image display input device according to any one of claims 7 to 10, characterized in that:
     the aerial image projection unit and the input detection sensor form an integral structure arranged so that the image transmission plate is substantially vertical and the three-dimensional space projection surface forms a surface inclined toward the user's line of sight with respect to the horizontal plane; and
     the device further comprises, on both side surfaces of the aerial image projection unit, triangular side guides that cover the input detection sensor, the image transmission plate, and the three-dimensional space projection surface.
12.  The aerial image display input device according to any one of claims 7 to 11, characterized in that:
     a movement operation that moves an object in a predetermined operation direction is provided as the aerial operation for requesting a change in the input contents that can be entered on the input guidance screen;
     one or more of the N or fewer input operation buttons displayed by the display unit are display change request buttons indicating the operation direction of the movement operation; and
     the control unit determines whether a display change request button has been pressed based on change information between, among the output signals of the N reflective distance measurement sensors, the sensor output signal that covers the display change request button and the sensor output signals that cover the portions adjacent to the display change request button.
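One way the change information of this claim could be evaluated is to compare successive samples of the N sensor outputs: the sensor covering the display change request button must change markedly while its neighbors stay stable. This is an illustrative sketch only, not the claimed algorithm; the threshold and the stability criterion are assumptions.

```python
def display_change_requested(prev: list[float], curr: list[float],
                             button_idx: int, delta: float = 5.0) -> bool:
    """Judge a press of the display change request button from two successive
    samples of the N sensor outputs (arbitrary units):
    - the sensor covering the button (index `button_idx`) changed by >= `delta`,
    - the sensors covering the adjacent portions changed by < `delta`."""
    changed = abs(curr[button_idx] - prev[button_idx]) >= delta
    neighbors = [i for i in (button_idx - 1, button_idx + 1) if 0 <= i < len(curr)]
    neighbors_stable = all(abs(curr[i] - prev[i]) < delta for i in neighbors)
    return changed and neighbors_stable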
13.  The aerial image display input device according to any one of claims 7 to 12, characterized in that the aerial image projection unit and the input detection sensor form an integral structure arranged so that the image transmission plate is substantially vertical and the three-dimensional space projection surface forms a surface inclined toward the user's line of sight with respect to the horizontal plane, the angle of the integral structure being variable.
14.  The aerial image display input device according to any one of claims 7 to 13, characterized in that the aerial image display input device is connected to a handling processing device that performs medium processing or a predetermined task with the user, and the control unit switches the image of the input operation button displayed on the display unit based on information from the handling processing device and outputs the determination information on the pressing of the input operation button to the handling processing device.
15.  An aerial image display input method performed by an aerial image display input device comprising:
     an aerial image projection unit that has, inside, a rectangular display unit that displays a predetermined image and, outside, a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to a user, and that forms the image displayed on the display unit in the air as the user's input guidance screen;
     an input detection sensor that detects an aerial operation performed by the user on the input guidance screen; and
     a control unit that performs predetermined control,
     the method being characterized in that:
     the three-dimensional space projection surface is a rectangular projection surface that extends from the vicinity of one side of the rectangle of the image transmission plate, forms a predetermined angle with the image transmission plate, and faces the user's line of sight;
     the input detection sensor is arranged on a straight line located in the vicinity of another side of the rectangle of the image transmission plate opposite to the one side and parallel to that other side;
     the plane formed by a line segment connecting the vicinities of the midpoints of the pair of sides of the rectangular projection surface that are not parallel to the image transmission plate and by the straight line on which the input detection sensor is arranged is set as the input detection region;
     the display unit that displays the predetermined image displays an input operation button in the input guidance screen in the vicinity of the line segment connecting the midpoints of the rectangular projection surface;
     the input detection sensor detects the presence or absence of an object in the region of the input operation button; and
     the control unit determines, based on the object-presence detection information from the input detection sensor, whether the input operation button has been pressed, and changes the input operation button on the display unit so as to indicate that it is in the pressed state.
PCT/JP2021/003495 2020-06-24 2021-02-01 Aerial image display input device and aerial mage display input method WO2021260989A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2020108346 2020-06-24
JP2020-108346 2020-06-24
JP2020-171039 2020-10-09
JP2020171039A JP2022007868A (en) 2020-06-24 2020-10-09 Aerial image display input device and aerial image display input method

Publications (1)

Publication Number Publication Date
WO2021260989A1 true WO2021260989A1 (en) 2021-12-30

Family

ID=79282287

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/003495 WO2021260989A1 (en) 2020-06-24 2021-02-01 Aerial image display input device and aerial mage display input method

Country Status (1)

Country Link
WO (1) WO2021260989A1 (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4242730A1 (en) * 2022-03-09 2023-09-13 Alps Alpine Co., Ltd. Method for manufacturing optical element, optical element, aerial image display device, and spatial input device

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016132568A1 (en) * 2015-02-16 2016-08-25 株式会社アスカネット Non-contact input device and method
JP2017107133A (en) * 2015-12-11 2017-06-15 株式会社ニコン Display device, electronic device, image processing device, and image processing program
WO2017125984A1 (en) * 2016-01-21 2017-07-27 パナソニックIpマネジメント株式会社 Aerial display device
JP2018018305A (en) * 2016-07-28 2018-02-01 ラピスセミコンダクタ株式会社 Space input device and indication point detection method
JP2019003332A (en) * 2017-06-13 2019-01-10 コニカミノルタ株式会社 Aerial graphic display device
JP2019002976A (en) * 2017-06-13 2019-01-10 コニカミノルタ株式会社 Aerial video display device
WO2019167425A1 (en) * 2018-02-27 2019-09-06 ソニー株式会社 Electronic device


Similar Documents

Publication Publication Date Title
US11379048B2 (en) Contactless control panel
EP3250989B1 (en) Optical proximity sensor and associated user interface
JP2022007868A (en) Aerial image display input device and aerial image display input method
US20040104894A1 (en) Information processing apparatus
US9001087B2 (en) Light-based proximity detection system and user interface
US9880637B2 (en) Human interface apparatus having input unit for pointer location information and pointer command execution unit
US8907894B2 (en) Touchless pointing device
GB2462171A (en) Displaying enlarged content on a touch screen in response to detecting the approach of an input object
KR102052752B1 (en) Multi human interface devide having text input unit and pointer location information input unit
US20170170826A1 (en) Optical sensor based mechanical keyboard input system and method
WO2021260989A1 (en) Aerial image display input device and aerial mage display input method
US9703410B2 (en) Remote sensing touchscreen
KR20150050546A (en) Multi functional human interface apparatus
KR20090030697A (en) Multi-functional mouse
EP3242190B1 (en) System, method and computer program for detecting an object approaching and touching a capacitive touch device
KR101682527B1 (en) touch keypad combined mouse using thin type haptic module
KR102514832B1 (en) Multi human interface device having text input unit and pointer location information input unit
KR102096861B1 (en) Integrated touch system and diplay device
JP2007310477A (en) Screen operation device and screen operation method and display input device to be used for screen operation device
JP5027084B2 (en) Input device and input method
KR20140066378A (en) Display apparatus and method of controlling the same
KR20140063487A (en) Multi human interface devide having display unit
KR20140063483A (en) Multi human interface devide having display unit
KR20140063484A (en) Multi human interface devide having display unit
KR20140063486A (en) Multi human interface devide having display unit

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21828998

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21828998

Country of ref document: EP

Kind code of ref document: A1