WO2021260989A1 - Aerial image display input device and aerial image display input method


Info

Publication number
WO2021260989A1
WO2021260989A1 (application PCT/JP2021/003495)
Authority
WO
WIPO (PCT)
Prior art keywords
input
image
user
detection sensor
operation button
Prior art date
Application number
PCT/JP2021/003495
Other languages
English (en)
Japanese (ja)
Inventor
誠 飯田
利一 加藤
靖久 吉田
寛之 斎藤
真也 松井
雄希 前田
公 松島
Original Assignee
日立チャネルソリューションズ株式会社
Priority date
Filing date
Publication date
Priority claimed from JP2020171039A (published as JP2022007868A)
Application filed by 日立チャネルソリューションズ株式会社 (Hitachi Channel Solutions, Corp.)
Publication of WO2021260989A1

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00 Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50 Optical systems or apparatus for producing 3D effects, the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/56 Optical systems or apparatus for producing 3D effects by projecting aerial or floating images
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • G06F3/048 Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481 Interaction techniques based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/0484 Interaction techniques for the control of specific functions or operations, e.g. selecting or manipulating an object, an image or a displayed text element, setting a parameter value or selecting a range
    • G06F3/0487 Interaction techniques using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 Control arrangements characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory

Definitions

  • The present invention relates to an aerial image display input device and an aerial image display input method, and is suitable for application to an aerial image display input device and an aerial image display input method for inputting a user's operation on an image displayed in the air (an aerial image).
  • Patent Document 1 discloses an aerial image display device that displays an image in the air by arranging an image display device, a half mirror, and a retroreflective material.
  • Patent Document 2 discloses a gesture operation device in which a display object installed inside is displayed in an external three-dimensional space by an image coupling plate, and a hand-movement gesture operation is recognized from the information of a camera and a distance sensor that capture the user's hand.
  • Patent Document 3 discloses a non-contact operation detecting device including an aerial image display device having an image light source, a retroreflective member, and an optical branching member, together with detecting means for detecting whether or not an object is present at a point of interest.
  • The detecting means is shown to include a detection light source that emits infrared light, provided alongside the image light source, and a photodiode that detects the infrared return light from the point of interest.
  • Patent Document 4 discloses a proposal to build an input detection sensor into an input/output device provided with an image forming mechanism unit having a display unit that displays an image to be formed in the air, and a proposal to install a detection device for detecting the operator's face or line of sight.
  • In such prior art, a control unit with high processing capacity is required to perform the image recognition processing that recognizes a user's gesture as a predetermined input operation, which poses issues in terms of processing time and product price.
  • The present invention has been made in consideration of the above points, and aims to propose an aerial image display input device and an aerial image display input method that realize, with a simple device configuration, a highly convenient input device that the user can operate intuitively on an aerial image.
  • the present invention provides the following aerial image display input devices [1] to [14] and an aerial image display input method using these aerial image display input devices.
  • [1] An aerial image display input device comprising: an aerial image projection unit that has, inside, a rectangular display unit for displaying a predetermined image and a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to the user, and that forms the image displayed on the display unit in the air as the user's input guidance screen; an input detection sensor that detects the user's aerial operation on the input guidance screen; and a control unit that performs predetermined control.
  • In this aerial image display input device, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight, extending from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate.
  • The input detection sensor is arranged on a straight line located in the vicinity of the other side of the rectangle of the image transmission plate, the side facing the one side, and parallel to that other side.
  • The surface formed by the line segment connecting the vicinities of the midpoints of the pair of sides of the rectangular projection surface that are not parallel to the image transmission plate, and by the arrangement straight line of the input detection sensor, serves as the input detection area.
  • The display unit displays an input operation button in the input guidance screen in the vicinity of the line segment connecting the midpoints of the rectangular projection surface.
  • The input detection sensor detects the presence or absence of an object in the area of the input operation button.
  • The control unit determines that the input operation button has been pressed based on the object-presence detection information from the input detection sensor, and changes the display so as to indicate that the input operation button on the display unit is in the pressed state.
  • [2] The input detection sensors are reflection type distance measurement sensors, of which N (N being two or more) are arranged in a straight line.
  • The display unit displays N or fewer input operation buttons.
  • The control unit determines whether an input operation button has been pressed based on changes in the output signals of the N reflection type distance measurement sensors. The aerial image display input device according to [1] above.
  • [3] As the aerial operation for requesting a change of the input contents that can be entered on the input guidance screen, a movement operation that moves an object in a predetermined operation direction is provided.
  • One or more of the N or fewer input operation buttons displayed by the display unit are display change request buttons indicating the operation direction of the movement operation.
  • The control unit determines that a display change request button has been pressed based on change information of the output signal of the sensor that detects the display change request button and of the output signal of the sensor that detects a portion adjacent to the display change request button. The aerial image display input device according to [2] above.
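  • The press and display-change determinations in [2] and [3] above can be sketched in a few lines. Everything below is illustrative: the patent specifies no algorithm, and the function names, threshold value, and adjacency rule are assumptions.

```python
# Illustrative sketch only: N reflective distance sensors each report a
# distance in millimetres, or None when no object is in front of them.

PRESS_MM = 120  # assumed threshold: an object closer than this counts as a press

def pressed_buttons(readings, n_buttons):
    """Map the first n_buttons sensor readings to button-press states ([2])."""
    return [readings[i] is not None and readings[i] < PRESS_MM
            for i in range(n_buttons)]

def display_change_requested(prev_states, curr_states, button_index):
    """Judge a display change request button ([3]) as pressed only when its
    own sensor AND an adjacent sensor change state, which distinguishes a
    directional movement from a plain press."""
    own_changed = prev_states[button_index] != curr_states[button_index]
    neighbors = [i for i in (button_index - 1, button_index + 1)
                 if 0 <= i < len(curr_states)]
    return own_changed and any(prev_states[i] != curr_states[i]
                               for i in neighbors)
```

  • For example, readings of `[100, None, 300]` with three buttons yield press states `[True, False, False]`.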
  • [4] The aerial image projection unit and the input detection sensor form an integral structure, the image transmission plate is substantially vertical, the three-dimensional space projection surface forms a surface inclined toward the user's line of sight with respect to the horizontal plane, and the angle of the integral structure is variable. The aerial image display input device according to any one of [1] to [3] above.
  • [5] The aerial image display input device is connected to a handling processing device that performs medium processing or predetermined work with the user, and the control unit changes the display on the display unit based on information from the handling processing device.
  • [6] An aerial image display input device comprising: an aerial image projection unit that has, inside, a rectangular display unit for displaying a predetermined image and a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to the user, and that forms the image displayed on the display unit in the air as the user's input guidance screen; an input detection sensor that detects the user's aerial operation on the input guidance screen; and a control unit that performs predetermined control.
  • In this aerial image display input device, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight, extending from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate.
  • The input detection sensors are arranged on M (M being two or more) straight lines located in the vicinity of the other side of the rectangle of the image transmission plate, the side facing the one side, and parallel to that other side.
  • The display unit displays input operation buttons in the input guidance screen in the vicinity of the line segments connecting the vicinities of the points dividing the rectangular projection surface into (M + 1) equal parts.
  • The input detection sensor detects the presence or absence of an object in the area of an input operation button.
  • The control unit determines that the input operation button has been pressed based on the object-presence detection information from the input detection sensor, and changes the display so as to indicate that the input operation button on the display unit is in the pressed state.
  • [7] An aerial image display input device comprising: an aerial image projection unit that has, inside, a rectangular display unit for displaying a predetermined image and a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to the user, and that forms the image displayed on the display unit in the air as the user's input guidance screen; an input detection sensor that detects the user's aerial operation on the input guidance screen; and a control unit that performs predetermined control.
  • In this aerial image display input device, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight, extending from the vicinity of one side of the rectangle of the image transmission plate at an angle of about 75 degrees to the image transmission plate.
  • The display unit displays an input operation button in the input guidance screen.
  • The input detection sensor detects that an object moves in a substantially vertical direction within the area of the input operation button displayed on the three-dimensional space projection surface.
  • The control unit determines that the input operation button has been pressed based on the object-movement detection information from the input detection sensor, and changes the display so as to indicate that the input operation button on the display unit is in the pressed state.
  • [8] An aerial image display input device comprising: an aerial image projection unit that has, inside, a rectangular display unit for displaying a predetermined image and a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to the user, and that forms the image displayed on the display unit in the air as the user's input guidance screen; an input detection sensor that detects the user's aerial operation on the input guidance screen; and a control unit that performs predetermined control.
  • In this aerial image display input device, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight, extending from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate.
  • The display unit displays the input operation button in the input guidance screen as a stereoscopic perspective image having a thickness-direction image on the side opposite to the one side of the rectangle of the image transmission plate.
  • The input detection sensor detects that an object moves in a substantially vertical direction within the area of the input operation button displayed on the three-dimensional space projection surface.
  • The control unit determines that the input operation button has been pressed based on the object-movement detection information from the input detection sensor; when it determines that the button has been pressed, the display unit displays the input operation button as a flat image without a thickness-direction image, at a position moved by the length of the thickness-direction image that the button had when not pressed, thereby indicating that the input operation button is in the pressed state.
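  • The press feedback of [8] above, where a stereoscopic button collapses to a flat image shifted by its own thickness, can be sketched as follows. The coordinates, names, and shift direction are hypothetical; the claim only describes the visual effect.

```python
def button_draw_params(x, y, thickness, pressed):
    """Return (draw_x, draw_y, draw_thickness) for the button image.

    Unpressed: a perspective image with a thickness-direction image.
    Pressed: a flat image (thickness 0) moved by the former thickness,
    so the aerial button appears to sink by its own height.
    """
    if pressed:
        return (x, y + thickness, 0)
    return (x, y, thickness)
```

  • For instance, a button drawn at (10, 20) with thickness 5 is redrawn at (10, 25) with thickness 0 once a press is detected.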
  • [9] An aerial image display input device comprising: an aerial image projection unit that has, inside, a rectangular display unit for displaying a predetermined image and a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to the user, and that forms the image displayed on the display unit in the air as the user's input guidance screen; an input detection sensor that detects the user's aerial operation on the input guidance screen; and a control unit that performs predetermined control.
  • In this aerial image display input device, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight, extending from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate.
  • The input detection sensor has a plurality of sets of infrared light emitting elements and light receiving elements, and is a reflected light distance sensor that calculates the distance to an object by the principle of triangulation from the light receiving direction in which the light receiving element receives the light emitted by the infrared light emitting element and reflected by the object.
  • On two straight lines located in the vicinity of the other side of the rectangle of the image transmission plate and parallel to that side, the plurality of infrared light emitting elements are arranged on the straight line closer to the image transmission plate and the plurality of light receiving elements on the straight line farther from it, with the straight line connecting each paired infrared light emitting element and light receiving element intersecting the two parallel straight lines at right angles.
  • The display unit displays an input operation button in the input guidance screen.
  • The input detection sensor detects the presence or absence of an object in the area of the input operation button.
  • The control unit determines that the input operation button has been pressed based on the object-presence detection information from the input detection sensor, and changes the display so as to indicate that the input operation button on the display unit is in the pressed state.
  • [10] An aerial image display input device comprising: an aerial image projection unit that has, inside, a rectangular display unit for displaying a predetermined image and a rectangular image transmission plate for projecting the image displayed on the display unit onto a three-dimensional space projection surface visible to the user, and that forms the image displayed on the display unit in the air as the user's input guidance screen; an input detection sensor that detects the user's aerial operation on the input guidance screen;
  • and a control unit that performs predetermined control. In this aerial image display input device, the three-dimensional space projection surface is a rectangular projection surface facing the user's line of sight, extending from the vicinity of one side of the rectangle of the image transmission plate at a predetermined angle to the image transmission plate.
  • The input detection sensor comprises a projection surface touch sensor that detects, for the position of the object associated with the user's aerial operation on the same surface as the three-dimensional space projection surface, at least the position in the Y direction, where the direction of one side of the rectangle of the image transmission plate is the X direction and the direction perpendicular to it is the Y direction, and a plurality of distance sensors, arranged on a straight line located in the vicinity of the other side of the rectangle of the image transmission plate and parallel to that side, which calculate the distance to the object.
  • The display unit displays an input operation button in the input guidance screen.
  • The input detection sensor detects the presence or absence of an object in the area of the input operation button with the projection surface touch sensor, and detects the amount of movement of the object pushing into the area of the input operation button with the plurality of reflected light distance sensors.
  • The control unit determines that the input operation button has been pressed based on the object-presence detection information and the push-direction movement-amount detection information from the input detection sensor; when it determines that the button has been pressed, it changes the input operation button displayed by the display unit to indicate that the button is in the pressed state.
  • [11] The aerial image display input device according to any one of the above.
  • [12] As the aerial operation for requesting a change of the input contents that can be entered on the input guidance screen, a movement operation that moves an object in a predetermined operation direction is provided. One or more of the N or fewer input operation buttons displayed by the display unit are display change request buttons indicating the operation direction of the movement operation.
  • The control unit determines that a display change request button has been pressed based on change information of the output signal of the sensor that detects the display change request button and of the output signal of the sensor that detects a portion adjacent to the display change request button. The aerial image display input device according to any one of [7] to [11] above. [13]
  • The aerial image projection unit and the input detection sensor form an integral structure, the image transmission plate is substantially vertical, and the three-dimensional space projection surface forms a surface inclined toward the user's line of sight with respect to the horizontal plane.
  • The angle of the integral structure is variable. The aerial image display input device according to any one of [7] to [12] above.
  • [14] The aerial image display input device is connected to a handling processing device that performs medium processing or predetermined work with the user, and the control unit changes the display on the display unit based on information from the handling processing device.
  • An external perspective view of the aerial image display input device 1A according to Example 2 of the present invention.
  • A diagram showing a display example of the input operation button 30 in Example 3.
  • An external perspective view of the aerial image display input device 1B according to Example 4.
  • A side view of the aerial image display input device 1B.
  • A diagram (No. 2) showing an example of a display image on the three-dimensional space projection surface 10 by the aerial image display input device 1C.
  • A diagram showing the relationship between the display image shown in FIG. 16B and the arrangement of the reflected light distance sensors of the aerial image display input device 1C.
  • An external perspective view of the aerial image display input device 1D according to Example 6.
  • A side view of the aerial image display input device 1D.
  • FIG. 1 is an external perspective view of the aerial image display input device 1 according to the first embodiment (Example 1) of the present invention.
  • FIG. 2 is a side view of the aerial image display input device 1.
  • FIG. 2 shows the internal structure of the aerial image display input device 1 in an easy-to-understand manner, as a cross section seen from the side.
  • FIG. 3 is a block diagram showing a configuration example of the aerial image display input device 1.
  • The aerial image display input device 1 is configured to include an aerial image projection unit 2, an input detection sensor 5, and a control unit 7, which are mounted in a housing 11 placed on a pedestal 12.
  • FIG. 1 schematically shows an input operation button 30 and a user-operated hand 80 on the three-dimensional space projection surface 10.
  • The three-dimensional space projection surface 10 is the projection surface in three-dimensional space on which the aerial image (the input operation button 30) is displayed by the aerial image display input device 1, and the hand 80 is an example of a body part of the user 8 who uses the aerial image display input device 1.
  • The user 8 can perform an input operation on the aerial image display input device 1 by placing the hand 80 at the position of the input operation button 30 projected on the three-dimensional space projection surface 10.
  • the aerial image display input device 1 includes an aerial image projection unit 2, an input detection sensor 5, and a control unit 7.
  • the aerial image projection unit 2 has a display unit 3 and an image transmission plate 4 (light branch member 40, retroreflection member 41).
  • the input detection sensor 5 has reflected light distance sensors 51, 52, and 53, which are reflection type distance measurement sensors.
  • the control unit 7 has a built-in speaker 74, and has an input determination processing unit 71, an input / output I / F processing unit 72, and a screen control processing unit 73.
  • the aerial image display input device 1 can be connected to a handling processing device 9 which is a separate device, and the handling processing device 9 includes a control unit 91, a display unit 92, and a processing unit 93.
  • the configurations of the aerial image display input device 1 will be described in detail below.
  • The aerial image projection unit 2 is a unit that transmits the image of the display unit 3, such as a liquid crystal display, through the image transmission plate 4 by internal light reflection or transmission to form an image on the three-dimensional space projection surface 10.
  • In the aerial image projection unit 2, a high-brightness liquid crystal display unit 3 is arranged at the top, and the optical branching member 40 of the image transmission plate 4, such as a half mirror, is arranged at an angle with respect to the display unit 3; a retroreflective member 41 that retroreflects the light reflected by the optical branching member 40 back in the same direction is further arranged.
  • the reflected light passes through the optical branching member 40 and is imaged on the three-dimensional space projection surface 10 to form an aerial image.
  • the display unit 3 and the aerial image of the three-dimensional space projection surface 10 are line-symmetrical with respect to the image transmission plate 4.
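  • The line symmetry noted above determines where the aerial image forms: the display unit's position is reflected across the plane of the image transmission plate 4. A minimal sketch, under the simplifying assumption of a plane through the origin with a given unit normal:

```python
def reflect_across_plane(point, unit_normal):
    """Reflect a 3-D point across a plane through the origin.

    p' = p - 2 (p . n) n, with n the plane's unit normal. For the image
    transmission plate this maps the display unit to the position of the
    aerial image on the three-dimensional space projection surface.
    """
    d = sum(p * n for p, n in zip(point, unit_normal))
    return tuple(p - 2 * d * n for p, n in zip(point, unit_normal))
```

  • For instance, with the plate in the plane x = 0 (normal (1, 0, 0)), a display point at (-3, 1, 2) images at (3, 1, 2).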
  • The aerial image projection technique of the above configuration is widely disclosed in the above-mentioned prior art documents and elsewhere, and configurations other than that of the present embodiment may be adopted.
  • In the present embodiment, an optical branching member 40 such as a half mirror is used for the image transmission plate 4 to form an aerial image in combination with the retroreflective member 41, but image transmission plates 4 that do not use the optical branching member 40 have also been devised, and an aerial image can be formed with such other embodiments just as in the present embodiment.
  • In the present embodiment, a small 5-inch high-brightness liquid crystal display is used for the display unit 3, and the optical branching member 40 has a rectangular shape forming a vertical surface as shown in FIG. 1.
  • A rectangular display area of substantially the same size as the 5-inch liquid crystal screen of the display unit 3 is formed on the three-dimensional space projection surface 10, and the members are arranged so that the angle θ1 between the optical branching member 40 and the three-dimensional space projection surface 10 (see FIG. 2) is about 75 degrees.
  • The rectangle of the display area is also denoted as rectangle A1-A2-A3-A4 using its vertices A1, A2, A3, and A4 (see FIG. 1); the same notation is used for the other rectangles described later.
  • the input detection sensor 5 includes three reflected light distance sensors 51, 52, and 53.
  • each of the reflected light distance sensors 51, 52, and 53 has a light emitting element (for example, an infrared light emitting element) and a light receiving element, and is a distance sensor that can detect the distance to an object in the optical axis direction of those elements.
  • the three reflected light distance sensors 51, 52, and 53 are arranged in a straight line (on the line segment B1-B2 in FIG. 1). Further, as shown in FIG. 2, they are arranged near the lower end of the optical branching member 40 so that the input detection surface 61 containing the input detection direction 6 determined by the optical axis direction of the light emitting and light receiving elements, that is, the rectangle B1-B2-B3-B4 shown in FIG. 1, intersects the three-dimensional space projection surface 10.
  • the input detection direction 6 (input detection surface 61) and the optical branching member 40 are arranged at an angle θ2, which is set to about 10 degrees.
  • the line segment B3-B4, the line of intersection between the input detection surface 61 and the three-dimensional space projection surface 10, almost coincides with the line segment connecting the midpoints of sides A1-A4 and A2-A3 of the three-dimensional space projection surface 10.
  • since the reflected light distance sensors 51, 52, and 53 acquire distance information when an object exists in the direction facing each light emitting and light receiving element (see the solid arrows in FIG. 1), when the user's hand 80 is in that area near the three-dimensional space projection surface 10, the distance to the hand 80 is acquired and the movement of the hand 80 is detected as an input.
  • the control unit 7 has an input determination processing unit 71, an input/output I/F processing unit 72, and a screen control processing unit 73.
  • the input determination processing unit 71 is connected to the input detection sensor 5, which includes the reflected light distance sensors 51, 52, and 53, and to the built-in speaker 74, and performs input determination processing that processes the user's input operations on the input operation buttons 30.
  • the input/output I/F processing unit 72 is connected to a handling processing device 9, a separate device that performs medium processing or predetermined work with the user, and performs transmission/reception processing of screen information and input information.
  • the screen control processing unit 73 is connected to the display unit 3 and performs screen control processing that controls the screen display on the display unit 3.
  • the handling processing device 9 is a device that receives user operations (input operations) via the aerial image display input device 1 and executes the handling desired by the user; as shown in the block diagram of FIG. 3, it includes a control unit 91, a display unit 92, and a processing unit 93.
  • the control unit 91 controls the display unit 92, which has a touch panel, and controls the processing unit 93, which carries out a plurality of handlings.
  • the operation of the user 8 with respect to the aerial image display input device 1 and the internal processing of the aerial image display input device 1 will be described below.
  • when two types of handling (handling A and handling B) are prepared, the user 8 starts by selecting one of them, for example.
  • the control unit 91 of the handling processing device 9 displays the selection screen for handling A and handling B, and transmits a request for the selection information to the control unit 7.
  • the screen control processing unit 73 displays two input operation buttons indicating handling A and handling B on the display unit 3.
  • the aerial image projection unit 2 thereby displays, on the three-dimensional space projection surface 10, two input operation buttons 30 for selecting handling A or handling B.
  • the user 8 recognizes the input operation buttons 30 displayed on the three-dimensional space projection surface 10, holds the hand 80 over the upper surface of the input operation button 30 corresponding to the desired one of handling A and handling B (here, handling A), and performs an operation of pressing the button (button pressing operation).
  • among the reflected light distance sensors 51, 52, and 53, the sensor corresponding to the selected input operation button 30 (here, the reflected light distance sensor 51) detects this button pressing operation by the user's hand 80.
  • in the control unit 7, the input determination processing unit 71 detects the movement in the input detection direction 6 and determines that the button for handling A has been pressed.
  • the screen control processing unit 73 then controls the screen display of the display unit 3 to switch to a screen in which the color and shape of the button display are changed, in order to notify the user that the displayed button has been pressed, and a predetermined sound is output from the speaker 74. Further, in the control unit 7, the input/output I/F processing unit 72 transmits the selection information of handling A to the handling processing device 9.
  • in this way, the user 8 inputs the handling selection for the handling processing device 9 touchlessly, without touching the display unit 92 (touch panel) provided on the handling processing device 9. Therefore, the handling processing device 9 and the aerial image display input device 1 can be kept as clean equipment. Further, when operation via a touch panel is difficult, for example for a user wearing gloves or a user with a handicap, touchless operation remains easy, so a highly convenient input device can be provided.
  • the aerial image display input device 1 has the following effects in addition to the above-mentioned effects of touchless operation in general.
  • because the three-dimensional space projection surface 10 that forms the aerial image is arranged at the angle θ1 (about 75 degrees) to the optical branching member 40, it faces the line of sight of the user 8 looking down at the area, as shown in FIG. 2; it is therefore not easily affected by intruding or reflected light from outside, and an aerial image with good visibility is obtained.
  • since the three-dimensional space projection surface 10 is close to parallel with the palm (hand 80), the user can easily hold the hand 80 over the position of the input operation button 30 and naturally lower the palm in the vertical direction, which further improves operability.
  • the input detection direction 6 forms an angle θ3 of about 90 degrees with the substantially horizontal direction in which the user 8 naturally holds the palm, and the input detection surface 61 lies parallel to the direction in which the user 8 moves the palm (hand 80) when pressing a button, so the detection sensitivity (accuracy) for the user's "push" operation can be improved.
  • in this embodiment, the reflected light distance sensors 51, 52, and 53 are used in the input detection sensor 5 to detect the movement of the palm of the user 8, but a device with similarly good operability can be obtained even if another sensor means such as a camera is used to detect the movement of the palm. In that case, however, the arrangement and detection method of the sensor means such as a camera differ from those when using the reflected light distance sensors 51, 52, and 53.
  • since the color and shape of the button display are changed and a sound is output, the user 8 can recognize visually or audibly, rather than tactilely, that the operation has been recognized by the device.
  • as described above, the aerial image display input device 1 provides an input device that the user can operate intuitively with respect to an aerial image, that is highly convenient, that has good reaction (operation sensitivity), and that has a simple device configuration.
  • by arranging a relatively small 5-inch liquid crystal display on the display unit 3 and displaying the two input operation buttons 30 on the horizontal line segment B3-B4 on the three-dimensional space projection surface 10, the effect of enhancing the user's feeling of operation is obtained. Specifically, if the three-dimensional space projection surface 10 is too large, the aerial image floats too far from the optical branching member 40 in the background, and some people find it difficult to grasp the sense of distance of the aerial image. Conversely, if the three-dimensional space projection surface 10 is too small, the input operation buttons 30 may be too close to the optical branching member 40, and the hand 80 may touch the panel of the image transmission plate 4 during a button operation.
  • in addition, if the input operation buttons were not arranged on the horizontal line segment, the hand 80 would have to be moved diagonally to allow for the front-rear inclination of the three-dimensional space projection surface 10 (angle θ1 in FIG. 2), which would make the operation difficult.
  • the above configuration of the present embodiment eliminates such decreases in the user's feeling of operation; as a result, the user 8 obtains good visibility with the optical branching member 40 as the background, and the hand movement required for the operation is a small, simple pressing motion that does not require consideration of depth or inclination. That is, the aerial image display input device 1 according to the present embodiment realizes touchless operation with high operability.
  • FIGS. 4A to 4D are diagrams illustrating an example of changing the angle of the aerial image display input device 1.
  • FIG. 4A is a side view of the aerial image display input device 1 before the angle change.
  • FIG. 4B is a side view of the aerial image display input device 1 when the angle is changed.
  • FIG. 4C is a diagram showing the change in the line-of-sight angle of the user 8 caused by changing the angle of the aerial image display input device 1.
  • FIG. 4D is a diagram showing the positional relationship between the aerial image display input device 1 and the user 8 when the angle is changed.
  • the aerial image display input device 1 is configured by mounting the housing 11 on the housing pedestal 12, and as shown in FIG. 4B, the angle can be changed along the arc of the outer shape of the housing 11.
  • in FIG. 4B, the optical branching member 40 can move through a range of an angle θ4 (for example, about 10 degrees) from the vertical direction in the clockwise rotation direction.
  • the aerial image display input device 1 of the present embodiment may also be configured to be movable along the arc of the outer shape of the housing 11 in the direction opposite to that of FIG. 4B (the counterclockwise rotation direction) through a similar range of the angle θ4 (up to about 10 degrees).
  • by making the angle of the housing 11 changeable in this way, visibility and operability can be optimized when it is desired to arrange the aerial image display input device 1 further below or above the line of sight of the user 8 shown in FIG. 2. More specifically, when the aerial image display input device 1 is to be arranged below the state shown in FIG. 2, it can be brought into a state suitable for visibility and operability by moving it in the clockwise rotation direction. Conversely, when it is to be arranged above the state shown in FIG. 2, it can be brought into a state suitable for visibility and operability by moving it in the counterclockwise rotation direction. This is described in detail below with reference to FIGS. 4C and 4D.
  • FIG. 4C is a diagram showing the line-of-sight angle θ5 (θ5A, θ5B) of the user 8 with respect to the three-dimensional space projection surface 10 (10A, 10B) when the arrangement and angle of the aerial image display input device 1 are changed in combination, and FIG. 4D is a diagram schematically showing the positional relationship between the user 8 and the aerial image display input device 1 when they are changed in combination.
  • if the aerial image display input device 1 were simply arranged lower without changing its angle, the line-of-sight angle θ5 of the user 8 with respect to the three-dimensional space projection surface 10 would change, making it difficult for the user 8 to see the aerial image.
  • in contrast, when the aerial image display input device 1 is moved in the clockwise rotation direction, the three-dimensional space projection surface 10A produced by the moved device is tilted in the clockwise rotation direction relative to the three-dimensional space projection surface 10 before the movement; the line-of-sight angle θ5A of the user 8 can therefore maintain a value close to the line-of-sight angle θ5 of the initial arrangement (the case of FIG. 2), and the user 8 can easily see the aerial image (see FIG. 4C).
  • further, if the aerial image display input device 1 is simply arranged lower, the distance from the user 8 to the aerial image (three-dimensional space projection surface 10A) becomes greater than in the state of FIG. 2, and the user 8 may find the aerial image inconvenient to operate. Therefore, when the aerial image display input device 1 is arranged lower, it is preferable from the viewpoint of operability to shift it toward the near side as seen from the user 8. Based on the above, as shown in FIG. 4D, when the aerial image display input device 1 is to be arranged below the state of FIG. 2, it is moved in the clockwise rotation direction.
  • in this way, the aerial image can be displayed at an easy-to-see angle at a position where the user 8 can easily operate it at hand, so that visibility and operability can be optimized.
  • similarly, if the aerial image display input device 1 is to be arranged above the state of FIG. 2 without changing its angle, the position of the three-dimensional space projection surface 10 simply moves upward, the line-of-sight angle θ5 of the user 8 with respect to the three-dimensional space projection surface 10 changes, and it becomes difficult for the user 8 to see the aerial image.
  • in contrast, when the device is moved in the counterclockwise rotation direction, the three-dimensional space projection surface 10B produced by the moved device is tilted in the counterclockwise rotation direction relative to the three-dimensional space projection surface 10 before the movement; the line-of-sight angle θ5B of the user 8 can therefore maintain a value close to the line-of-sight angle θ5 of the initial arrangement (the case of FIG. 2), and the user 8 can easily see the aerial image (see FIG. 4C). Further, if the aerial image display input device 1 is simply arranged higher, the distance from the user 8 to the aerial image (three-dimensional space projection surface 10B) becomes too close compared with the state of FIG. 2.
  • therefore, when the aerial image display input device 1 is arranged higher, it is preferable from the viewpoint of operability to shift it away from the user 8. Based on the above, as shown in FIG. 4D, when the aerial image display input device 1 is to be arranged above the state of FIG. 2, it is moved in the counterclockwise rotation direction.
  • in this way, the aerial image can be displayed at an easy-to-see angle at a position where the user 8 can easily operate it with the hand 80 extended forward, so that visibility and operability can be optimized.
  • FIGS. 5A and 5B are diagrams illustrating an example of the arrangement and operation of the reflected light distance sensors 51, 52, and 53.
  • FIG. 5A is an image diagram of an input operation performed by pressing the input operation button 30, and FIG. 5B is a graph showing the changes in the distance information detected by the reflected light distance sensors 51 to 53 during that input operation.
  • FIG. 5A schematically shows an arrangement example of the two input operation buttons 30 and the three reflected light distance sensors 51, 52, and 53, and further shows an image of the hand 80 of the user 8 performing an input operation by pressing the left input operation button 30 (the "left" button).
  • FIG. 5B displays the situation of FIG. 5A, that is, the changes in the distance information detected by the reflected light distance sensors 51, 52, and 53 during the pressing operation, as a graph in which the horizontal axis is time and the vertical axis is distance information (sensor output, distance).
  • the distance information detected by the reflected light distance sensors 51, 52, and 53 is referred to, in order, as distance information L, C, and R; in FIG. 5B, the distance information L is represented by a broken line, the distance information C by a dash-dot line, and the distance information R by a solid line.
  • the sensor output represents the distance from each of the reflected light distance sensors 51, 52, and 53; specifically, the distance corresponding to the sensor output "H2" is the distance to the aerial image (three-dimensional space projection surface 10). Therefore, when a sensor output falls below H2, it can be determined that a pressing operation on the aerial image has been performed.
  • in FIG. 5B, the distance information L starts changing at the time of 0.2 seconds and reaches the pressed position (the distance of the aerial image) in about the following 0.2 seconds, so the reflected light distance sensor 51 detects the pressing operation. The adjacent distance information C also changes at the same time as the distance information L, but does not reach the pressed position, and the distance information R does not change at all.
  • based on the distance information detected in this way, the input determination processing unit 71 of the control unit 7 performs the following input determination process to determine the input information for the input operation button 30.
  • FIG. 6 is a flowchart showing an example of the processing procedure of the input determination process.
  • the input determination processing unit 71 executes the processing according to the flowchart shown in FIG. 6 and uses the determination levels H0, H1, H2, and H3 to determine, from the distance information L, C, and R, the input information for the input operation button 30.
  • step S10 is a process for detecting a button operation, more specifically, for detecting that the hand 80 or the like of the user 8 has started an operation on one of the input operation buttons 30, which are aerial images.
  • step S20 is a process for detecting the pressed state of the button, more specifically, for detecting that a pressing operation has been performed on the input operation button 30 detected in step S10.
  • the processing outline of step S10 is as follows.
  • first, the input determination processing unit 71 determines whether the distance information L or the distance information R is detected (step S101).
  • when the distance information L is detected in step S101, the input determination processing unit 71 detects the start of an operation on the input operation button 30 corresponding to the distance information L (the "left" button) by determining whether the distance information L is a value between H2 and H1 and smaller than the distance information C (steps S102 to S104).
  • similarly, the input determination processing unit 71 detects the start of an operation on the input operation button 30 corresponding to the distance information R (the "right" button) by performing the same determination processing on the distance information R as the above-mentioned processing on the distance information L (steps S111 to S113).
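The start-of-operation condition of step S10 can be sketched in Python. This is only an illustrative reading of the flowchart, not the patented implementation; the patent names the determination levels H0 to H3 without numeric values, so the thresholds below are hypothetical.

```python
# Hypothetical determination levels; the patent gives only the names H0-H3.
H1 = 200  # a hand has entered the detection area above a button
H2 = 120  # sensor output corresponding to the distance of the aerial image

def left_button_operation_started(distance_l, distance_c):
    """Step S10 sketch: an operation on the "left" button is considered
    started when distance information L lies between H2 and H1 and is
    smaller than distance information C, i.e. the hand is over the left
    button rather than over the centre area (steps S102 to S104)."""
    return H2 < distance_l < H1 and distance_l < distance_c
```

The same check, applied to distance information R instead of L, would model steps S111 to S113 for the "right" button.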
  • the input determination processing unit 71 executes the processing of step S20 based on the processing results of step S10 (steps S104 and S113).
  • the processing outline of step S20 is as follows.
  • when the start of an operation on the input operation button 30 ("left" button) corresponding to the distance information L is detected in step S10, the input determination processing unit 71 detects the pressing operation of the "left" button by using a pressing-operation determination timer (100 ms in this example) to determine whether the distance information L remains below H2 (that is, whether the distance indicated by the distance information L stays closer than the distance to the aerial image) for 100 ms or longer (steps S201 to S203, S205 to S208). When the pressing operation is detected, the input determination processing unit 71 outputs an input determination processing result indicating that the "left" button has been pressed (step S204).
  • similarly, the input determination processing unit 71 detects the pressing operation of the "right" button by performing the same processing on the distance information R as the above-mentioned processing on the distance information L (steps S211 to S213, S215 to S218). When the pressing operation is detected, the input determination processing unit 71 outputs an input determination processing result indicating that the "right" button has been pressed (step S214).
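The pressing-operation detection of step S20 (a sensor output held below H2 for at least 100 ms) can be sketched as follows. This is a minimal, single-threaded illustration under assumed threshold values; the real device derives the reading from the reflected light distance sensors.

```python
import time

H2 = 120       # hypothetical sensor output at the aerial image distance
HOLD_MS = 100  # the text's example uses a 100 ms determination timer

def detect_press(read_distance, h2=H2, hold_ms=HOLD_MS, timeout_s=1.0):
    """Return True when the output of read_distance() stays below h2 for
    hold_ms or longer; return False if this does not happen within
    timeout_s seconds (the hand never reaches, or withdraws from, the
    aerial image)."""
    pressed_since = None
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if read_distance() < h2:
            if pressed_since is None:
                pressed_since = time.monotonic()  # press just started
            elif (time.monotonic() - pressed_since) * 1000.0 >= hold_ms:
                return True  # sustained for 100 ms: pressing operation
        else:
            pressed_since = None  # hand withdrew; restart the timer
        time.sleep(0.005)  # poll the sensor at roughly 200 Hz
    return False
```

Requiring the hold time, rather than a single below-threshold sample, is what lets this simple logic reject momentary noise while remaining fast enough for a responsive "push" feel.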
  • in this way, by associating the buttons with the reflected light distance sensors 51 to 53, the pressing operation on each of the two buttons can be detected with a single output from the corresponding distance sensor: the reflected light distance sensor 51 for the left (L) button and the reflected light distance sensor 53 for the right (R) button. Since this detection method has extremely high accuracy and can be decided by simple logic, the processing can be executed with high sensitivity and at high speed. Further, neither the distance sensors nor the control unit 7 needs a complicated configuration for detection, and a small device can be realized with a simple configuration.
  • in addition, by monitoring the output of the sensor for the distance information C (reflected light distance sensor 52) and detecting operations of the hand 80 or the like on portions other than the buttons, erroneous detection of operations other than button operations can be prevented.
  • the button images displayed on the display unit 3 are buttons whose positions and sizes substantially match the detection areas of the sensors.
  • in the above, the handling input information entered by the user 8 was a selection made by pressing a button, but the input operations that can be handled by the aerial image display input device 1 according to this embodiment are not limited to button pressing; for example, when there are a plurality of selectable options such as "handling A", "handling B", "handling C", and "handling D", it is also possible to handle an operation that switches the handling displayed on the screen (hereinafter referred to as a "turning" operation).
  • FIGS. 7A and 7B are diagrams illustrating another example of the arrangement and operation of the reflected light distance sensors 51, 52, and 53.
  • FIG. 7A is an image diagram of a swipe operation on the input operation button 30, and FIG. 7B is a graph showing the changes in the distance information detected by the reflected light distance sensors 51 to 53 during the swipe operation.
  • FIG. 7A shows an arrangement example of the three reflected light distance sensors 51, 52, and 53.
  • in FIG. 5A, two circular buttons (the "left" button and the "right" button) were displayed as the input operation buttons 30, whereas in FIG. 7A the circular "left" button is displayed on the left side and the arrow-shaped "turn" button is displayed on the right side as the input operation buttons 30.
  • the "turn" button changes the handling displayed on the "left" button in a predetermined order each time the user 8 performs a predetermined "turning" operation, so that the handling desired by the user 8 can be brought onto the "left" button by "turning". For example, in the case of FIG. 7A, "handling A" is displayed on the "left" button. The detailed processing procedure is described later with reference to FIG. 8; in this example, the "turning" operation is either a pressing operation on the "turn" button or a swipe operation in which the hand is moved horizontally from the vicinity of the "turn" button along the displayed arrow direction.
  • FIG. 7B is a graph display of the situation shown in FIG. 7A, that is, the changes in the distance information detected by the reflected light distance sensors 51, 52, and 53 when the user 8 swipes the "turn" button; the horizontal axis is time and the vertical axis is distance information (sensor output, distance).
  • in FIG. 7B, the distance information L starts changing at the time of 0.2 seconds, the distance information C starts changing about 0.2 seconds later, and the distance information R changes about another 0.2 seconds after that; each sensor (reflected light distance sensors 51 to 53) acquires this information.
  • based on the distance information detected in this way, the input determination processing unit 71 of the control unit 7 performs the following input determination process (referred to as the second input determination process when it is to be distinguished from the input determination process described with reference to FIGS. 5 and 6) to determine the input information for the input operation button 30.
  • in the second input determination process, the four determination levels (threshold values) H0, H1, H2, and H3 are set in the same manner as in the input determination process described with reference to FIGS. 5 and 6, and the distance information is classified into three states.
  • FIG. 8 is a flowchart showing an example of the processing procedure of the second input determination process.
  • the processing procedure shown in FIG. 8 has many parts in common with the processing procedure of the input determination process shown in FIG. 6, and the description of these common parts is basically omitted.
  • step S30 is a process for detecting that the hand 80 or the like of the user 8 has started an operation on one of the input operation buttons 30, which are aerial images, and step S40 is a process for detecting that a predetermined operation (a pressing operation in the case of the "left" button; a pressing or swiping operation in the case of the "turn" button) has been performed on the input operation button 30 detected in step S30.
  • the detailed processing procedure of step S30 is the processing procedure of step S10 in FIG. 6 with the processing of step S112 deleted.
  • the process of step S112 determines, as a preliminary determination for detecting the pressing operation when the distance information R corresponding to the right-side button is detected, whether the distance information R is farther than the aerial image (button). In FIG. 8, however, the right-side button is the "turn" button and this determination for the pressing operation is not required (the object to be detected (hand 80) only needs to be in the vicinity of the button), so step S112 is deleted.
  • although the detailed processing procedure is omitted here to avoid repetition, the processing of step S30 detects the start of an operation on the input operation button 30 corresponding to the distance information L or the distance information R.
  • the input determination processing unit 71 executes the processing of step S40 based on the processing results of step S30 (steps S104 and S113).
  • the processing outline of step S40 is as follows.
  • when the start of an operation on the input operation button 30 ("left" button) corresponding to the distance information L is detected in step S30, the input determination processing unit 71 detects the pressing operation of the "left" button by the same processing as in step S20 of FIG. 6 (steps S201 to S203, S205 to S208). When the pressing operation is detected, the input determination processing unit 71 outputs an input determination processing result indicating that the "left" button has been pressed (step S204).
  • when the start of an operation on the input operation button 30 ("turn" button) corresponding to the distance information R is detected in step S30, the input determination processing unit 71 executes, for the "turn" button, both a process of detecting a pressing operation based on the change in the distance information R (steps S211 to S213) and a process of detecting a swipe operation based on the change in the distance information C (steps S402, S404 to S407).
  • the input determination processing unit 71 outputs an input determination processing result of a "turn" press when a pressing operation is detected (step S401), and outputs an input determination processing result of a "turn" swipe when a swipe operation is detected (step S403).
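One plausible reading of the swipe detection above, offered as an illustration rather than the patented logic: once the start of an operation on the "turn" button has been seen via distance information R, a horizontal swipe shows up as the hand subsequently passing over the adjacent centre sensor (distance information C) within a short window. The threshold and window values here are assumptions.

```python
import time

H1 = 200              # hypothetical: an output below H1 means a hand is present
SWIPE_WINDOW_S = 0.5  # hypothetical window for the hand to reach sensor C

def detect_turn_swipe(read_c, h1=H1, window_s=SWIPE_WINDOW_S):
    """Call after the start of a "turn" button operation has been seen on
    distance information R. Returns True if distance information C also
    drops below h1 within window_s seconds, i.e. the hand swept
    horizontally across the adjacent sensor."""
    deadline = time.monotonic() + window_s
    while time.monotonic() < deadline:
        if read_c() < h1:
            return True  # adjacent sensor saw the hand: swipe operation
        time.sleep(0.005)
    return False
```

This matches the timing shown in FIG. 7B, where the C output begins changing roughly 0.2 seconds after the hand first appears on one side.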
  • when a "turn" press or a "turn" swipe is determined, the screen control processing unit 73 of the control unit 7 switches the display image of the "left" button on the display unit 3 to the next handling according to a predetermined transition order; specifically, for example, the display image is switched from "handling A" to "handling B".
  • by performing the second input determination process as described above, even when selecting from three or more handlings, using only two buttons formed by the aerial image (input operation buttons 30) with one of them serving as the "turn" button, the user 8 can select the one desired handling simply by repeating two intuitive and simple operations: a swipe operation moving in the horizontal direction and a pressing operation pushing in the vertical direction. Therefore, for example, both a user accustomed to swipe operations on a smartphone or tablet and a user unfamiliar with swipe operations who is accustomed only to pressing operations can easily perform the touchless operation.
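As a concrete illustration of this "turning" selection flow, assuming a list of four handlings as in the example above, the display transition can be modelled as a simple cyclic selector (the class and method names are illustrative, not from the patent):

```python
class TurnSelector:
    """Models the "left"/"turn" button pair: each "turning" operation
    advances the handling shown on the "left" button in a predetermined
    order, and pressing the "left" button selects the shown handling."""

    def __init__(self, handlings):
        self.handlings = list(handlings)
        self.index = 0  # start by showing the first handling

    @property
    def shown(self):
        return self.handlings[self.index]

    def turn(self):
        # Press or swipe on the "turn" button: show the next handling,
        # wrapping around to the first one after the last.
        self.index = (self.index + 1) % len(self.handlings)
        return self.shown

    def press_left(self):
        # Press on the "left" button: the shown handling is selected.
        return self.shown
```

With ["Handling A", "Handling B", "Handling C", "Handling D"], two "turn" operations followed by a press on the "left" button would select "Handling C".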
  • FIG. 9 is an external view of a system in which the aerial image display input device 1 is connected to a reception settlement machine 94 for hospitals.
  • the reception settlement machine 94 for hospitals is an example of the handling processing device 9; it is a device with which a patient, the user, presents a medical examination ticket and selects the clinical department before being examined so that reception processing is performed, and after the examination, confirms the examination fee and inserts cash corresponding to the settlement amount so that payment processing is performed.
  • in FIG. 9, the aerial image display input device 1 is arranged on the right side of the cash slot of the reception settlement machine 94 so that it faces the operating user and is located below the user's line of sight as shown in FIG. 2. Then, as shown in FIG. 9, the user can operate both the "reception" and "payment" input operation buttons 96 displayed on the display unit 95 of the reception settlement machine 94 and the "reception" and "payment" input operation buttons 30 displayed as an aerial image by the aerial image display input device 1. The user can therefore select the handling by operating either set of buttons.
  • FIG. 10 is a diagram schematically showing the processing between the aerial image display input device 1 and the control unit of the reception settlement machine 94 in the system shown in FIG.
  • FIG. 10 shows the processing between the control units when the user 8 selects the handling of “reception”.
  • three types of display screens (screens A1 to A3) that can be displayed by the aerial image display input device 1 are defined in advance between the aerial image display input device 1 and the control unit of the reception settlement machine 94.
  • four types of display screens (screens B1 to B4) that can be displayed by the reception settlement machine 94 are defined in advance.
  • The aerial image display input device 1 receives from the reception settlement machine 94 a display instruction for screen A1, the "reception"/"payment" selection screen, and displays screen A1 on the display unit 3.
  • When an input operation is made on screen A1, the aerial image display input device 1 transmits the input key information to the reception settlement machine 94. The reception settlement machine 94 also displays screen B1 on its own display unit 95.
  • The reception settlement machine 94 displays screen B2, a guide for reading the medical examination ticket on which "cancel" can be selected, on the display unit 95, and transmits a display instruction for screen A2, a simplified screen, to the aerial image display input device 1.
  • The aerial image display input device 1 receives this display instruction and displays screen A2 on the display unit 3.
  • The reception settlement machine 94 displays screen B3, the clinical-department selection screen, on the display unit 95, and transmits a display instruction for screen A3, a screen on which a clinical department can be selected, to the aerial image display input device 1.
  • the aerial image display input device 1 receives this display instruction and displays the screen A3 on the display unit 3.
  • The screen A3 displayed by the aerial image display input device 1 is preferably a display screen using the "turn" button described with reference to FIGS. 7 and 8; as in FIG. 8, when the "turn" button on the right side is operated, the handlings (clinical departments) transition through the button on the left side. When an input operation is made on screen A3 (an operation selecting one of the clinical departments), the aerial image display input device 1 transmits the input key information to the reception settlement machine 94.
  • When a clinical department is selected on screen B3 displayed on its own display unit 95, or when the input key information for screen A3 (the selected clinical department) is received from the aerial image display input device 1, the reception settlement machine 94 internally executes a predetermined reception process and, when the process completes, displays screen B4 notifying completion of reception on the display unit 95. This completes the reception process for one user (patient).
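The screen hand-off between the two control units can be summarized as a small transition table. The sketch below is illustrative only: the event names are invented, and only the "reception" flow described above is modeled.

```python
# Joint screen flow for the "reception" handling (event names are assumptions):
# maps (current machine screen, event) -> (next machine screen, device screen).
TRANSITIONS = {
    ("idle", "start"):             ("B1", "A1"),  # reception/payment selection
    ("B1", "reception_selected"):  ("B2", "A2"),  # ticket-reading guide / simple screen
    ("B2", "ticket_read"):         ("B3", "A3"),  # clinical-department selection
    ("B3", "department_selected"): ("B4", None),  # reception complete
}

def step(machine_screen, event):
    """Advance the joint state; returns (next machine screen, device screen)."""
    return TRANSITIONS[(machine_screen, event)]
```

Keeping the table on the settlement-machine side matches the division of roles in the text: the machine drives the flow, and the aerial device only displays the screen it is instructed to show and reports key input back.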
  • As described above, the aerial image display input device 1 is a small device with a simple configuration that not only provides an input device which is intuitive for the user, easy to use, and responsive to the aerial image, but also realizes an easy touchless input device simply by being connected to an existing handling processing device.
  • While connected to the handling processing device 9 (reception settlement machine 94), the aerial image display input device 1 accepts the user's operation on either the aerial image display input device 1 or the handling processing device 9. Therefore, even a user who finds aerial-image operation difficult can simply operate the handling processing device 9 instead, and a highly convenient input system can be provided.
  • By configuring the aerial image display input device 1 separately from the handling processing device 9, it becomes easy to add the aerial image display input device 1 to various existing handling processing devices.
  • On the handling processing device 9 side, only logic for processing the input information from the aerial image display input device 1 needs to be added, without changing the screen or operation-method specifications of the application software; the software changes are easy, installation is simple, and there is a cost advantage.
  • Since the aerial image display input device 1 is a compact device that only displays input buttons on a screen, it can easily be arranged near the operation unit of various types of handling processing devices, suppressing the increase in overall installation space while securing operability.
  • FIG. 11 is an external perspective view of the aerial image display input device 1A according to the second embodiment (Example 2) of the present invention.
  • the arrangement relationship between the input detection sensor and the input operation button is changed from the first embodiment.
  • The aerial image display input device 1A differs from the aerial image display input device 1 of the first embodiment in that three input operation buttons 31 are arranged so as to face each of the three reflected light distance sensors 51, 52, 53.
  • With this arrangement, the distance between the input operation buttons 31 becomes short, but increasing the number of selectable handlings from two to three has the advantage of widening the range of applications.
  • a total of three input detection sensors 5 (reflected light distance sensors 51, 52, 53) are linearly arranged on both sides of the two input operation buttons 30 and between them.
  • The sensors on both sides detect the pressing operation of each input operation button 30, and the remaining sensor (reflected light distance sensor 52) detects the user's hand passing between the two input operation buttons 30.
  • In the second embodiment, by contrast, the three input detection sensors 5 (reflected light distance sensors 51, 52, 53) are arranged in a straight line facing the three input operation buttons 31, and because the buttons are close to each other, the input determination process may respond improperly to the erroneous operations described above. To prevent this, it is preferable to prepare detection means and an input determination process different from those of the first embodiment.
  • In the aerial image display input device 1 (1A), basically the same relationship holds between the number of buttons and the number of sensors even when the number of input operation buttons 30 (31) increases. That is, a configuration in which N (two or more) input detection sensors 5 are linearly arranged and fewer than N (that is, [N-1] or fewer) input operation buttons 30 (31) are displayed can be adopted, and it can be said to be the most efficient considering device performance and parts cost. In principle, however, a configuration in which N input detection sensors 5 are arranged and the same number N of input operation buttons 30 (31) are displayed is also possible.
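The sizing rule above can be captured in a one-line check; a minimal sketch, treating the preferred [N-1]-or-fewer arrangement as the validity criterion (the 1:1 arrangement the text also permits would simply relax the upper bound).

```python
def layout_is_preferred(n_sensors, n_buttons):
    """True when N >= 2 linearly arranged sensors serve between 1 and N-1
    aerial buttons, the arrangement the text calls most efficient; the 1:1
    alternative (n_buttons == n_sensors) is workable but not preferred."""
    return n_sensors >= 2 and 1 <= n_buttons < n_sensors
```

The first embodiment (three sensors, two buttons) satisfies the check; the second embodiment's 1:1 layout deliberately departs from it and therefore needs the extra detection measures discussed above.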
  • When the number of input operation buttons is further increased to three or more, the number of reflected light distance sensors constituting the input detection sensor may be increased accordingly.
  • an integrated line-type detection sensor in which elements are arranged on the line may be adopted as the reflected light distance sensor.
  • In the third embodiment, the aerial image display input device 1 described in the first embodiment is connected to a call button installed on each floor for the elevator of a building.
  • FIG. 12 is a diagram showing a display example of the input operation button 30 in the third embodiment.
  • The aerial image display input device 1 connected to the call button displays the input operation buttons 30 on the three-dimensional space projection surface 10 by displaying a call button image on the display unit 3. More specifically, as shown in FIG. 12, an upper call button 30A for calling the elevator car for upward movement and a lower call button 30B for calling the elevator car for downward movement are displayed on the three-dimensional space projection surface 10. When the user 8 performs a predetermined operation (basically a pressing operation, though a swipe operation may also be accepted) on the upper call button 30A or the lower call button 30B, the aerial image display input device 1 transmits the corresponding call input to the elevator.
  • the aerial image display input device 1 can switch the call push button installed for each floor in the existing elevator to the touchless operation.
  • FIG. 13 is an external perspective view of the aerial image display input device 1B according to the fourth embodiment. Further, FIG. 14 is a side view of the aerial image display input device 1B.
  • FIGS. 13 and 14 the same reference numerals are given to the configurations common to those in FIGS. 1 and 2, and the description thereof will be omitted.
  • The aerial image display input device 1B differs from the aerial image display input device 1 shown in FIGS. 1 and 2 in that the input detection sensor 5 is replaced by the input detection sensor 5B, and the single input detection surface 61 is replaced by two input detection surfaces 62 and 63.
  • The input detection sensor 5B is configured by extending the arrangement of five sensors (distance sensors each comprising a light emitting element and a light receiving element) on one axis over two axes.
  • It forms a two-surface input detection region consisting of the input detection surface 62 (B1-B2-B3-B4) and the input detection surface 63 (C1-C2-C3-C4).
  • The three-dimensional space projection surface 10 is a surface including line segment B1-B2 and line segment C1-C2. For example, ten input operation buttons 32 are displayed, and by assigning the numbers "0" to "9" to the buttons, a one-digit number can be selected and input.
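With two detection surfaces of five sensors each, every button maps to a (surface, sensor) cell. The digit-to-cell assignment below is an assumption for illustration; the text only states that ten buttons carrying "0" to "9" are displayed.

```python
# Hypothetical digit layout over the 2 x 5 detection grid: row 0 stands for
# detection surface 62, row 1 for detection surface 63 (assignment invented).
KEYPAD = [
    ["0", "1", "2", "3", "4"],
    ["5", "6", "7", "8", "9"],
]

def digit_at(surface_row, sensor_index):
    """Resolve which digit button a detection at (row, column) selects."""
    return KEYPAD[surface_row][sensor_index]
```

Whichever assignment is used, the point is that doubling the detection axes squares nothing and adds nothing exotic: it simply doubles the number of addressable cells, which is what lets ten buttons share five sensors per axis.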
  • The aerial image display input device 1B described above can be realized with the same configuration as the aerial image display input device 1 of the first embodiment, except that the number of sensors constituting the input detection sensor 5B is increased.
  • On the processing side, it suffices to add logic to the input determination processing unit 71 that determines which button is the operation target, which is technically straightforward. Therefore, the aerial image display input device 1B according to the fourth embodiment provides, in addition to the effects obtained in the other examples, the effect of increasing the types of input selection and expanding the applicable range.
  • FIG. 15 is an external perspective view of the aerial image display input device 1C according to the fifth embodiment.
  • 16A and 16B are diagrams (No. 1 and No. 2) showing an example of a display image of the three-dimensional space projection surface 10 by the aerial image display input device 1C.
  • FIG. 17 is a diagram showing the relationship between the display image shown in FIG. 16B and the arrangement of the reflected light distance sensor of the aerial image display input device 1C.
  • The aerial image display input device 1C of the fifth embodiment modifies the arrangement relationship between the input detection sensors (reflected light distance sensors) and the input operation buttons relative to the aerial image display input device 1 of the first embodiment: five input operation buttons 33 to 37 are placed facing the five reflected light distance sensors 54 to 58. Of these, the three input operation buttons 33, 34, 35 arranged facing the three reflected light distance sensors 55, 56, 57 are operation buttons corresponding to an input operation (selection operation) by pressing, and the two input operation buttons 36 and 37 arranged facing the two reflected light distance sensors 54 and 58 are operation buttons corresponding to the "turn" operation.
  • The side guides 13 shown in FIGS. 15 and 17 are, for example, plate-shaped members arranged on both sides of the housing 11, each having a projection surface portion 14 located beside the three-dimensional space projection surface 10 and a sensor surface portion 15 located beside the input detection sensors 5 (reflected light distance sensors 54 to 58).
  • The display image (three-dimensional space projection surface 10) of FIG. 16A is the screen before the input operation buttons 33 to 37 are operated, and the display image (three-dimensional space projection surface 10) of FIG. 16B is the screen after the input operation button 35 has been pressed.
  • With this arrangement, the distance between the input operation buttons 33 to 37 is shorter than in the first embodiment, but providing the input operation buttons 36 and 37 corresponding to the "turn" operation has the advantage of expanding the range of applications by increasing the selectable handlings from two types (handling A and handling B) to five or more types (for example, handling A to handling E).
  • the five input detection sensors 5 are arranged at intervals of, for example, about 25 mm.
  • The reflected light distance sensors 54 to 58 each incorporate an infrared LED, a kind of infrared light emitting element, on the light emitting side and a position sensitive detector (PSD: Position Sensitive Detector) on the light receiving side, and output a signal corresponding to the distance to the object by triangulation, based on the position at which the PSD receives the reflected light of the infrared LED's emission from the object, as shown in FIG.
  • the light emitting side (light emitting element) is arranged on the side closer to the image transmission plate 4, and the light receiving side (light receiving element) is arranged on the far side.
  • Compared with the three input detection sensors 5 (reflected light distance sensors 51 to 53) shown in the first embodiment, the arrangement of the light emitting side and the light receiving side of the five input detection sensors 5 (reflected light distance sensors 54 to 58) in the fifth embodiment is rotated 90 degrees.
  • the detection area of the object is wide in the direction of the light emitting / receiving element and narrow in the direction perpendicular to the light emitting / receiving element.
  • the reflected light distance sensor used in this embodiment has a detection region of about 40 mm in the light emitting / receiving element direction and about 20 mm in the direction perpendicular to the light emitting / receiving element at a distance of about 100 mm from the element.
  • In the fifth embodiment, the distance between the sensors of each set consisting of a light emitting / receiving element pair (light emitting element and light receiving element) is about 25 mm, and the arrangement direction of the light emitting / receiving elements within each set is perpendicular to the direction in which the plurality of homogeneous elements (light emitting elements, light receiving elements) are arranged. Since the homogeneous elements are each arranged on a straight line parallel to the bottom of the image transmission plate 4 in its vicinity, the arrangement direction of the light emitting / receiving elements within a set can be said to be perpendicular to the bottom direction of the image transmission plate 4.
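The PSD triangulation these sensors rely on reduces to similar triangles; a minimal sketch, with hypothetical baseline and focal-length values (the text gives no sensor internals beyond the LED/PSD pairing).

```python
# Triangulation sketch: with baseline B between the emitter and the receiver
# lens, receiver focal length f, and the reflected spot landing a distance x
# from the optical axis on the PSD, similar triangles give d = f * B / x.
# Closer objects produce larger spot offsets, hence shorter distances.

def triangulated_distance_mm(baseline_mm, focal_mm, spot_offset_mm):
    """Object distance from the spot position reported by the PSD."""
    if spot_offset_mm <= 0:
        raise ValueError("no reflection detected")
    return focal_mm * baseline_mm / spot_offset_mm
```

This inverse relationship is why such sensors have a bounded working range: very distant objects push the spot offset toward zero, where the reading becomes unreliable.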
  • the input operation by pressing a button is performed at the position facing each sensor as in the first embodiment.
  • the same processing as the input determination processing described in the first embodiment can be executed.
  • The projection surface portions 14 of the side guides 13 form surfaces that coincide with the three-dimensional space projection surface 10, so that they give the user a visual reference (forming both ends of the operation image) for the operation image floating in the air as shown in FIG., and the visibility of the operation image can thereby be improved.
  • the side guide 13 has an effect of preventing the image from being difficult to see due to external light from the side surface and preventing the input detection sensor 5 from malfunctioning.
  • As shown in FIG. 16A, the arranged input operation buttons 33 to 37 are displayed as a three-dimensional drawing showing that they have stereoscopic thickness in the front direction; in FIG. 16B, the pressed state of the input operation button 35 (handling D) is displayed with only the selected input operation button 35, drawn as a flat view without stereoscopic thickness, moved toward the front by a distance corresponding to that thickness.
  • FIG. 18 is an external perspective view of the aerial image display input device 1D according to the sixth embodiment. Further, FIG. 19 is a side view of the aerial image display input device 1D.
  • the aerial image display input device 1D of the sixth embodiment arranges five reflected light distance sensors 54, 55, 56, 57, 58 as the input detection sensor 5. Further, as a configuration different from the aerial image display input device 1C of the fifth embodiment, the projection surface touch sensor 16 is arranged in the vicinity of the three-dimensional space projection surface 10.
  • The projection surface touch sensor 16 has infrared light emitting elements and light receiving elements: the infrared light emitting elements emit infrared rays along the plane of the three-dimensional space projection surface 10, the light receiving elements are arranged on a line in the width direction of the three-dimensional space projection surface 10, and when an object blocks the light, the position on the three-dimensional space projection surface 10 is output as XY coordinates, with X in the width direction and Y in the direction perpendicular to it.
  • For example, a zForce (registered trademark) AIR touch sensor manufactured by Neonode can be used as the projection surface touch sensor 16.
  • The aerial image display input device 1D can detect the pressing operation of each input operation button 38 by using the projection surface touch sensor 16 to detect whether an object crosses the three-dimensional space projection surface 10 at the display position of each input operation button 38, and by using the five reflected light distance sensors 54 to 58 to detect the distance by which the object detected by the projection surface touch sensor 16 moves in the pressing direction.
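Combining the two sensor systems can be sketched as below; the button pitch, press-travel threshold, and coordinate convention are assumptions, since the text only states that the touch sensor locates the finger on the surface and the distance sensors measure its travel in the pressing direction.

```python
BUTTON_PITCH_MM = 25   # assumed spacing of the aerial buttons along X
PRESS_TRAVEL_MM = 10   # assumed pressing-direction travel for a valid press

def detect_press(touch_xy, travel_mm):
    """Return the index of the pressed button 38, or None.

    touch_xy  -- (x, y) hit point from the projection surface touch sensor,
                 or None when no object crosses the surface
    travel_mm -- pressing-direction travel reported by the distance sensors
    """
    if touch_xy is None or travel_mm < PRESS_TRAVEL_MM:
        return None            # no finger on the surface, or not pushed through
    x, _y = touch_xy
    return int(x // BUTTON_PITCH_MM)  # which button column was touched
```

Requiring both conditions is what gives the embodiment its robustness: a hand merely passing over a button never satisfies the travel threshold, and travel alone never identifies a button without the touch-sensor hit point.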
  • Thus, the present embodiment can increase the types of operation selections compared with the fifth embodiment, and at the same time, even as the selection types increase, providing the projection surface touch sensor 16 means that press detection is performed only for the button on which the pressing operation is actually made, so malfunction and erroneous operation can be prevented.
  • In the sixth embodiment, the projection surface touch sensor 16 arranged in the vicinity of the three-dimensional space projection surface 10 outputs the position at which an object blocks the light in both the width direction X and the perpendicular direction Y. However, the projection surface touch sensor 16 may instead be a type of sensor having light emitting elements on one side and light receiving elements on the other side of the three-dimensional space projection surface 10 in the width direction X, which outputs only the Y coordinate of the blocking position, because the position in the X direction can be detected by using the reflected light distance sensors 54 to 58.
  • the present invention is not limited to the above-described embodiment, but includes various modifications.
  • the above-described embodiment has been described in detail in order to explain the present invention in an easy-to-understand manner, and is not necessarily limited to the one including all the described configurations.
  • it is possible to replace a part of the configuration of one embodiment with the configuration of another embodiment and it is also possible to add the configuration of another embodiment to the configuration of one embodiment.
  • each of the above configurations, functions, processing units, processing means, etc. may be realized by hardware by designing a part or all of them by, for example, an integrated circuit. Further, each of the above configurations, functions, and the like may be realized by software by the processor interpreting and executing a program that realizes each function. Information such as programs, tables, and files that realize each function can be placed in a memory, a hard disk, a recording device such as an SSD (Solid State Drive), or a recording medium such as an IC card, an SD card, or a DVD.
  • control lines and information lines are shown in the drawings as necessary for explanation, and not all control lines and information lines are shown in the product. In practice it may be considered that almost all configurations are interconnected.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Computer Hardware Design (AREA)
  • Optics & Photonics (AREA)
  • Position Input By Displaying (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The invention relates to an aerial image display input device 1 comprising: an aerial image projection unit 2, inside which is a rectangular display unit 3 that displays a prescribed image and outside which is a rectangular image transmission plate 4 for projecting the image displayed on the display unit 3 onto a three-dimensional space projection surface 10 visible to the user, and which forms the image displayed on the display unit 3 in the air as an input guidance screen (input operation buttons 30) for the user; an input detection sensor 5 that detects aerial operations performed by the user on the input guidance screen; and a control unit 7 that performs prescribed control.
PCT/JP2021/003495 2020-06-24 2021-02-01 Dispositif d'entrée d'affichage d'image aérienne et procédé d'entrée d'affichage d'image aérienne WO2021260989A1 (fr)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2020-108346 2020-06-24
JP2020108346 2020-06-24
JP2020-171039 2020-10-09
JP2020171039A JP2022007868A (ja) 2020-06-24 2020-10-09 空中像表示入力装置及び空中像表示入力方法

Publications (1)

Publication Number Publication Date
WO2021260989A1 true WO2021260989A1 (fr) 2021-12-30

Family

ID=79282287

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/003495 WO2021260989A1 (fr) 2020-06-24 2021-02-01 Dispositif d'entrée d'affichage d'image aérienne et procédé d'entrée d'affichage d'image aérienne

Country Status (1)

Country Link
WO (1) WO2021260989A1 (fr)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP4242730A1 (fr) * 2022-03-09 2023-09-13 Alps Alpine Co., Ltd. Procédé de fabrication d'élément optique, élément optique, dispositif d'affichage d'image aérienne et dispositif d'entrée spatiale

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016132568A1 (fr) * 2015-02-16 2016-08-25 株式会社アスカネット Dispositif et procédé de saisie sans contact
JP2017107133A (ja) * 2015-12-11 2017-06-15 株式会社ニコン 表示装置、電子機器、画像処理装置および画像処理プログラム
WO2017125984A1 (fr) * 2016-01-21 2017-07-27 パナソニックIpマネジメント株式会社 Dispositif d'affichage aérien
JP2018018305A (ja) * 2016-07-28 2018-02-01 ラピスセミコンダクタ株式会社 空間入力装置及び指示点検出方法
JP2019003332A (ja) * 2017-06-13 2019-01-10 コニカミノルタ株式会社 空中映像表示装置
JP2019002976A (ja) * 2017-06-13 2019-01-10 コニカミノルタ株式会社 空中映像表示装置
WO2019167425A1 (fr) * 2018-02-27 2019-09-06 ソニー株式会社 Dispositif électronique


Similar Documents

Publication Publication Date Title
US11379048B2 (en) Contactless control panel
EP3250989B1 (fr) Capteur de proximité optique et interface utilisateur associée
JP2022007868A (ja) 空中像表示入力装置及び空中像表示入力方法
US20040104894A1 (en) Information processing apparatus
US9001087B2 (en) Light-based proximity detection system and user interface
US8907894B2 (en) Touchless pointing device
US20100026723A1 (en) Image magnification system for computer interface
US20150309592A1 (en) Human interface apparatus having input unit for pointer location information and pointer command execution unit
KR102052752B1 (ko) 텍스트 입력장치와 포인터 위치정보 입력장치가 구비된 복합 휴먼 인터페이스 장치
JP2016071836A (ja) ホログラム表示を実現する対話型表示方法、制御方法及びシステム
US20170170826A1 (en) Optical sensor based mechanical keyboard input system and method
WO2021260989A1 (fr) Dispositif d'entrée d'affichage d'image aérienne et procédé d'entrée d'affichage d'image aérienne
US9703410B2 (en) Remote sensing touchscreen
KR20150050546A (ko) 복합휴먼 인터페이스 장치
KR20090030697A (ko) 다기능 마우스
EP3242190B1 (fr) Système, procédé et programme informatique pour détecter un objet en approche et en contact avec un dispositif tactile capacitif
KR101682527B1 (ko) 박형 햅틱 모듈을 이용한 마우스 겸용 터치 키패드
KR102514832B1 (ko) 텍스트 입력장치와 포인터 위치정보 입력장치가 구비된 복합 휴먼 인터페이스 장치
JP2007310477A (ja) 画面操作装置および画面操作方法ならびにこの画面操作装置に用いられる表示入力装置
JP5027084B2 (ja) 入力装置及び入力方法
KR20140066378A (ko) 디스플레이장치와 그 제어방법
KR20090103384A (ko) 공간 투영 및 공간 터치 기능이 구비된 네트워크 단말 장치및 그 제어 방법
KR20140063487A (ko) 디스플레이 유닛이 구비된 복합 휴먼 인터페이스 장치
KR20140063483A (ko) 디스플레이 유닛이 구비된 복합 휴먼 인터페이스 장치
KR20140063484A (ko) 디스플레이 유닛이 구비된 복합 휴먼 인터페이스 장치

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21828998

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21828998

Country of ref document: EP

Kind code of ref document: A1