WO2022244515A1 - Non-contact operating device - Google Patents

Non-contact operating device

Info

Publication number
WO2022244515A1
WO2022244515A1 (PCT/JP2022/016000)
Authority
WO
WIPO (PCT)
Prior art keywords
image
display
operating device
operator
display entity
Prior art date
Application number
PCT/JP2022/016000
Other languages
English (en)
Japanese (ja)
Inventor
裕 稲垣
泰弘 小野
貴博 白井
Original Assignee
株式会社東海理化電機製作所 (Tokai Rika Co., Ltd.)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社東海理化電機製作所 (Tokai Rika Co., Ltd.)
Priority to CN202280034788.0A (published as CN117321665A)
Publication of WO2022244515A1

Classifications

    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B30/00Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
    • G02B30/50Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
    • G02B30/56Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/048Interaction techniques based on graphical user interfaces [GUI]
    • G06F3/0481Interaction techniques based on graphical user interfaces [GUI] based on specific properties of the displayed interaction object or a metaphor-based environment, e.g. interaction with desktop elements like windows or icons, or assisted by a cursor's changing behaviour or appearance
    • G06F3/04815Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F13/00Illuminated signs; Luminous advertising

Definitions

  • the present invention relates to a non-contact operating device.
  • A non-contact input device is known that forms a real image in the air and allows a signal to be input by manipulating the real image (for example, Patent Document 1).
  • The device of Patent Document 1 includes a display in which a detection unit is incorporated into a large number of light-emitting blocks, and an optical imaging means in which a large number of first and second micro-reflecting surfaces that intersect in plan view are arranged on the same plane.
  • An image of the display is formed as a first real image on the far side of the optical imaging means, and an image of the pointing means touching the first real image is formed as a second real image on the near side of the optical imaging means; the position of the second real image is detected by the optical sensor of the display.
  • However, the non-contact input device disclosed in Patent Document 1 provides no feedback in response to an operation, so the feeling of operation is poor.
  • To address this, a non-contact operating device includes a display entity having at least one operation target image, an optical system that generates an aerial image corresponding to the display entity, a detection unit that detects an operator's operation on the operation target image displayed in the air as the aerial image, and a control unit that provides visual feedback corresponding to the operation to at least the operation target image for which the operation was detected.
  • FIG. 1A is a top view showing the configuration of the non-contact operating device according to the first embodiment.
  • FIG. 1B is a front view showing the configuration of the non-contact operating device according to the first embodiment.
  • FIG. 1C is a side view showing the configuration of the non-contact operating device according to the first embodiment.
  • FIG. 2A is a perspective view for explaining the operation of the non-contact operating device according to the first embodiment.
  • FIG. 2B is a perspective view for explaining the operation of the non-contact operating device according to the first embodiment.
  • FIG. 2C is a perspective view for explaining the operation of the non-contact operating device according to the first embodiment.
  • FIG. 3A is a cross-sectional view of FIG. 2A for explaining the operation of the non-contact operating device according to the first embodiment.
  • FIG. 3B is a cross-sectional view of FIG. 2B for explaining the operation of the non-contact operating device according to the first embodiment.
  • FIG. 3C is a cross-sectional view of FIG. 2C for explaining the operation of the non-contact operating device according to the first embodiment.
  • FIG. 4A is a top view showing the configuration of the non-contact operating device according to the second embodiment.
  • FIG. 4B is a front view showing the configuration of the non-contact operating device according to the second embodiment.
  • FIG. 4C is a side view showing the configuration of the non-contact operating device according to the second embodiment.
  • FIG. 5A is a perspective view for explaining the operation of the non-contact operating device according to the second embodiment.
  • FIG. 5B is a perspective view for explaining the operation of the non-contact operating device according to the second embodiment.
  • FIG. 5C is a perspective view for explaining the operation of the non-contact operating device according to the second embodiment.
  • FIG. 5D is a perspective view for explaining the operation of the non-contact operating device according to the second embodiment.
  • FIG. 6A is a cross-sectional view of FIG. 5A for explaining the operation of the non-contact operating device according to the second embodiment.
  • FIG. 6B is a cross-sectional view of FIG. 5B for explaining the operation of the non-contact operating device according to the second embodiment.
  • FIG. 6C is a cross-sectional view of FIG. 5C for explaining the operation of the non-contact operating device according to the second embodiment.
  • FIG. 6D is a cross-sectional view of FIG. 5D for explaining the operation of the non-contact operating device according to the second embodiment.
  • FIG. 7 is an explanatory diagram showing the configuration of the non-contact operating device according to the third embodiment.
  • FIG. 8 is a block diagram of a non-contact operating device according to the fourth embodiment.
  • FIGS. 1A, 1B, and 1C are configuration diagrams of the non-contact operating device according to the first embodiment: FIG. 1A is a top view, FIG. 1B is a front view, and FIG. 1C is a side view.
  • As shown in FIGS. 1A to 1C, the non-contact operating device 1 includes a display entity 10, an actuator 20 as a drive unit, an imaging member 31 as an optical system, a distance sensor 40 as a detection unit, and a control unit 50.
  • The display entity 10 has display entities 11 to 14, which are four design portions arranged in a row on the same plane.
  • Each of the display entities 11 to 14 is an LED panel using an LED (Light Emitting Diode) as a light source.
  • A design sheet is arranged on the surface of each LED panel.
  • The display entity 10 is not limited to this, and may be another self-luminous display device such as an organic EL (electroluminescence) display or a vacuum fluorescent display.
  • Provided that a separate light source is supplied, a non-luminous display device may also be used.
  • The design sheets of the display entities 11 to 14 are formed with designs indicating the respective operation functions of the non-contact operating device 1.
  • The design indicating an operation function is imaged in the air and becomes the operation target image.
  • A black mark indicating the operation function is represented in the center of each of the display entities 11 to 14.
  • Each design is such that the LED light does not pass through the mark indicating the operation function, while it passes through the portions other than the mark.
  • The pattern may be reversed so that the LED light passes through the mark indicating the operation function; however, as described later, a pattern in which the periphery of the LED panel emits light makes it easier to confirm the aerial image of an LED panel that has been moved.
  • The non-contact operating device 1 is used, for example, as an operating device for audio equipment.
  • The display entity 11 is formed with a design indicating the operation function of "stop".
  • The display entity 12 is formed with a design indicating the operation functions of "cue" and "fast-reverse".
  • The display entity 13 is formed with a design indicating the operation function of "playback".
  • The display entity 14 is formed with a design indicating the operation functions of "cueing to the next track" and "fast-forward".
  • A design indicating an operation function is not strictly required.
  • A configuration without a design sheet, and thus without a mark indicating an operation function, may also be used.
  • In that case, the entire rectangular face of each of the display entities 11 to 14 emits light.
  • The display entities 11 to 14 are attached to an actuator 20 so as to be independently movable.
  • The actuator 20 is the drive source for the display entity 10 and is attached to a holder 25.
  • The actuator 20 expands and contracts in the horizontal direction of the drawing. As it does so, each of the display entities 11 to 14 moves independently in the horizontal direction in FIG. 2C.
  • the imaging member 31 is a flat optical material that forms an aerial image.
  • The imaging member 31 forms an aerial image by causing light emitted from a given position to converge at the position symmetrical to that position with respect to the imaging member 31 (see the formalization below).
  • Examples of such an imaging member 31 include an ASKA3D plate manufactured by Asukanet and a parity mirror manufactured by Parity Innovations.
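One way to formalize this symmetric imaging relationship (our own formalization; the patent states it only in words) is as a point reflection of each source point across the plane of the plate:

```latex
% Assumed formalization, not stated in the patent: for a plate through point o
% with unit normal n, a source point p on the display side images at the
% mirror point p':
\[
  \mathbf{p}' = \mathbf{p} - 2\,\bigl((\mathbf{p}-\mathbf{o})\cdot\hat{\mathbf{n}}\bigr)\,\hat{\mathbf{n}}
\]
% A source a distance d behind the plate therefore images a distance d in
% front of it, which is why the aerial image tracks the display entity
% one-to-one as the actuator moves it.
```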
  • The imaging member 31 is attached to a plate 100.
  • The plate 100 is formed with a through hole of the same size as the imaging member 31, and the imaging member 31 is fitted into the through hole so that it becomes a part of the plate 100.
  • The plate 100 is, for example, a vehicle dashboard.
  • The distance sensor 40 has four distance sensors 41 to 44 that measure the distance to a detection target. As shown in FIGS. 2A and 2B, the four distance sensors 41 to 44 are arranged in a row on the plate 100, one for each of the four display entities 11 to 14. Note that a single distance sensor 40 may be provided for the display entity 10 (display entities 11 to 14), or a plurality may be provided for each of the display entities 11 to 14. Further, the detection unit is not limited to a distance sensor; it may be configured to detect an operation using a camera that captures the area in which the aerial image 70 (aerial images 71 to 74) is formed.
  • The control unit 50 comprises a microcomputer equipped with a CPU (Central Processing Unit), which performs calculations and processing on acquired data according to a stored program, and semiconductor memory such as RAM (Random Access Memory) and ROM (Read Only Memory).
  • The control unit 50 provides visual feedback by moving the aerial image 70 through movement of the display entity 10 by the actuator 20.
  • The control unit 50 is electrically connected to the actuator 20 and the distance sensor 40.
  • The control unit 50 has a program that receives the detection signal S1 from the distance sensor 40, evaluates it, and outputs a drive signal S2 to the actuator 20 based on the result.
  • This program continuously checks for detection of an object based on the detection signal S1 from each of the distance sensors 41 to 44.
  • This program determines that an object is present when a detection signal S1 indicating the presence of an object directly above one of the distance sensors 41 to 44 is input for a predetermined time (for example, several tenths of a second). The program then enters a mode in which it outputs to the actuator 20 a drive signal S2 for moving the corresponding display entity 10.
  • The corresponding display entity 10 is the one among the display entities 11 to 14 associated with whichever of the distance sensors 41 to 44 determined that an object is present.
  • In this mode, the program outputs to the actuator 20 a drive signal S2 that moves the corresponding display entity 10 in accordance with the movement of the object. When the object moves and reaches a predetermined position, the control unit 50 sets the operation function of that display entity 10.
  • Afterwards, the program outputs a drive signal S2 to the actuator 20 to return the display entity 10 to its initial position. A rough sketch of this logic follows.
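As a minimal sketch of this detect-follow-set loop (hypothetical names and values throughout; the patent discloses no code, and the sensor and actuator interfaces used here are assumptions):

```python
import time

DWELL_S = 0.3        # "several tenths of a second" arming dwell (assumed value)
SET_DEPTH_MM = 20.0  # push-in distance that sets the operation function (assumed value)


class ControlUnit:
    """Hypothetical sketch of control unit 50 (first embodiment)."""

    def __init__(self, sensors, actuator, initial_mm=0.0):
        self.sensors = sensors    # distance sensors 41 to 44, one per display entity 11 to 14
        self.actuator = actuator  # actuator 20, one channel per display entity
        self.initial_mm = initial_mm

    def handle_entity(self, i):
        """Arm on dwell, follow the finger, set on push-in, then return home."""
        t0 = time.monotonic()
        while self.sensors[i].object_present():              # detection signal S1
            if time.monotonic() - t0 >= DWELL_S:
                break                                        # armed: object held long enough
        else:
            return                                           # object left before the dwell elapsed
        while self.sensors[i].object_present():
            depth = self.sensors[i].depth_past_reference()   # distance pushed past the dashed line
            self.actuator.move_to(i, self.initial_mm + depth)  # drive signal S2: follow the finger
            if depth >= SET_DEPTH_MM:
                self.set_function(i)                         # e.g. "playback" for display entity 13
                break
        self.actuator.move_to(i, self.initial_mm)            # return to the initial position

    def set_function(self, i):
        print(f"operation function of display entity 1{i + 1} set ON")
```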
  • FIGS. 2A, 2B, and 2C are perspective views, and FIGS. 3A, 3B, and 3C are the corresponding cross-sectional views, for explaining the operation of the non-contact operating device according to the first embodiment.
  • The aerial images 71 to 74 of the display entities 11 to 14 are displayed in the air as shown in FIG. 2A. That is, the LED panels of the display entities 11 to 14 are lit, and the light transmitted through the design sheets reaches the imaging member 31.
  • The light reaching the imaging member 31 forms an aerial image of the designs of the LED panels of the display entities 11 to 14 at the positions symmetrical to them with respect to the imaging member 31.
  • The aerial images 71 to 74 are the images of the designs of the display entities 11 to 14 formed in the air, that is, the operation target images formed in the air.
  • FIG. 3A shows the display entity 13 and the aerial image 73 from among the display entities 11 to 14 and the aerial images 71 to 74. That is, the display entity 13 is imaged in the air at the position symmetrical to it with respect to the imaging member 31, and the aerial image 73 is displayed in the air.
  • For example, as shown in FIG. 2B, the operator moves the finger 200 and touches the aerial image 73 with it. The operator's finger 200 is then positioned directly above the distance sensor 43, as shown in FIG. 3B, and the distance sensor 43 detects that the finger is directly above it.
  • The control unit 50 thereby determines that the operator is operating.
  • As the operation, the operator presses the aerial image 73 with the finger 200.
  • The distance sensor 43 then detects that the operator's finger 200 is on the left side of the distance sensor 43 (the back side in FIG. 2C).
  • The control unit 50 receives this detection signal S1 and drives the actuator 20 to move the display entity 13.
  • The aerial image 73 thereby moves in the direction of the pressing operation.
  • The distance sensor 43 detects the position of the operator's finger 200. The control unit 50 receives this detection signal S1 and turns on the "playback" setting corresponding to the operation function of the aerial image 73.
  • The control unit 50 continuously checks for detection of an object by the distance sensors 41 to 44. When one of the distance sensors 41 to 44 detects an object to the left of the reference position indicated by the dashed lines in FIGS. 3A, 3B, and 3C (the back side in FIGS. 2A, 2B, and 2C), the control unit 50 causes the actuator 20 to move the display entity 13 corresponding to the distance sensor 43 that detected the object, in accordance with the distance moved. The aerial image 73 thereby moves with the operator's finger 200, as if pressed by it.
  • The marks indicating the operation functions are black, and the portions other than the marks emit light, so the image formed has a luminous peripheral portion 73a. Therefore, even if the operator's finger 200 hides the entire mark indicating the operation function from the operator's point of view, this does not prevent the operator from recognizing that the aerial image 73 is moving. In other words, the operator can see that the luminous portion other than the mark is moving. The change in positional relationship with the adjacent aerial images 72 and 74 can also be seen from the luminous pattern of the peripheral portion of the aerial image 73.
  • The aerial images 71 to 74 function as operation switches. Like the aerial image 73, each of the aerial images 71 to 74 moves following the operator's finger 200 when pressed by it. When one of the aerial images 71 to 74 is pushed in by a predetermined distance, the control unit 50 sets its operation function.
  • The control unit 50 may also change the emission color and brightness of the display entity 10, thereby changing the emission color and brightness of the aerial image 70.
  • As described above, in the non-contact operating device 1, the display entity 10 that is imaged in midair is movably attached to the actuator 20.
  • The control unit 50 operates the actuator 20 according to the detection signal S1 from the distance sensors 41 to 44, so that the aerial images 71 to 74 of the designs of the display entities 11 to 14 for which an operation was detected move in accordance with the movement of the operator's finger 200.
  • The non-contact operating device 1 can therefore present visual feedback to the operator as feedback on the operation, giving an excellent feeling of operation.
  • FIGS. 4A, 4B, and 4C are configuration diagrams of the non-contact operating device according to the second embodiment: FIG. 4A is a top view, FIG. 4B is a front view, and FIG. 4C is a side view.
  • parts having the same configurations and functions as those of the first embodiment are given the same reference numerals.
  • The non-contact operating device 2 differs from the non-contact operating device 1 of the first embodiment in its display entity 10 and in the program of its control unit 50.
  • The non-contact operating device 2 has a non-movable display entity 15 in addition to the display entity 10 of the non-contact operating device 1 described in the first embodiment.
  • The designs of the four movable display entities 11 to 14 also differ.
  • Further, the four movable display entities 11 to 14 move forward as well as backward. These differences will mainly be described below.
  • The display entity 10 has four display entities 11 to 14 arranged in a row and a display entity 15 surrounding them.
  • The four display entities 11 to 14 and the display entity 15 are arranged on the same plane.
  • The display entities 11 to 15 are LED panels using LEDs as light sources.
  • Design sheets are arranged on the surfaces of the four display entities 11 to 14; no design sheet is arranged on the surface of the display entity 15.
  • The design sheets of the display entities 11 to 14 are formed with designs indicating the respective operation functions of the non-contact operating device 2.
  • A mark indicating an operation function is represented in the center of each of the display entities 11 to 14.
  • A frame is represented at the periphery of each of the display entities 11 to 14.
  • In the display entities 11 to 14, the mark indicating the operation function and the frame at the periphery allow the LED light to pass through, while the other portions do not.
  • Each design may alternatively be one in which the entire surface emits light and no mark indicating an operation function is represented.
  • The display entities 11 to 14 are attached to the actuator 20 so as to be independently movable.
  • The display entity 15 is directly attached to the holder 25.
  • The display entity 15 serves as a reference image.
  • The program of the non-contact operating device 2 continuously checks for detection of an object by each of the distance sensors 41 to 44.
  • When a detection signal S1 indicating an object at a predetermined distance directly above one of the distance sensors 41 to 44 is input for a predetermined time (for example, several tenths of a second), this program determines that an operator's object is present. The program then enters a mode in which it outputs to the actuator 20 a drive signal S2 for moving the corresponding display entity 10.
  • The corresponding display entity 10 is the one among the display entities 11 to 14 associated with whichever of the distance sensors 41 to 44 received the object detection signal S1 for the predetermined time.
  • In this mode, the program outputs to the actuator 20 a drive signal S2 that moves the corresponding display entity 10 in accordance with the movement of the object. When the object moves and reaches a predetermined position, the control unit 50 sets the operation function of that display entity 10. A sketch of this variant follows.
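A minimal sketch of how this variant differs from the first embodiment (same assumed sensor and actuator interface as the sketch above; all names and values are hypothetical): the entity is armed while the finger hovers a predetermined distance above the sensor, moves forward to meet it, and only then follows it backward:

```python
import time

DWELL_S = 0.3        # arming dwell (assumed value)
SET_DEPTH_MM = 20.0  # push-in distance that sets the operation function (assumed value)
APPROACH_MM = 30.0   # pre-contact detection height above the sensor (assumed value)


def handle_entity_v2(sensors, actuator, i, initial_mm=0.0):
    """Hypothetical sketch of the second embodiment's program."""
    t0 = time.monotonic()
    while sensors[i].object_at_height(APPROACH_MM):      # finger hovering near the aerial image
        if time.monotonic() - t0 >= DWELL_S:
            break                                        # armed
    else:
        return                                           # finger left before the dwell elapsed
    actuator.move_to(i, sensors[i].finger_position())    # move FORWARD to meet the finger,
    while sensors[i].object_present():                   # confirming which switch will operate
        depth = sensors[i].depth_past_reference()
        actuator.move_to(i, initial_mm + depth)          # then follow it backward
        if depth >= SET_DEPTH_MM:
            break                                        # operation function is set here
    actuator.move_to(i, initial_mm)                      # return to the initial position
```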
  • FIGS. 5A, 5B, 5C, and 5D are perspective views for explaining the operation of the non-contact operating device according to the second embodiment.
  • FIGS. 6A, 6B, 6C, and 6D are cross-sectional views of FIGS. 5A, 5B, 5C, and 5D, respectively, for explaining the operation of the non-contact operating device according to the second embodiment.
  • An aerial image 70 is displayed in the air; that is, the aerial images 71 to 74 of the display entities 11 to 14 are displayed, and the aerial image 75 of the display entity 15 is displayed around the aerial images 71 to 74.
  • the controller 50 drives the actuator 20 to move the display entity 13 to the position shown in FIG. 6B. Thereby, the aerial image 73 moves to the position of the finger 200 of the operator.
  • the distance sensor 43 detects the position of the operator's finger 200 . Then, the control unit 50 receives the detection signal S1 and drives the actuator 20 to move the display entity 13 . The aerial image 73 moves with the operator's finger 200 .
  • the aerial image 70 changes from the state of FIG. 5B to the state of FIG. 5D through FIG. 5C.
  • The distance sensor 43 detects the position of the operator's finger 200. The control unit 50 receives this detection signal S1 and turns on the "playback" setting corresponding to the operation function of the aerial image 73.
  • The aerial image 73 has a peripheral portion 73a formed by the luminous periphery of the display entity 13. Therefore, as in the non-contact operating device 1 of the first embodiment, even if the operator's finger 200 hides all of the marks indicating the operation functions from the operator's point of view, this does not prevent the operator from recognizing that the aerial image 73 is moving.
  • The luminous peripheral portion 73a of the aerial image need not run around the entire periphery; it may be only a part of it.
  • Not only the aerial image of the display entity 13 but also the aerial images of the other display entities 11, 12, and 14 have the same luminous peripheral portion.
  • The non-contact operating device 2 has an aerial image 75 of the fixed display entity 15 surrounding the movable display entities 11 to 14. Since the aerial image 75 serves as a positional reference for the movable aerial images 71 to 74, the operator can easily recognize that any or all of the aerial images 71 to 74 are moving. Even if a single aerial image of a movable display entity is used instead of the four aerial images 71 to 74, the operator can still easily recognize that it is in a moving state.
  • In other words, the aerial image 75 of the fixed display entity 15 surrounding the movable display entities 11 to 14 makes it easy to recognize that the aerial images 71 to 74 of the movable display entities are moving: the non-contact operating device 2 provides visual feedback by moving the operated image relative to the reference image. In addition, because the program of the control unit 50 moves the movable display entities 11 to 14 not only backward but also forward, the operator can confirm which switch is about to be operated.
  • FIG. 7 is a configuration diagram of a non-contact operating device according to the third embodiment.
  • The non-contact operating device 3 according to the third embodiment differs from those of the first and second embodiments in that its optical system is composed of an imaging member 32 and a beam splitter 33. This difference will be described below.
  • The imaging member 32 is a sheet of optical material that forms an aerial image.
  • The imaging member 32 forms an aerial image by causing light emitted from a given position to return toward that same position with respect to the imaging member 32.
  • the beam splitter 33 is a half mirror or the like that transmits part of the incident light and reflects part of the incident light.
  • The imaging member 32 is not limited to a planar shape and may be curved. In that case, light emitted from a given position may be imaged by the imaging member 32 at a different position, forming an aerial image at a different magnification.
  • The imaging member 32 is fixedly held, together with the actuator 20, by a retainer 26.
  • The imaging member 32 is arranged, for example, at right angles to the display entity 10, although the angle between the imaging member 32 and the display entity 10 is not limited to 90°.
  • The beam splitter 33 is attached to the plate 100.
  • The plate 100 is formed with a through hole of the same size as the beam splitter 33, and the beam splitter 33 is fitted into the through hole so that it becomes a part of the plate 100.
  • Light emitted from the display entity 10 is reflected by the beam splitter 33, reaches the imaging member 32, and is reflected back by the imaging member 32.
  • The light reflected back then passes through the beam splitter 33 and reaches the position symmetrical to the display entity 10 with respect to the beam splitter 33, where the aerial image is formed.
  • The operation and effects described for the first embodiment can thus also be obtained with the present embodiment, in which the optical system is configured by the beam splitter 33 and the imaging member 32, which returns light emitted from a given position back toward that same position.
  • The non-contact operating device 4 according to the fourth embodiment differs in that it does not include a drive unit and in that the display entity 10 displays image information provided by the control unit 50.
  • FIG. 8 is a block diagram of the non-contact operating device 4.
  • The non-contact operating device 4 of the present embodiment is configured such that the display entity 10 stereoscopically displays the operation target image by optical illusion. Further, the non-contact operating device 4 controls the display entity 10 to change the operation target image according to the operation, thereby providing visual feedback. This difference will mainly be described below.
  • the display entity 10 is composed of a variable display entity 16 and a lenticular lens 17 as shown in FIG.
  • the variable display entity 16 is electrically connected to the controller 50 .
  • The variable display entity 16 can display a plurality of operation target images according to the output signal S3 from the control unit 50.
  • the lenticular lens 17 is a lens in which fine cylindrical lenses are arranged.
  • the control unit 50 has information on the operation target image displayed by the variable display entity 16 .
  • This operation target image is an illusion image in which an image of the display entities 11 to 14 of FIG. 1C as viewed from the front with the right eye and an image as viewed with the left eye are arranged alternately in strips.
  • The arrangement pitch of these alternately arranged strips matches the pitch of the cylindrical lenses of the lenticular lens 17. A sketch of this interleaving follows.
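As a minimal sketch of that strip interleaving (our own illustration; the patent gives no algorithm, and a real lenticular display would calibrate strip width and alignment to the physical lens):

```python
import numpy as np

def interleave_for_lenticular(left_img, right_img, lens_pitch_px=2):
    """Hypothetical sketch: build the illusion image for variable display entity 16
    by alternating vertical strips of the left-eye and right-eye views at the
    pitch of the cylindrical lenses of lenticular lens 17.
    Both inputs are H x W x 3 uint8 arrays of the same shape."""
    assert left_img.shape == right_img.shape
    out = np.empty_like(left_img)
    half = lens_pitch_px // 2   # one sub-strip per eye under each cylindrical lens
    for x in range(0, left_img.shape[1], lens_pitch_px):
        out[:, x:x + half] = left_img[:, x:x + half]                                   # left-eye strip
        out[:, x + half:x + lens_pitch_px] = right_img[:, x + half:x + lens_pitch_px]  # right-eye strip
    return out
```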
  • The control unit 50 also has a plurality of illusion images for when the operation target images of the display entities 11 to 14 are operated.
  • For example, the control unit 50 is provided with illusion images of the display entities 11 to 14 for when the aerial image 70 is formed as shown in FIG. 2C.
  • The control unit 50 also has illusion images of the display entities 11 to 14 for when the aerial image 70 is formed in the intermediate states between FIGS. 2B and 2C, that is, illusion images in which the display entity 13 appears deeper than the other display entities 11, 12, and 14.
  • These form a series of illusion images in which the depth at which the display entity 13 appears differs slightly from one image to the next.
  • The control unit 50 is provided not only with illusion images in which the display entity 13 is at the back, but likewise with illusion images in which each of the other display entities 11, 12, and 14 is moved to the back.
  • The control unit 50 outputs the output signal S3 for displaying the illusion image A to the variable display entity 16.
  • The variable display entity 16 displays the illusion image A based on the acquired output signal S3. The lenticular lens 17 then divides the displayed illusion image into the image of the display entities 11 to 14 of FIG. 1C as viewed from the front with the right eye and the image as viewed with the left eye; the images for the left and right eyes are emitted in different directions 16a and 16b.
  • When the operator views the image of the display entity 10 from the front, each of the left and right eyes sees the image of the display entities 11 to 14 of FIG. 1C as viewed from the front: the right eye sees the images of the display entities 11 to 14 from slightly to the right of front, and the left eye sees them from slightly to the left of front. The operator thus perceives a stereoscopic image.
  • An aerial image of the display entity 10 of the non-contact operating device 4 is displayed in the air by the imaging member 31, which forms the aerial image. The operator thereby sees an aerial image similar to that in FIG. 2A.
  • For example, as shown in FIG. 2B, the operator moves the finger 200 to touch the aerial image 73 with it. The operator's finger 200 is then positioned directly above the distance sensor 43, and the distance sensor 43 detects that the finger is directly above it.
  • the control unit 50 determines that the operator is operating.
  • the operator presses the aerial image 73 with the finger 200 .
  • the distance sensor 43 detects the position of the operator's finger 200 .
  • The control unit 50 receives the detection signal S1 and outputs to the variable display entity 16 the output signal S3 of the illusion image corresponding to the position of the operator's finger 200.
  • The aerial image 70 then becomes an aerial image in which the aerial image 73 has moved in the pressing direction. Like the display entity 10 displayed stereoscopically by optical illusion, this aerial image 70 is viewed stereoscopically by the operator.
  • Each time the finger moves, the control unit 50 outputs to the variable display entity 16 the output signal S3 of the illusion image that matches the detected position of the finger 200. The aerial image 73 within the aerial image 70 is thus seen three-dimensionally by the operator, moving with the operator's finger 200 as if pressed by it. A sketch of this frame selection follows.
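A minimal sketch of that mapping from the detected finger depth to one of the pre-rendered illusion frames (hypothetical names and values; the patent says only that the frame matches the detected position):

```python
def select_illusion_frame(frames, finger_depth_mm, max_depth_mm=20.0):
    """Hypothetical sketch: pick, from the series of pre-rendered illusion images
    (ordered from 'entity at the front' to 'entity fully pressed in'), the frame
    whose apparent depth best matches the detected position of finger 200."""
    ratio = min(max(finger_depth_mm / max_depth_mm, 0.0), 1.0)  # clamp to [0, 1]
    index = round(ratio * (len(frames) - 1))
    return frames[index]   # sent as output signal S3 to variable display entity 16
```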
  • The distance sensor 43 detects the position of the operator's finger 200. The control unit 50 receives this detection signal S1 and turns on the "playback" setting corresponding to the operation function of the aerial image 73.
  • The non-contact operating device 4 of the present embodiment can dispense with the actuator 20 of the drive unit while still presenting visual feedback to the operator as feedback on the operation.
  • The non-contact operating device 4 can therefore reduce the number of parts and simplify the configuration.
  • The non-contact operating device 4 can also reduce the component and manufacturing costs of the drive unit.
  • In the above description, the illusion images correspond to a pressing operation, but the invention is not limited to this.
  • The illusion images may also be such that the display entities 11 to 14 move not only to the back but also to the front.
  • The combination of the variable display entity and the lenticular lens has been shown as an example of a display entity that stereoscopically displays the operation target image by optical illusion, but the present invention is not limited to this.
  • The non-contact operating device 4 may be configured to use another stereoscopic image based on binocular parallax, or a stereoscopic image of some other kind.
  • In the embodiments above, the switch-setting position is the position where the aerial image has been pushed in by a predetermined distance from the position directly above the distance sensor, but it is not limited to this.
  • The non-contact operating device may instead set the switch-setting position to the position directly above the distance sensor; in that case, the device may afterwards move the aerial image to the back on its own, without following the movement of the operator's finger 200. The device may also, for example, set the switch-setting position to a position in front of the distance sensor.
  • Alternatively, the aerial image may move to the position of the finger, and the switch may be set when the operator's finger 200 then remains at that position for a predetermined time. These alternatives are sketched below.
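A minimal sketch of these alternative switch-setting criteria as one configurable check (hypothetical names and values; the patent lists the alternatives only in prose):

```python
from enum import Enum

class TriggerMode(Enum):
    PUSH_IN = 1    # set after pushing past the reference position by a set depth
    AT_IMAGE = 2   # set as soon as the finger reaches the position directly above the sensor
    DWELL = 3      # set after the finger holds its position for a predetermined time

def should_set_switch(mode, depth_mm, held_s, set_depth_mm=20.0, dwell_needed_s=0.5):
    """Hypothetical sketch: one predicate covering the switch-setting variants above.
    depth_mm is how far the finger is past the reference; held_s is how long it
    has stayed in place."""
    if mode is TriggerMode.PUSH_IN:
        return depth_mm >= set_depth_mm
    if mode is TriggerMode.AT_IMAGE:
        return depth_mm >= 0.0
    return held_s >= dwell_needed_s   # TriggerMode.DWELL
```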

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Human Computer Interaction (AREA)
  • Optics & Photonics (AREA)
  • Position Input By Displaying (AREA)
  • Illuminated Signs And Luminous Advertising (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

The present invention comprises a display entity 10 that has one or more operable images, an optical system that generates an aerial image 70 corresponding to the display entity 10, a detection unit that detects an operation performed by an operator on an operable image as displayed in the air in the aerial image 70, and a control unit 50 that provides visual feedback corresponding to the operation to at least the operable image at which the operation was detected.
PCT/JP2022/016000 2021-05-17 2022-03-30 Non-contact operating device WO2022244515A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202280034788.0A CN117321665A (zh) 2021-05-17 2022-03-30 Non-contact operating device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021083332A JP2022176751A (ja) Non-contact operating device
JP2021-083332 2021-05-17

Publications (1)

Publication Number Publication Date
WO2022244515A1 (fr)

Family

ID=84141215

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/016000 WO2022244515A1 (fr) 2021-05-17 2022-03-30 Dispositif d'actionnement sans contact

Country Status (3)

Country Link
JP (1) JP2022176751A (fr)
CN (1) CN117321665A (fr)
WO (1) WO2022244515A1 (fr)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20140192023A1 (en) * 2013-01-10 2014-07-10 Samsung Display Co., Ltd. Proximity and touch sensing surface for integration with a display
WO2016113917A1 (fr) * 2015-01-15 2016-07-21 株式会社アスカネット Dispositif et procédé d'entrée sans contact
JP2018097388A (ja) * 2015-04-21 2018-06-21 株式会社村田製作所 ユーザインタフェース装置及びユーザインタフェースシステム
JP2020009266A (ja) * 2018-07-10 2020-01-16 オムロン株式会社 入力装置
JP2022007868A (ja) * 2020-06-24 2022-01-13 日立チャネルソリューションズ株式会社 空中像表示入力装置及び空中像表示入力方法

Also Published As

Publication number Publication date
CN117321665A (zh) 2023-12-29
JP2022176751A (ja) 2022-11-30

Legal Events

Code Title Description
121 EP: the EPO has been informed by WIPO that EP was designated in this application (Ref document number: 22804432; Country of ref document: EP; Kind code of ref document: A1)
WWE WIPO information: entry into national phase (Ref document number: 202280034788.0; Country of ref document: CN)
NENP Non-entry into the national phase (Ref country code: DE)
122 EP: PCT application non-entry in European phase (Ref document number: 22804432; Country of ref document: EP; Kind code of ref document: A1)