CN117321665A - Non-contact operation device
- Publication number: CN117321665A (application CN202280034788.0A)
- Authority: CN (China)
- Prior art keywords: image, display, operation device, operator, display entity
- Prior art date: 2021-05-17 (priority date)
- Legal status: Pending (the status listed is an assumption, not a legal conclusion)
Classifications
- G02B30/56—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images, the image being built up from image elements distributed over a 3D volume, e.g. voxels, by projecting aerial or floating images
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/0346—Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks, with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
- G06F3/04815—Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G09F13/00—Illuminated signs; Luminous advertising
Abstract
A non-contact operation device is provided. The device includes: a display entity (10) having at least one operation target image; an optical system that forms an aerial image (70) corresponding to the display entity (10); a detection unit that detects an operator's operation on an operation target image displayed in the air as the aerial image (70); and a control unit (50) that performs visual feedback corresponding to the operation on at least the operation target image for which the operation is detected.
Description
Cross Reference to Related Applications
The present application claims priority from Japanese patent application No. 2021-083332, filed on May 17, 2021, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to a non-contact operation device.
Background
There is a non-contact input device that forms a real image in the air and enables signal input through operations on that real image (for example, patent document 1).
That non-contact input device uses a display whose light-emitting blocks incorporate detection units, together with an optical imaging unit in which a plurality of first and second minute reflecting surfaces, intersecting each other in plan view, stand on a common plane. The device images the display as a first real image on the far side of the optical imaging unit, images a pointing member touching the first real image as a second real image on the near side, and detects the position of the second real image with photosensors in the display.
Patent document 1: Japanese Patent No. 5856357
In the non-contact input device disclosed in patent document 1, no feedback is given for an operation, so the device has the problem of a poor operation feel.
Disclosure of Invention
An object of the present invention is to provide a non-contact operation device that provides feedback for operations and therefore has an excellent operation feel.
A non-contact operation device according to an embodiment of the present invention includes: a display entity having at least one operation target image; an optical system that forms an aerial image corresponding to the display entity; a detection unit that detects an operator's operation on an operation target image displayed in the air as the aerial image; and a control unit that performs visual feedback corresponding to the operation on at least the operation target image for which the operation is detected.
According to an embodiment of the present invention, a non-contact operation device that provides operation feedback and an excellent operation feel can be provided.
Drawings
Fig. 1A is a plan view showing the structure of the non-contact operation device according to embodiment 1.
Fig. 1B is a front view showing the structure of the non-contact operation device according to embodiment 1.
Fig. 1C is a side view showing the structure of the non-contact operation device according to embodiment 1.
Fig. 2A is a perspective view illustrating an operation of the non-contact operation device according to embodiment 1.
Fig. 2B is a perspective view illustrating an operation of the non-contact operation device according to embodiment 1.
Fig. 2C is a perspective view illustrating an operation of the non-contact operation device according to embodiment 1.
Fig. 3A is a cross-sectional view of fig. 2A illustrating an operation of the non-contact operation device according to embodiment 1.
Fig. 3B is a cross-sectional view of fig. 2B illustrating an operation of the non-contact operation device according to embodiment 1.
Fig. 3C is a cross-sectional view of fig. 2C illustrating an operation of the non-contact operation device according to embodiment 1.
Fig. 4A is a plan view showing the structure of the non-contact operation device according to embodiment 2.
Fig. 4B is a front view showing the structure of the non-contact operation device according to embodiment 2.
Fig. 4C is a side view showing the structure of the non-contact operation device according to embodiment 2.
Fig. 5A is a perspective view illustrating an operation of the non-contact operation device according to embodiment 2.
Fig. 5B is a perspective view illustrating an operation of the non-contact operation device according to embodiment 2.
Fig. 5C is a perspective view illustrating an operation of the non-contact operation device according to embodiment 2.
Fig. 5D is a perspective view illustrating an operation of the non-contact operation device according to embodiment 2.
Fig. 6A is a cross-sectional view of fig. 5A illustrating an operation of the non-contact operation device according to embodiment 2.
Fig. 6B is a cross-sectional view of fig. 5B illustrating an operation of the non-contact operation device according to embodiment 2.
Fig. 6C is a cross-sectional view of fig. 5C illustrating an operation of the non-contact operation device according to embodiment 2.
Fig. 6D is a cross-sectional view of fig. 5D illustrating an operation of the non-contact operation device according to embodiment 2.
Fig. 7 is an explanatory diagram showing a configuration of the non-contact operation device according to embodiment 3.
Fig. 8 is a block diagram of a non-contact operation device according to embodiment 4.
Detailed Description
[ embodiment 1 ]
Fig. 1A, 1B, and 1C are structural diagrams of a non-contact operation device according to embodiment 1, in which fig. 1A is a plan view, fig. 1B is a front view, and fig. 1C is a side view.
As shown in Fig. 1C, the non-contact operation device 1 includes a display entity 10, an actuator 20 as a driving unit, an imaging member 31 as an optical system, a distance sensor 40 as a detection unit, and a control unit 50.
As shown in Fig. 1B, the display entity 10 has display entities 11 to 14, four appearance portions arranged in one column on the same surface. The display entities 11 to 14 are LED panels, each using an LED (Light Emitting Diode) as its light source, with an appearance sheet disposed on the panel surface. The display entity 10 is not limited to this and may be another self-luminous display device, such as an organic EL (Electro Luminescence) display or a fluorescent display tube. A non-emissive display device may also be used, although it additionally requires a separate light source.
The appearance sheets of the display entities 11 to 14 carry appearances showing the respective operation functions of the non-contact operation device 1; each appearance, when imaged, becomes an operation target image. In each appearance, a mark indicating the operation function is shown in black at the center of the display entity 11 to 14.
That is, in each appearance the marked portion blocks the LED light while the portion outside the mark transmits it. A reversed pattern, in which the LED light passes through the mark itself, is also possible; however, as described later, a pattern whose peripheral edge portion emits light makes it easier to recognize the aerial image of an LED panel that has moved.
As an example, the non-contact operation device 1 is used as an operation device for an audio apparatus. The display entity 11 carries an appearance indicating the "stop" operation function; the display entity 12, the "go to the beginning" and "fast reverse" functions; the display entity 13, the "play" function; and the display entity 14, the "beginning of the next tune" and "fast forward" functions.
An appearance indicating the operation function is not strictly required. For example, no appearance sheet need be used and no mark need be present; in that case, the display entities 11 to 14 each present a rectangular, uniformly luminous appearance.
The display entities 11 to 14 are mounted on the actuator 20 so as to be independently movable.
The actuator 20 is the driving source of the display entity 10 and is mounted on the holder 25. Driven by a driving signal S2 from the control unit 50, the actuator 20 expands and contracts in the left-right direction of Fig. 2C, and through this expansion and contraction the display entities 11 to 14 each move independently in that direction.
The imaging member 31 is a flat, plate-like optical element that forms an aerial image: it images light emitted from a predetermined position at the position symmetrical to it with respect to the imaging member 31. Examples of such an imaging member 31 include the ASKA3D plate manufactured by Asukanet and the parity mirror manufactured by Parity Innovations.
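The imaging relation lends itself to a one-line calculation: each point of the display entity forms its aerial image at the plane-symmetric point on the other side of the imaging member 31. Below is a minimal sketch of that geometry, assuming the plate lies in the plane z = 0 and using made-up coordinates for illustration (neither is specified in the patent):

```python
# Illustrative geometry only: the imaging plate is modeled as the plane z = 0.
import numpy as np

def aerial_image_point(source: np.ndarray, plate_z: float = 0.0) -> np.ndarray:
    """Return the plane-symmetric point where light from `source` converges."""
    image = source.copy()
    image[2] = 2.0 * plate_z - source[2]  # reflect the z coordinate across the plate
    return image

# A display point 30 mm behind the plate images 30 mm in front of it.
print(aerial_image_point(np.array([10.0, 5.0, -30.0])))  # -> [10.  5. 30.]
```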
The imaging member 31 is mounted on the plate 100. The plate 100 has a through hole of the same size as the imaging member 31; the imaging member 31 is fitted into this hole and thereby becomes part of the plate 100. As one example, the plate 100 is a control panel of a vehicle.
The distance sensor 40 comprises four distance sensors 41 to 44, each measuring the distance from the sensor to a detection target. As shown in Figs. 2A and 2B, the four distance sensors 41 to 44 correspond to the four display entities 11 to 14 respectively and are mounted on the plate 100 in a one-column arrangement. A single distance sensor 40 may serve the display entities 11 to 14, or a plurality of distance sensors 40 may be mounted for each of them. The detection unit is not limited to distance sensors; it may instead detect the detection target's operation on the aerial image 70 (aerial images 71 to 74) with a camera that captures the area where the aerial images are formed.
The control unit 50 includes a microcomputer comprising a CPU (Central Processing Unit) that performs calculation and processing on acquired data according to a stored program, and semiconductor memories such as RAM (Random Access Memory) and ROM (Read Only Memory).
The control unit 50 performs visual feedback through movement of the aerial image 70, which follows from movement of the display entity 10 by the actuator 20. The control unit 50 is electrically connected to the actuator 20 and the distance sensor 40, and holds a program that receives the detection signal S1 from the distance sensor 40, judges the detection signal S1, and outputs to the actuator 20 a driving signal S2 based on the judgment result.
The program continuously checks for object detection based on the detection signals S1 from the distance sensors 41 to 44. If a detection signal S1 indicating an object is input over a predetermined period (for example, several tenths of a second), the program determines that an object is present directly above the corresponding distance sensor 41 to 44. The program then outputs to the actuator 20 a driving signal S2 that moves the corresponding display entity, i.e., whichever of the display entities 11 to 14 corresponds to the distance sensor 41 to 44 that detected the object.
The program goes on outputting to the actuator 20 driving signals S2 that move the corresponding display entity in response to the motion of the object, and when the object reaches a predetermined position, the program causes the control unit 50 to activate the operation function of that display entity.
Further, when the detection signal S1 of the object from the distance sensor 40 disappears, the program outputs to the actuator 20 a driving signal S2 that returns the display entity 10 to its initial position.
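The program flow above can be summarized in a short sketch. The sensor and actuator interfaces below are hypothetical (the patent defines the signals S1 and S2, not an API), and the hold time and position mapping are assumptions chosen for illustration:

```python
# Illustrative sketch only: interfaces and constants are assumptions.
import time

DETECT_HOLD_S = 0.3      # "predetermined time (for example, several tenths of a second)"
INITIAL_POSITION = 0.0   # home position of each display entity (assumed)

def control_loop(sensors, actuator, period_s=0.02):
    """Poll each distance sensor (signal S1) and drive the matching display entity (signal S2)."""
    first_seen = [None] * len(sensors)
    while True:
        for i, sensor in enumerate(sensors):
            distance = sensor.read()              # detection signal S1, or None
            if distance is not None:              # an object is above sensor i
                first_seen[i] = first_seen[i] or time.monotonic()
                if time.monotonic() - first_seen[i] >= DETECT_HOLD_S:
                    actuator.move(i, distance)    # driving signal S2: follow the object
            else:                                 # the detection signal disappeared:
                first_seen[i] = None
                actuator.move(i, INITIAL_POSITION)  # return to the initial position
        time.sleep(period_s)
```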
(Operation of the non-contact operation device 1)
Figs. 2A, 2B, and 2C are perspective views illustrating operations of the non-contact operation device according to embodiment 1. Figs. 3A, 3B, and 3C are the corresponding cross-sectional views.
When the non-contact operation device 1 is powered on and operating, aerial images 71 to 74 of the display entities 11 to 14 are displayed in the air as shown in Fig. 2A. That is, the LED panels of the display entities 11 to 14 light, and the light transmitted through the appearance sheets reaches the imaging member 31, which images the appearances of the LED panels in the air at positions symmetrical with respect to the imaging member 31. The operator thus visually recognizes the aerial images 71 to 74, which are the appearances of the display entities 11 to 14 imaged in the air; in other words, operation target images formed in the air.
Fig. 3A shows, of the display entities 11 to 14 and aerial images 71 to 74, the display entity 13 and the aerial image 73: the display entity 13 is imaged in the air at the position symmetrical with respect to the imaging member 31, where the aerial image 73 is displayed.
The operator moves the finger 200 and touches the aerial image 73 with it, for example, as shown in Fig. 2B. The operator's finger 200 is then positioned directly above the distance sensor 43, as shown in Fig. 3B, and the distance sensor 43 detects its presence. The control unit 50 then makes the following operation determination: if the detection signal S1 is input continuously for a predetermined time, it judges that the operator is performing an operation.
As shown in Fig. 2C, as an operation action the operator performs a pressing operation with the finger 200 toward the aerial image 73. As shown in Fig. 3C, the distance sensor 43 then detects that the operator's finger 200 has moved to the left of (deeper than, in Fig. 2C) the position directly above it. On receiving the detection signal S1, the control unit 50 drives the actuator 20 to move the display entity 13, and the aerial image 73 moves in the pressing direction.
When the operator's finger 200 has pushed toward the aerial image 73 through a predetermined distance, the distance sensor 43 detects the finger's position. On receiving the detection signal S1, the control unit 50 turns on the "play" setting corresponding to the operation function of the aerial image 73.
Specifically, the control unit 50 continuously checks for object detection by the distance sensors 41 to 44. When one of them detects an object to the left of (deeper than, in Figs. 2A to 2C) the reference position indicated by the one-dot chain line in Figs. 3A to 3C, the control unit 50 moves the corresponding display entity, here the display entity 13 corresponding to the distance sensor 43, by the actuator 20 in accordance with the movement distance. The aerial image 73 thus moves together with the operator's finger 200, as if it were being pressed by the finger.
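The press decision reduces to comparing the finger's travel past the reference line with the predetermined distance. A sketch of that decision, with the reference position and full travel chosen as illustrative values (the patent gives no numbers):

```python
# Illustrative values: the patent says only "a predetermined distance".
PRESS_TRAVEL_MM = 15.0

def press_update(finger_mm, reference_mm, set_function):
    """Return the entity displacement (mm) toward the depth; fire at full travel."""
    travel = reference_mm - finger_mm     # > 0 once the finger passes the reference line
    if travel <= 0.0:
        return 0.0                        # finger not yet past the reference position
    if travel >= PRESS_TRAVEL_MM:
        set_function()                    # e.g. turn the "play" setting on
        return PRESS_TRAVEL_MM
    return travel                         # the entity follows the finger

press_update(-20.0, 0.0, lambda: print("play"))  # a full press: prints "play"
```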
As shown in Figs. 2A to 2C, in the appearance of the aerial image 73 the mark indicating the operation function is black, while the rest emits light, forming a pattern in which the peripheral edge portion 73a glows. Therefore, even if the operator's finger 200 completely hides the mark from the operator's viewpoint, recognition that the aerial image 73 is moving is not hindered: the operator can still see the light-emitting portion outside the mark moving, and can also see the positional relationship to the adjacent aerial images 72 and 74 change through the glowing peripheral edge of the aerial image 73.
As described above, the aerial images 71 to 74 function as operation switches. They all behave like the aerial image 73: when the operator's finger 200 presses one, it follows the finger's movement, and when it has been pressed through a predetermined distance, the control unit 50 activates the corresponding operation function.
When an operation function is activated, the control unit 50 may change the emission color and luminance of the display entity 10, thereby changing the emission color and luminance of the aerial image 70.
(Effects of embodiment 1)
According to embodiment 1 described above, in the non-contact operation device 1 the display entity 10 that is imaged in the air is mounted on the actuator 20 so as to be movable. On receiving the detection signals S1 from the distance sensors 41 to 44, the control unit 50 operates the actuator 20 so that the aerial image 71 to 74 of whichever display entity's appearance was operated moves in accordance with the operation of the operator's finger 200. The non-contact operation device 1 can thus present visual feedback to the operator as feedback of the operation, giving it an excellent operation feel.
[ embodiment 2 ]
Figs. 4A, 4B, and 4C are structural diagrams of the non-contact operation device according to embodiment 2: Fig. 4A is a plan view, Fig. 4B a front view, and Fig. 4C a side view. In the following description, parts having the same structures and functions as in embodiment 1 are given the same reference numerals.
The non-contact operation device 2 differs from the non-contact operation device 1 of embodiment 1 in its display entity 10 and in the program of its control unit 50. In addition to the display entities of embodiment 1, it has a display entity 15 that does not move; the appearances of the four movable display entities 11 to 14 differ; and the movable display entities move not only toward the depth but also toward the front. These differences are mainly described below.
The display entity 10 has four display entities 11 to 14 arranged in one column and a display entity 15 surrounding them, all arranged on the same surface. The display entities 11 to 15 are LED panels with LEDs as their light sources. Appearance sheets are disposed on the surfaces of the four display entities 11 to 14; none is disposed on the display entity 15.
The appearance sheets of the display entities 11 to 14 carry appearances showing the respective operation functions of the non-contact operation device 2. Each appearance has a mark indicating the operation function at the center of the display entity 11 to 14 and a frame at its peripheral edge. The mark and the peripheral frame transmit the LED light, while the other portions of the display entities 11 to 14 block it. Each appearance may also be an entirely light-emitting surface without a mark indicating the operation function.
The display entities 11 to 14 are mounted on the actuator 20 so as to be independently movable. The display entity 15, by contrast, is mounted directly on the holder 25 and provides a reference image.
The program of the non-contact operation device 2 continuously checks for object detection by the distance sensors 41 to 44. If a detection signal S1 indicating an object is input over a predetermined period (for example, several tenths of a second), the program determines that the operator's object is present within a predetermined distance above one of the distance sensors 41 to 44. The program then outputs to the actuator 20 a driving signal S2 that moves the corresponding display entity, i.e., whichever of the display entities 11 to 14 corresponds to the distance sensor 41 to 44 that input the detection signal S1.
The program goes on outputting to the actuator 20 driving signals S2 that move the corresponding display entity in response to the motion of the object, and when the object reaches a predetermined position, the program causes the control unit 50 to activate the operation function of that display entity.
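In other words, the program maps the detected finger position to an entity offset in both directions: toward the operator during the approach and toward the depth during the press. A sketch of that mapping, with the approach-zone size and sign convention assumed for illustration:

```python
# Illustrative sketch: the zone size and sign convention are assumptions.
APPROACH_ZONE_MM = 40.0   # a finger within this range triggers the approach move

def entity_offset(finger_mm, reference_mm=0.0):
    """Map a detected finger position to a display-entity offset in mm.

    Positive offsets move the entity, and hence its aerial image, toward the
    operator (to the front); negative offsets move it toward the depth; 0.0 is
    the initial position used when no finger is detected.
    """
    if finger_mm is None or finger_mm > reference_mm + APPROACH_ZONE_MM:
        return 0.0                      # nothing nearby: stay at the initial position
    return finger_mm - reference_mm     # track the finger in both directions
```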
(Operation of the non-contact operation device 2)
Figs. 5A to 5D are perspective views illustrating operations of the non-contact operation device according to embodiment 2. Figs. 6A to 6D are the corresponding cross-sectional views.
When the non-contact operation device 2 is powered on and operating, the aerial image 70 is displayed in the air: the aerial images 71 to 74 of the display entities 11 to 14 are displayed, and the aerial image 75 of the display entity 15 is displayed around them.
For example, as shown in Fig. 5A, the operator moves the finger 200 to the vicinity of the aerial image 73. The distance sensor 43 then detects the operator's finger 200 on its right side, as shown in Fig. 6A. On receiving the detection signal S1, the control unit 50 drives the actuator 20 to move the display entity 13 to the position of Fig. 6B, and the aerial image 73 moves toward the position of the operator's finger 200.
At this time, as shown in Fig. 5B, only the aerial image 73 within the aerial image 70 moves forward, approaching the operator's finger 200. The operator can thus confirm whether the aerial image 73 approaching the finger is the switch to be operated.
When the operator then performs a pressing operation on the aerial image 73 with the finger, the distance sensor 43 detects the position of the finger 200. On receiving the detection signal S1, the control unit 50 drives the actuator 20 to move the display entity 13, and the aerial image 73 moves with the operator's finger 200.
The operator's finger 200 passes directly above the distance sensor 43 as shown in Fig. 6C and reaches its left side as shown in Fig. 6D; the aerial image 70 correspondingly passes from the state of Fig. 5B through Fig. 5C to that of Fig. 5D.
When the operator has pressed the aerial image 73 through a predetermined distance, the distance sensor 43 detects the position of the finger 200. On receiving the detection signal S1, the control unit 50 turns on the "play" setting corresponding to the operation function of the aerial image 73.
The aerial image 73 has a light-emitting peripheral edge 73a. Therefore, as in the non-contact operation device 1 of embodiment 1, even when the operator's finger 200 hides the entire mark indicating the operation function from the operator's viewpoint, recognition that the aerial image 73 is moving is not hindered. The light-emitting portion may be a part of the peripheral edge rather than its entire circumference, and the same applies to the aerial images of the other display entities 11, 12, and 14.
The non-contact operation device 2 has the aerial image 75 of the fixed display entity 15, which surrounds the movable display entities 11 to 14. Since the aerial image 75 serves as a position reference for the movable aerial images 71 to 74, the operator can easily recognize when any or all of the aerial images 71 to 74 move. Even if a single movable display entity were used in place of the four movable display entities 11 to 14, its aerial image being in motion would still be easy to recognize.
As described above, this embodiment, which changes the display entity 10 and the program of the control unit 50, also provides the operations and effects described for embodiment 1.
In the non-contact operation device 2, the aerial image 75 of the fixed display entity 15 surrounding the movable display entities 11 to 14 makes it easy to recognize that the aerial images 71 to 74 are moving; in other words, the device performs visual feedback by moving the operated operation target image relative to the reference image. Further, because the program of the control unit 50 moves the movable display entities 11 to 14 not only toward the depth but also toward the front, the operator can confirm whether a switch is the one to be operated.
[ embodiment 3 ]
Fig. 7 is a structural diagram of a non-contact operation device according to embodiment 3.
The non-contact operation device 3 of embodiment 3 differs from embodiments 1 and 2 in that its optical system consists of an imaging member 32 and a beam splitter 33. This difference is described below.
The imaging member 32 is a sheet-like optical element that forms an aerial image by returning light emitted from a predetermined position toward that same position. The beam splitter 33 is a half mirror or the like that transmits part of the incident light and reflects the rest. The imaging member 32 is not limited to a planar shape and may be curved; it may also image the light at a position different from the source position relative to the imaging member 32, forming an aerial image at a different magnification.
The imaging member 32 is fixedly held, together with the actuator 20, by the holder 26. As an example, the imaging member 32 is disposed orthogonally to the display entity 10, although the angle between them is not limited to 90°.
The beam splitter 33 is mounted on the plate 100. The plate 100 has a through hole of the same size as the beam splitter 33; the beam splitter 33 is fitted into this hole and thereby becomes part of the plate 100.
With the display entity 10, the imaging member 32, and the beam splitter 33 arranged in this way, light emitted from the display entity 10 is reflected by the beam splitter 33, reaches the imaging member 32, and is reflected back by it. The light reflected by the imaging member 32 and transmitted through the beam splitter 33 converges at the position symmetrical to the display entity 10 with respect to the beam splitter 33.
Thus, when the non-contact operation device 3 is powered on and operating, the aerial image of the display entity 10 is displayed in the air, and the operator visually recognizes it.
As described above, the operations and effects described for embodiment 1 are also obtained in this embodiment, in which the optical system is formed not by the imaging member 31, which images light at the position symmetrical to the source, but by the beam splitter 33 together with the imaging member 32, which returns light toward its source position.
[ embodiment 4 ]
The non-contact operation device 4 according to embodiment 4 differs from the non-contact operation device 1 of embodiment 1 in that it has no driving unit and instead displays image information held by the control unit 50 on the display entity 10. Fig. 8 is an example block diagram of the non-contact operation device.
In the non-contact operation device 4 of this embodiment, the display entity 10 displays the operation target images stereoscopically by means of an optical illusion, and the control unit 50 controls the display entity 10 so that the operated operation target image changes according to the operation, thereby performing visual feedback. This difference is mainly described below.
As shown in Fig. 8, the display entity 10 consists of a variable display entity 16 and a lenticular lens 17. The variable display entity 16 is electrically connected to the control unit 50 and, in accordance with an output signal S3 from it, can display a plurality of operation target images. The lenticular lens 17 is an array of minute cylindrical lenses.
The control unit 50 holds information on the operation target images displayed by the variable display entity 16. One such image is an illusion image in which the view of the display entities 11 to 14 of Fig. 1B as seen by the right eye and the view as seen by the left eye are arranged alternately (hereinafter, illusion image A). The interval of this alternating arrangement equals the pitch of the cylindrical lenses of the lenticular lens 17.
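Composing such an interleaved image is the standard two-view lenticular technique: column bands of the left-eye and right-eye views alternate at the lens pitch. A sketch, with the pitch and image sizes assumed (the patent gives no pixel-level detail):

```python
# Illustrative two-view interleaving; pitch and sizes are assumptions.
import numpy as np

def interleave(left: np.ndarray, right: np.ndarray, pitch_px: int = 1) -> np.ndarray:
    """Alternate column bands of the two views, one band per cylindrical lens."""
    assert left.shape == right.shape
    out = left.copy()
    for x in range(pitch_px, left.shape[1], 2 * pitch_px):
        out[:, x : x + pitch_px] = right[:, x : x + pitch_px]
    return out

left = np.zeros((4, 8), dtype=np.uint8)        # left-eye view (all 0)
right = np.full((4, 8), 255, dtype=np.uint8)   # right-eye view (all 255)
print(interleave(left, right))                 # columns alternate 0, 255, 0, 255, ...
```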
The control unit 50 also holds a plurality of illusion images for when the operation target images of the display entities 11 to 14 are operated. It holds illusion images of the display entities 11 to 14 for the aerial image 70 as formed in Fig. 2C, and for the intermediate states between Figs. 2B and 2C: a continuous series of illusion images in which the display entity 13 appears deeper than the other display entities 11, 12, and 14, with its depth differing slightly from one image to the next. The control unit 50 holds such series not only for the display entity 13 but also for each of the other display entities 11, 12, and 14 moving to the deep position.
(Operation of the non-contact operation device 4)
When the non-contact operation device 4 is powered on and operating, the control unit 50 outputs to the variable display entity 16 the output signal S3 for displaying illusion image A, and the variable display entity 16 displays it. The lenticular lens 17 divides the displayed illusion image into the right-eye and left-eye views of the display entities 11 to 14 of Fig. 1B, emitting them in the different directions 16a and 16b shown in Fig. 8.
When the operator views the display entity 10 from the front, the right eye sees the images of the display entities 11 to 14 from slightly right of front and the left eye sees them from slightly left of front, so the operator perceives a stereoscopic image.
The imaging member 31 displays the aerial image of the display entity 10 of the non-contact operation device 4 in the air, so the operator visually recognizes an aerial image like that of Fig. 2A.
Next, the operator moves the finger 200 and touches the aerial image 73 with it, for example, as shown in Fig. 2B. The operator's finger 200 is then located directly above the distance sensor 43, and the distance sensor 43 detects its presence. The control unit 50 makes the following operation determination: if the detection signal S1 is input continuously for a predetermined time, the operator is judged to be operating.
The operator then performs a pressing operation on the aerial image 73 with the finger 200, and the distance sensor 43 detects the finger's position. On receiving the detection signal S1, the control unit 50 outputs to the variable display entity 16 the output signal S3 of the illusion image corresponding to the position of the finger 200. The aerial image 70 thereby becomes one in which the aerial image 73 has moved in the pressing direction; like the display entity 10 displayed stereoscopically by the illusion, this aerial image 70 is perceived stereoscopically by the operator.
As the control unit 50 keeps outputting to the variable display entity 16 the output signal S3 of the illusion image matching the detected position of the finger 200, the aerial image 73 within the aerial image 70 is perceived by the operator as if it were pressed by, and moving together with, the finger 200.
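Selecting the matching illusion image amounts to quantizing the detected press depth into the stored series of frames. A sketch, with the frame count and full travel assumed for illustration (the patent specifies neither):

```python
# Illustrative frame selection; frame count and travel are assumptions.
def frame_index(press_depth_mm: float, travel_mm: float = 15.0, n_frames: int = 10) -> int:
    """Pick the stored illusion image matching the detected press depth."""
    depth = min(max(press_depth_mm, 0.0), travel_mm)
    return min(int(depth / travel_mm * n_frames), n_frames - 1)

assert frame_index(0.0) == 0     # untouched: the base illusion image
assert frame_index(15.0) == 9    # fully pressed: the deepest frame
```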
When the operator's finger 200 has pushed toward the aerial image 73 through a predetermined distance, the distance sensor 43 detects its position. On receiving the detection signal S1, the control unit 50 turns on the "play" setting corresponding to the operation function of the aerial image 73.
As described above, the operations and effects described for embodiment 1 are also obtained in this embodiment, which performs visual feedback not by moving an aerial image through movement of the display entity but by changing a stereoscopic image so that the display entity appears to move.
The non-contact operation device 4 of this embodiment can present visual feedback to the operator without the actuator 20 serving as a driving unit. It therefore needs fewer components, has a simpler structure, and saves the component and manufacturing costs of a driving unit.
This embodiment has described illusion images corresponding to a pressing operation, but the invention is not limited to this. For example, as in the non-contact operation device 2 of embodiment 2, the illusion images may show the display entities 11 to 14 moving not only toward the depth but also toward the front.
This embodiment has shown the combination of a variable display entity and a lenticular lens as one example of a display entity that displays the operation target images stereoscopically by an optical illusion, but the invention is not limited to this. The non-contact operation device 4 may use another type of stereoscopic image formed by binocular parallax, or a stereoscopic image of some other kind.
The embodiments of the present invention have been described above, but they are merely examples and do not limit the invention defined in the claims. These novel embodiments can be implemented in various other forms, and various omissions, substitutions, and changes can be made without departing from the spirit of the invention.
In the above embodiments, the switch activation position is reached by pressing the aerial image through a predetermined distance from the position directly above the distance sensor, but the invention is not limited to this. For example, the activation position may be directly above the distance sensor; in that case, the device may move the aerial image toward the depth independently of the motion of the operator's finger 200. Alternatively, the activation position may lie in front of the position directly above the distance sensor; in that case, the device may move the aerial image to the finger position when the operator's finger 200 is present, and activate the switch when the finger then remains there for a predetermined time.
The combinations of features described in the embodiments are not all essential to the solution of the problem addressed by the invention, and the present invention is not limited to the embodiments above.
Description of the reference numerals
1, 2, 3, 4 ... non-contact operation device; 10, 11 to 15 ... display entity; 16 ... variable display entity; 17 ... lenticular lens; 20 ... actuator; 31, 32 ... imaging member; 33 ... beam splitter; 40, 41 to 44 ... distance sensor; 50 ... control unit; 70, 71 to 75 ... aerial image
Claims (10)
1. A non-contact operation device is characterized by comprising:
a display entity having at least one operation target image;
an optical system that generates an aerial image corresponding to the display entity;
a detection unit that detects an operator's operation on the operation target image displayed in the air as the aerial image; and
a control unit that performs visual feedback corresponding to the operation on at least the operation target image for which the operation is detected.
2. The non-contact operation device according to claim 1, wherein,
the device comprises a driving unit that moves the display entity, and
the control unit performs the visual feedback through movement of the aerial image caused by movement of the display entity by the driving unit.
3. The non-contact operation device according to claim 2, wherein,
the display entity has a plurality of appearance portions, each movable by the driving unit, and
each of the plurality of appearance portions has the operation target image.
4. The non-contact operation device according to claim 3, wherein,
the display entity comprises a plurality of display entities serving as the plurality of appearance portions, and
the plurality of display entities are self-luminous devices, namely LED panels or fluorescent display tubes.
5. The non-contact operation device according to any one of claims 1 to 4, wherein,
the display entity has a reference image serving as a comparison reference for the operation target image, and
the control unit performs the visual feedback by moving the operated operation target image and the reference image relative to each other.
6. The non-contact operation device according to claim 1, wherein,
the display entity displays the operation target image stereoscopically by means of an optical illusion, and
the control unit performs the visual feedback by controlling the display entity so that the operated operation target image changes according to the operation.
7. The non-contact operation device according to any one of claims 1 to 6, wherein,
a peripheral edge portion of the operation target image has a light-emitting appearance.
8. The non-contact operation device according to any one of claims 1 to 7, wherein,
the detection unit detects the operator's pressing operation on the operation target image of the aerial image, and
the control unit performs at least the visual feedback of moving the operation target image for which the operation is detected in the direction of the pressing operation.
9. The non-contact operation device according to any one of claims 1 to 8, wherein,
the detection unit detects the operator's approach preceding a pressing operation on the operation target image of the aerial image, and
the control unit performs the visual feedback of moving the operation target image for which the approach is detected in a direction opposite to the direction of the pressing operation.
10. The non-contact operation device according to any one of claims 1 to 9, wherein,
the detection unit is a distance sensor that measures a distance to a detection target used in the operator's operation.
Applications Claiming Priority (3)
| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2021-083332 | 2021-05-17 | | |
| JP2021083332A (JP2022176751A) | 2021-05-17 | 2021-05-17 | Non-contact operation device |
| PCT/JP2022/016000 (WO2022244515A1) | 2021-05-17 | 2022-03-30 | Non-contact operation device |
Publications (1)
| Publication Number | Publication Date |
|---|---|
| CN117321665A | 2023-12-29 |
Family
ID=84141215
Family Applications (1)
| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| CN202280034788.0A (pending) | Non-contact operation device | 2021-05-17 | 2022-03-30 |
Country Status (3)
| Country | Link |
|---|---|
| JP (1) | JP2022176751A |
| CN (1) | CN117321665A |
| WO (1) | WO2022244515A1 |
Family Cites Families (5)
| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| US9223442B2 * | 2013-01-10 | 2015-12-29 | Samsung Display Co., Ltd. | Proximity and touch sensing surface for integration with a display |
| KR101802430B1 * | 2015-01-15 | 2017-11-28 | 가부시키가이샤 아스카넷토 | Non-contact input device and method |
| JP2018097388A * | 2015-04-21 | 2018-06-21 | 株式会社村田製作所 | User interface apparatus and user interface system |
| JP7225586B2 * | 2018-07-10 | 2023-02-21 | オムロン株式会社 | Input device |
| JP2022007868A * | 2020-06-24 | 2022-01-13 | 日立チャネルソリューションズ株式会社 | Aerial image display input device and aerial image display input method |
- 2021-05-17: JP application JP2021083332A filed (published as JP2022176751A, pending)
- 2022-03-30: PCT application PCT/JP2022/016000 filed (published as WO2022244515A1)
- 2022-03-30: CN application CN202280034788.0A filed (pending)
Also Published As
Publication number | Publication date |
---|---|
JP2022176751A (en) | 2022-11-30 |
WO2022244515A1 (en) | 2022-11-24 |
Legal Events
| Date | Code | Title | Description |
|---|---|---|---|
| | PB01 | Publication | |
| | SE01 | Entry into force of request for substantive examination | |