WO2023079879A1 - Aerial operation device and start-up method - Google Patents

Aerial operation device and start-up method

Info

Publication number
WO2023079879A1
WO2023079879A1 (PCT/JP2022/036830, JP2022036830W)
Authority
WO
WIPO (PCT)
Prior art keywords
aerial
operation screen
sleep mode
display device
display
Prior art date
Application number
PCT/JP2022/036830
Other languages
English (en)
Japanese (ja)
Inventor
仁 中山
正孝 浜野
Original Assignee
株式会社村上開明堂
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by 株式会社村上開明堂
Publication of WO2023079879A1


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346 Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • H ELECTRICITY
    • H01 ELECTRIC ELEMENTS
    • H01H ELECTRIC SWITCHES; RELAYS; SELECTORS; EMERGENCY PROTECTIVE DEVICES
    • H01H35/00 Switches operated by change of a physical condition

Definitions

  • Japanese Unexamined Patent Application Publication No. 2016-95635 describes an aerial touch panel for a surgical simulator display system.
  • the aerial touch panel includes a display section, an aerial imaging panel as an imaging section, and an optical position input section.
  • the position input unit calculates the positional relationship between the position of the hand and the operation screen.
  • the position input unit transmits an operation input signal to the control unit when it is determined that the operator's hand is at the position of the operation screen. This allows the operator to operate the surgical simulator image without touching anything.
  • a settlement terminal device is described in JP-A-2016-51321.
  • the settlement terminal device has a main body and a lid, and a touch panel is provided on the upper surface of the lid.
  • the settlement terminal device detects an input operation on the screen. If no input operation is detected on the screen for a certain period of time or longer, the touch panel shifts to sleep mode.
  • the settlement terminal device cancels the sleep mode when the input operation is detected after shifting to the sleep mode.
  • Japanese Patent Application Laid-Open No. 2021-103493 describes an information processing system.
  • the information processing system has an aerial imaging device, a control device for controlling a user's personal computer, and a sensor for detecting the position of an object such as a hand in the space where the aerial image is displayed.
  • the sensor has first and second sensing areas, and the aerial imaging device forms first and second aerial images corresponding to the first and second sensing areas, respectively.
  • the sensor detects the object reaching the first detection area and the controller puts the personal computer into sleep mode.
  • the control device restarts the personal computer.
  • An aerial image forming system includes an aerial image forming device that forms an aerial image, a human detection sensor that forms a detection plane including the position where the aerial image is to be formed and detects the position of a person, and an image controller that determines the position at which the aerial image is formed. The image controller determines whether to form an aerial image, and when the aerial image forming device forms the aerial image, the aerial image forming system wakes from sleep.
  • Japanese Patent Application Laid-Open No. 2019-197251 describes a display system equipped with a head-mounted display (HMD).
  • the HMD includes a connection device and an image display section.
  • the image display unit is connected to each of the personal computer, the image output device, and the power supply device via the connection device.
  • the connection device has a control section and an operation section.
  • the control unit has a sleep function for stopping the display of the image on the image display unit. When the sleep function is disabled, the HMD remains in normal operating mode.
  • Each system described above has a sleep mode, and returns from sleep mode to operable mode.
  • an aerial operation device forms an operation screen including switches as an aerial image in the air, and equipment is operated on the operation screen.
  • An object of the present disclosure is to provide an aerial operation device and a start-up method that can return from sleep mode in an easy-to-understand manner.
  • An aerial operation device includes a display device that emits light to display information, an aerial imaging device that displays an operation screen with buttons as an aerial image by reflecting light from the display device multiple times, and a sensor for detecting an object approaching a button.
  • the display device has a sleep mode in which the operation screen is not displayed via the aerial imaging device, and an operable mode in which the operation screen is displayed via the aerial imaging device. The display device switches from the sleep mode to the operable mode and displays the operation screen in the air when the sensor detects the object.
  • the display device emits light, and the aerial imaging device reflects the light from the display device multiple times to display an operation screen with buttons as an aerial image.
  • the aerial operation device has a sensor that detects an object approaching the button.
  • the display device has a sleep mode in which the operation screen is not displayed and an operable mode in which the operation screen is displayed.
  • the display device switches from the sleep mode to the operable mode and displays the operation screen in the air when the sensor detects the object. By displaying the operation screen in the air when an object is detected, the operation screen can be displayed in an easy-to-understand manner when returning from the sleep mode to the operable mode.
  • since the operation screen as an aerial image is displayed at a position nearer to the user than the aerial imaging device, the operation screen can be displayed in an easy-to-understand manner when the sleep mode is canceled. Furthermore, since it is not necessary to touch the device when canceling the sleep mode, the aerial operation device can be excellent in terms of hygiene.
  • the display device may display a sleep mode release button in the air when it is in sleep mode.
  • the user can recognize that it is in sleep mode, and the sleep mode can be easily released by pressing the sleep mode release button.
  • the aerial operation device may be equipped with a plurality of sensors.
  • in this case, the display device displays the operation screen in the air when at least one of the plurality of sensors detects the object. When any one of the plurality of sensors detects an object, the mode shifts to the operable mode and the operation screen is displayed in the air, so the device can return from the sleep mode in an easy-to-understand manner.
  • the display device may switch from the sleep mode to the operable mode and display the operation screen in the air when the plurality of sensors detect a predetermined motion of the target object.
  • the sleep mode is released when the plurality of sensors detect a predetermined motion of the object. Since the sleep mode is not canceled unless the object performs the predetermined motion, unintentional display of the operation screen is suppressed. Therefore, the security of the aerial operation device can be enhanced.
  • a start-up method is a method for starting up an aerial operation device that includes a display device that emits light to display information, an aerial imaging device that reflects light from the display device multiple times to display an operation screen with buttons as an aerial image, and a sensor for detecting an object approaching a button.
  • the start-up method includes a step in which the display device enters a sleep mode in which the operation screen is not displayed via the aerial imaging device, and a step in which, when the sensor detects an object, the display device switches from the sleep mode to an operable mode and displays the operation screen in the air via the aerial imaging device.
  • an operation screen with buttons is displayed in the air as an aerial image by the aerial imaging device, and an object approaching a button in order to operate it is detected by the sensor.
  • the display device has a sleep mode in which the operation screen is not displayed and an operable mode in which the operation screen is displayed.
  • when the sensor detects the object, the display device switches from the sleep mode to the operable mode and displays the operation screen in the air. Since the operation screen is displayed in the air when the object is detected, the operation screen can be displayed in an easy-to-understand manner when the sleep mode is switched to the operable mode.
  • since the operation screen as an aerial image is displayed at a position nearer to the user than the aerial imaging device, the operation screen can be displayed in an easy-to-understand manner when the sleep mode is canceled. Furthermore, since it is not necessary to touch the device when canceling the sleep mode, the start-up method can be excellent from a hygienic point of view.
  • FIG. 1 is a perspective view showing an application example of an aerial operation device according to an embodiment.
  • FIG. 2 is a perspective view showing a state in which the aerial operation device according to the embodiment displays an operation screen with buttons.
  • FIG. 3 is a vertical cross-sectional view showing the aerial operation device according to the embodiment.
  • FIG. 4 is a block diagram showing functions of the aerial operation device according to the embodiment.
  • FIG. 5 is a diagram schematically showing a sensor and a press determination surface according to the embodiment.
  • FIG. 6 is a diagram schematically showing an example of detection of an object by a sensor.
  • FIG. 7 is a flow chart showing an example of the steps of a start-up method according to the embodiment.
  • FIG. 8 is a perspective view showing an aerial operation device according to a modification.
  • FIG. 9 is a vertical cross-sectional view showing the aerial operation device according to the modification.
  • FIG. 10 is a diagram schematically showing a sleep mode release button displayed by the aerial operation device according to the modification.
  • FIG. 1 is a perspective view showing an aerial operation device 1 as an example according to this embodiment.
  • FIG. 2 is a perspective view showing a state in which the aerial operation device 1 displays an operation screen 10 with buttons 11 in the air.
  • the aerial operation device 1 according to this embodiment is an operation input device for operating equipment.
  • the aerial operation device 1 is provided in the toilet T as an example, and the button 11 of the operation screen 10 is a switch for operating each part of the toilet T.
  • An operation screen 10 with buttons 11 is a floating image displayed in the air. That is, the operation screen 10 with the button 11 is displayed as an aerial image. Therefore, the user of the aerial operation device 1 does not have to directly touch the buttons and the like, so that each part of the toilet T can be operated hygienically.
  • the toilet T includes a toilet bowl B and a wall portion W adjacent to the toilet bowl B.
  • the wall portion W is provided on the right side as viewed from the user seated on the toilet bowl B.
  • An exemplary wall portion W has the aerial operation device 1 and toilet paper E fixed thereto, and a washbasin M may be provided.
  • the toilet bowl B has, for example, a washlet (registered trademark), and the washlet of the toilet T can be operated from the button 11 on the operation screen 10 .
  • the operation screen 10 has a plurality of buttons 11, and the user can operate each part of the washlet by operating the plurality of buttons 11.
  • the aerial operation device 1 is fixed to the wall W, for example.
  • the aerial operation device 1 may be embedded in the wall W. Further, the aerial operation device 1 may be detachable from the wall W.
  • An exemplary aerial operation device 1 includes a housing 2 and a protrusion 3 protruding from the bottom of the housing 2 .
  • a plurality of sensors 12 for detecting an object F approaching the button 11 are embedded in the projecting portion 3 .
  • "the sensor detects an object" means that the sensor irradiates the object with light and receives the light reflected back from the object.
  • object indicates an object that operates the button 11 to operate the device.
  • the "object” is the user's finger.
  • the object is not limited to the finger, and may be a stick-shaped object such as a pen or something other than a stick-shaped object.
  • the sensors 12 are provided corresponding to the buttons 11 , and each button 11 is displayed obliquely above each sensor 12 . As an example, the number of buttons 11 and sensors 12 is seven.
  • the operation screen 10 is displayed so as to be oriented obliquely upward in both the first direction D1 and the third direction D3.
  • the first direction D1 is the vertical direction of the aerial operation device 1
  • the third direction D3 is the direction of projection from the wall portion W.
  • the operation screen 10 has, for example, a shape elongated in the second direction D2.
  • the second direction D2 is a direction that intersects (perpendicularly as an example) both the first direction D1 and the third direction D3.
  • the operation screen 10 is displayed in a rectangular shape.
  • the vertical length L1 of the operation screen 10 is shorter than the horizontal length L2 of the operation screen 10 .
  • the aspect ratio of length L1 and length L2 is 1:X (where X is a real number)
  • the value of X is 2 or more and 10 or less, for example.
  • the lower limit of the value of X may be three.
  • the upper limit of the value of X may be 5, 6, 7, 8 or 9.
  • the above value of X may be 4 and the aspect ratio of length L1 and length L2 may be 1:4.
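A brief worked illustration of the aspect-ratio relation above; the vertical length used here is an assumed value for the example only, not a dimension given in the source.

```python
# Illustrative only: the operation screen 10 has an aspect ratio of 1:X with
# 2 <= X <= 10; L1_mm below is an assumed vertical length, not a value from the source.
L1_mm = 50           # assumed vertical length L1 of the operation screen
X = 4                # example aspect ratio 1:4 mentioned in the text
L2_mm = X * L1_mm    # horizontal length L2: 200 mm for this assumed L1
```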
  • the button 11 has, for example, a rectangular shape. As an example, button 11 may be square-shaped. The length of one side of the button 11 is, for example, 18 mm or more and 24 mm or less. However, the shape and size of the button 11 can be changed as appropriate.
  • the operation screen 10 may display items other than the plurality of buttons 11.
  • the operation screen 10 may include multiple buttons 11 and character information 13 .
  • the character information 13 may include, for example, at least one of date information, time information, temperature information, and atmospheric pressure information.
  • buttons 11 may be displayed horizontally on the lower side of the operation screen 10 and text information 13 may be displayed above the plurality of buttons 11 .
  • the button 11 may be arranged so that the center 11b of the button 11 is shifted from the center 10b of the operation screen 10.
  • each of the plurality of buttons 11 is displayed at a position offset from a reference line V passing through the center 10b and extending in the longitudinal direction (second direction D2) of the operation screen 10.
  • FIG. 3 is a cross-sectional view schematically showing the internal structure of the aerial operation device 1.
  • the housing 2 has a box shape extending horizontally.
  • the housing 2 has a front surface 2b, a pair of side surfaces 2c, an upper surface 2d, and a lower surface 2f.
  • the front surface 2b extends in the first direction D1 and the second direction D2 when attached to the wall portion W.
  • the side surface 2c extends in the first direction D1 and the third direction D3.
  • the upper surface 2d extends in the second direction D2 and the third direction D3.
  • the lower surface 2f faces the opposite side of the upper surface 2d.
  • the projecting portion 3 has, for example, a rectangular plate shape.
  • the protruding portion 3 includes a main surface 3b that protrudes from the lower side of the front surface 2b and faces upward, an end surface 3c that faces the second direction D2 or the third direction D3, and a back surface 3d that faces the opposite side of the main surface 3b.
  • the sensor 12 is built in the projecting portion 3 . This makes it easier for the sensor 12 to detect the object F.
  • the aerial operation device 1 includes an aerial imaging device 14 that displays an operation screen 10 with buttons 11 , a display device 15 arranged inside the housing 2 , and a control section 20 .
  • the aerial imaging device 14 and the display device 15 correspond to an aerial image display unit that displays the operation screen 10 as an aerial image in the air.
  • the aerial imaging device 14 is, for example, a retroreflecting member (retroreflecting mirror) fixed to an opening 2g located inside the front surface 2b of the housing 2 .
  • the opening 2g is defined by, for example, a pair of first plate portions 2h forming the front surface 2b of the housing 2 and a pair of second plate portions 2j fixed to the first plate portions 2h on the inside of the housing 2.
  • the first plate portion 2h and the second plate portion 2j are provided as a pair of upper and lower plates.
  • the aerial imaging device 14 is fixed to the housing 2 while being sandwiched between the first plate portion 2h and the second plate portion 2j.
  • the aerial imaging device 14 has, for example, a rectangular shape with long sides extending in the second direction D2 and short sides extending in the first direction D1. Each of one long side and the other long side of the aerial imaging device 14 is sandwiched between the first plate portion 2h and the second plate portion 2j.
  • the display device 15 is arranged obliquely with respect to the aerial imaging device 14 .
  • the display device 15 is a liquid crystal display (LCD).
  • the display device 15 is arranged diagonally above the aerial imaging device 14 inside the housing 2 .
  • the display device 15 has a screen for displaying images.
  • the screen of the display device 15 irradiates the light C as an image obliquely downward toward the aerial imaging device 14, for example.
  • the aerial imaging device 14 internally reflects the light C from the display device 15 a plurality of times (for example, twice) to image the operation screen 10 in a space on the front side of the aerial imaging device 14, where the user of the aerial operation device 1 performs operations.
  • the sensor 12 may be exposed on the main surface 3b of the projecting portion 3, for example.
  • sensor 12 is a depth sensor.
  • the sensor 12 is provided on a virtual straight line extending from the button 11, that is, at a front position with respect to the button 11, which is an aerial image.
  • the sensor 12 acquires distance image data including information on the position (two-dimensional position) of the object F on a plane perpendicular to the virtual straight line and information on the distance K from the sensor 12 to the object F.
  • the sensor 12 outputs the acquired distance image data to the control unit 20 at a predetermined cycle (for example, 1/30 second).
  • the sensor 12 irradiates each point on an object existing within an imaging area including the object F with light rays (or infrared rays), and receives the light rays reflected from each point on the object. Then, the sensor 12 measures the distance between the sensor 12 and each point on the object based on the received light rays, and outputs the measured distance for each pixel.
  • the distance between the sensor 12 and each point on the object may be measured by, for example, the Light Coding method.
  • the sensor 12 irradiates each point on an object existing within an imaging area including the object F with a light beam in a random dot pattern. The sensor 12 then measures the distance between the sensor 12 and each point on the object by receiving rays reflected from each point on the object and detecting distortions in the pattern of the reflected rays. The sensor 12 detects the two-dimensional position information of each point on the object and the distance information from the sensor 12 to each point on the object as a plurality of pixels, and outputs the detected plurality of pixels to the control unit 20.
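A minimal sketch of how the distance image data described above might be reduced to a single object distance per frame on the control unit side. The array layout, the helper name, and the use of NumPy are assumptions made for illustration; the source only states that the sensor 12 outputs per-pixel distances to the control unit 20 at a predetermined cycle (for example, 1/30 second).

```python
import numpy as np

FRAME_PERIOD_S = 1 / 30  # predetermined output cycle given as an example in the text

def nearest_object_distance(distance_image: np.ndarray) -> float | None:
    """Return the smallest valid per-pixel distance in one frame, or None.

    distance_image holds, for each pixel, the measured distance from the
    sensor 12 to a point on an object in its imaging area; 0 marks pixels
    with no usable return.
    """
    valid = distance_image[distance_image > 0]
    if valid.size == 0:
        return None            # no object detected in this frame
    return float(valid.min())  # distance K to the closest point of the object F
```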
  • the control unit 20 can communicate with each of the sensor 12 and the display device 15.
  • the control unit 20 includes, for example, a CPU (Central Processing Unit) that executes programs, a storage unit including ROM (Read Only Memory) and RAM (Random Access Memory), an input/output unit, and a driver.
  • Each function of the control unit 20 is realized by operating the input/output unit under the control of the CPU to read or write data in the storage unit.
  • the form and location of the control unit 20 are not particularly limited.
  • FIG. 4 is a functional block diagram of the control unit 20.
  • the control unit 20 includes, as functional components, an image output unit 21, an object detection unit 22, a recognition unit 23, a signal output unit 24, and a notification unit 25.
  • the image output unit 21 outputs to the display device 15 a control signal for image data of an image to be displayed on the display device 15 .
  • the display device 15 can display various types of images based on image data control signals from the image output unit 21 .
  • the object detection unit 22 detects the object F based on the distance image data output from the sensor 12 .
  • "the object detection unit detects the object" means that, when the sensor detects the object, the distance image data is output from the sensor to the control unit, and the object detection unit detects the object by receiving the distance image data.
  • the target object detection unit 22 outputs position data indicating the position of the target object F to the recognition unit 23 .
  • the recognition unit 23 recognizes that the button 11 has been pressed by the object F based on the position data output from the object detection unit 22 .
  • the notification unit 25 receives an operation signal from the recognition unit 23 when the recognition unit 23 recognizes that the button 11 has been pressed.
  • the notification unit 25 is notification means for notifying the user of the air control device 1 that the button 11 has been operated when the recognition unit 23 recognizes that the button 11 has been operated.
  • the notification unit 25 has, for example, an audio output unit 25b and a color change unit 25c.
  • the audio output unit 25b is a speaker, and outputs audio when an operation signal is received from the recognition unit 23.
  • the user can understand that the button 11 has been operated by listening to the voice from the voice output unit 25b.
  • the color change unit 25c, for example, generates a color change signal and outputs it to the display device 15 when receiving an operation signal from the recognition unit 23.
  • when the display device 15 receives the color change signal from the color change unit 25c, it changes the color of the button 11 pressed by the user, for example. By visually recognizing that the color of the button 11 has changed, the user can recognize that the button 11 has been operated. Note that the display device 15 may change the color of portions other than the button 11 when receiving the color change signal from the color change unit 25c.
  • by providing the notification unit 25 as described above, the user can understand that the button 11 has been operated. However, at least one of the audio output unit 25b and the color change unit 25c may be omitted from the notification unit 25.
  • the notification unit 25 may notify the user that the button 11 has been operated in a manner different from the voice output by the voice output unit 25b or the color change by the color change unit 25c.
  • the signal output unit 24 generates a control signal based on the pressing operation of the button 11 when the recognition unit 23 recognizes that the button 11 has been pressed.
  • the signal output unit 24 transmits the generated control signal to the device 30 external to the aerial operation device 1 in order to operate the device 30.
  • the device 30 may be, for example, a toilet bowl B with a washlet (see FIG. 1), each part of the washlet, or a washstand M.
  • the device 30 may be a toilet bowl B, and the toilet bowl B may operate to clean the toilet bowl when receiving a control signal from the signal output unit 24 .
  • each part of the toilet T can be operated by operating the button 11 displayed as an aerial image. Therefore, since it is not necessary to directly (physically) press a button or the like for the operation of each part of the toilet T, the operation can be realized in a sanitary manner.
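The functional units 21 to 25 described above can be read as a small per-button processing pipeline. The sketch below is an illustration under assumed names and signatures (none of these identifiers appear in the source): the object detection unit yields a distance, the recognition unit compares it with a press threshold, and a recognized press drives both the notification unit and the signal output unit.

```python
from dataclasses import dataclass

@dataclass
class Button:
    name: str
    press_threshold_mm: float  # threshold Y for the sensor 12 paired with this button

def handle_frame(button: Button, distance_mm: float | None,
                 notify_audio, change_color, send_control_signal) -> None:
    """One processing cycle of the control unit 20 for a single button/sensor pair.

    notify_audio and change_color stand in for the notification unit 25
    (audio output unit 25b and color change unit 25c); send_control_signal
    stands in for the signal output unit 24 driving the external device 30.
    """
    if distance_mm is None:
        return  # object detection unit 22: no object F in this frame
    if distance_mm <= button.press_threshold_mm:
        # recognition unit 23: the object F has reached the press determination surface Z
        notify_audio(f"{button.name} operated")
        change_color(button.name)
        send_control_signal(button.name)  # operate the external device 30
```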
  • the recognition unit 23 may determine whether the distance K between the sensor 12 and the object F is equal to or less than a threshold Y1, and whether the distance K between the sensor 12 and the object F is equal to or less than a threshold Y2 that is smaller than the threshold Y1. In this case, the recognition unit 23 determines whether or not the object F has reached a first surface Z1 at which the distance K from the sensor 12 equals the threshold Y1, and whether or not the object F has reached a second surface Z2 at which the distance K from the sensor 12 equals the threshold Y2.
  • Each of the first surface Z1 and the second surface Z2 is a virtual surface formed at a constant distance from the sensor 12, and serves as a press determination surface for determining whether the button 11 has been pressed. As shown in FIGS. 3 and 5, each of the first surface Z1 and the second surface Z2 has, for example, a circular shape. Each of the first surface Z1 and the second surface Z2 is provided at a position near the button 11, for example. At least one of the first surface Z1 and the second surface Z2 may match the position of the button 11, or neither may match the position of the button 11.
  • the threshold value Y1 may be the same value as the threshold value Y2, and the first surface Z1 and the second surface Z2 may be the same surface. An example in which the threshold value Y1 is the same as the threshold value Y2 and the first surface Z1 and the second surface Z2 are the same press determination surface Z will be described below.
  • as an example, the value of the threshold Y is the same for each of the plurality of sensors 12. However, the values of the threshold Y may differ from one another among the plurality of sensors 12.
  • the threshold Y value for the sensor 12 corresponding to a particular button 11 may be smaller than the threshold Y value for the sensor 12 corresponding to another button 11 .
  • the specific button 11 cannot be pressed unless the object F is moved closer to the sensor 12 than the other buttons 11 . Therefore, the possibility of erroneously operating the specific button 11 can be reduced.
  • the specific button includes, for example, a button used in an emergency (emergency button).
  • the value of the threshold Y in the sensor 12 corresponding to a specific button may be greater than the value of the threshold Y in the sensor 12 corresponding to another button 11.
  • the specific button 11 can be pressed even if the object F is farther from the sensor 12 than the other buttons 11 . Therefore, the specific button 11 can be made easier to operate than the other buttons 11 .
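A short sketch of the per-sensor threshold idea: an emergency button gets a smaller threshold Y so that it is harder to operate by accident, while a frequently used button can get a larger one. The button names and millimetre values are assumptions for illustration; only the notion of per-sensor thresholds comes from the description.

```python
# Threshold Y per button in millimetres (illustrative assumptions only).
PRESS_THRESHOLDS_MM = {
    "emergency_stop": 20.0,  # smaller Y: the object F must come closer to the sensor 12
    "flush":          40.0,
    "spray":          40.0,
    "bidet":          40.0,
    "dry":            40.0,
    "water_temp":     40.0,
    "seat_temp":      60.0,  # larger Y: this button is made easier to operate
}

def is_pressed(button: str, distance_mm: float) -> bool:
    """Press determination: has the object F reached the surface Z for this button?"""
    return distance_mm <= PRESS_THRESHOLDS_MM[button]
```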
  • the mode switching unit 16 may switch to the sleep mode when power is supplied to the aerial operation device 1 (or when the power is turned on). Alternatively, the mode switching unit 16 may set the operable mode when power is supplied to the aerial operation device 1. In the sleep mode, the light C may not be emitted from the display device 15 and nothing may be displayed. Alternatively, in the sleep mode, the display device 15 may display only the sleep mode release button. Since the operation screen 10 is not displayed in the sleep mode, the equipment 30 cannot be operated with the aerial operation device 1.
  • the manner in which the sleep mode is returned to the operable mode is not limited to the manner described above.
  • the mode switching unit 16 may switch from the sleep mode to the operable mode when the plurality of sensors 12 detect the target F.
  • the mode switching unit 16 may switch from the sleep mode to the operable mode when the two sensors 12 arranged along the second direction D2 detect the target F.
  • the method of canceling sleep mode is not limited to the above example.
  • the mode switching unit 16 may switch from the sleep mode to the operable mode when the sensor 12 detects an object F moving away from the sensor 12.
  • Various methods can be employed for switching from the sleep mode to the operable mode.
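The alternative return-from-sleep conditions mentioned above can be pictured as interchangeable predicates evaluated by the mode switching unit. The function names and data layout below are assumptions used only to make the alternatives concrete.

```python
def any_sensor_detects(distances: list[float | None]) -> bool:
    """Wake when at least one of the plurality of sensors 12 detects the object F."""
    return any(d is not None for d in distances)

def two_adjacent_sensors_detect(distances: list[float | None]) -> bool:
    """Wake when two sensors arranged along the second direction D2 both detect."""
    return any(a is not None and b is not None
               for a, b in zip(distances, distances[1:]))

def object_moving_away(prev_mm: float | None, curr_mm: float | None) -> bool:
    """Wake when the sensor 12 detects the object F moving away from it."""
    return prev_mm is not None and curr_mm is not None and curr_mm > prev_mm
```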
  • FIG. 7 is a flow chart showing an example of the steps of the start-up method according to this embodiment.
  • An example of a method for starting up the aerial operation device 1 will be described below with reference to FIG.
  • the aerial operation device 1 is powered on.
  • the aerial operation device 1 is activated while the display device 15 is in sleep mode (the step of entering sleep mode, step S1).
  • the control unit 20 determines whether or not the sensor 12 has detected the target object F (step of determining whether or not the sensor has detected the target object; step S2).
  • the detection of the object F by the sensor 12 in step S2 may be different from the detection that the object F has reached the press determination surface Z.
  • the sensor 12 detects whether or not the object F exists in a predetermined area farther from the sensor 12 than the press determination surface Z.
  • when the control unit 20 determines that the sensor 12 has not detected the object F in the predetermined area, the series of steps ends.
  • when the sensor 12 detects the object F, the control unit 20 determines whether or not the object F exists within a set area (for example, on the press determination surface Z) (step S3).
  • in step S3, for example, when the control unit 20 determines that there is no object F on the press determination surface Z, the series of steps is completed. When the control unit 20 determines that the object F is present on the press determination surface Z, the display device 15 displays the operation screen 10 in the air via the aerial imaging device 14 (step of displaying the operation screen in the air, step S4), and the mode switching unit 16 switches from the sleep mode to the operable mode (step of switching to the operable mode, step S5). The series of steps of the start-up method is completed through the above steps.
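A compact sketch of one pass through the flow of FIG. 7 (steps S1 to S5). The helper names and the mode enum are assumptions; only the ordering of the steps follows the description: start in the sleep mode, check whether the sensor 12 has detected the object F, check whether the object F is within the set area (for example, the press determination surface Z), then display the operation screen 10 and switch to the operable mode.

```python
import enum

class Mode(enum.Enum):
    SLEEP = "sleep"        # the operation screen 10 is not displayed
    OPERABLE = "operable"  # the operation screen 10 is displayed as an aerial image

def start_up(read_distance_mm, in_set_area, show_operation_screen) -> Mode:
    """One pass of the start-up flow of FIG. 7 (steps S1 to S5)."""
    mode = Mode.SLEEP                  # step S1: enter the sleep mode
    d = read_distance_mm()             # step S2: has the sensor 12 detected the object F?
    if d is None:
        return mode                    # no detection: the series of steps ends
    if not in_set_area(d):             # step S3: is F within the set area (surface Z)?
        return mode                    # not in the set area: the series of steps ends
    show_operation_screen()            # step S4: display the operation screen 10 in the air
    return Mode.OPERABLE               # step S5: switch to the operable mode
```

In practice this pass would presumably be repeated every sensor output cycle for as long as the device remains in the sleep mode.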
  • the display device 15 emits the light C, and the aerial imaging device 14 reflects the light C from the display device 15 multiple times to display the operation screen 10 with the buttons 11 as an aerial image.
  • the aerial operating device 1 includes a sensor 12 that detects an object F approaching the button 11 .
  • the display device 15 has a sleep mode in which the operation screen 10 is not displayed and an operable mode in which the operation screen 10 is displayed.
  • when the sensor 12 detects the object F, the display device 15 switches from the sleep mode to the operable mode and displays the operation screen 10 in the air.
  • the operation screen 10 can be displayed in an easy-to-understand manner when returning from the sleep mode to the operable mode. That is, since the operation screen 10 as an aerial image is displayed at a position nearer to the user than the aerial imaging device 14, the operation screen 10 can be displayed in an easy-to-understand manner when the sleep mode is canceled. Furthermore, since it is not necessary to touch the device when canceling the sleep mode, the aerial operation device 1 can be excellent in terms of hygiene.
  • the display device 15 may display the sleep mode release button 10A in the air when in sleep mode. In this case, the user can recognize that it is in sleep mode, and can easily release the sleep mode by pressing the sleep mode release button 10A.
  • the aerial operation device 1 includes a plurality of sensors 12. Therefore, when at least one of the plurality of sensors 12 detects the object F, the display device 15 displays the operation screen 10 in the air. Therefore, when any one of the plurality of sensors 12 detects the object F, the mode shifts to the operable mode and the operation screen 10 is displayed in the air.
  • the display device 15 may switch from the sleep mode to the operable mode and display the operation screen 10 in the air when the plurality of sensors 12 detect a predetermined action of the object F (for example, an action of approaching the sensor 12 and then moving in the second direction D2).
  • the sleep mode is released when the plurality of sensors 12 detect a predetermined motion of the object F.
  • since the sleep mode is not canceled unless the object F performs the predetermined action, unintentional display of the operation screen 10 is suppressed. Therefore, the security of the aerial operation device 1 can be enhanced.
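A sketch of the predetermined-action variant: the object F approaches one sensor and then moves along the second direction D2, so neighbouring sensors in that row detect it one after another. The event format and the time window are assumptions made for the illustration.

```python
def detected_wake_gesture(events: list[tuple[float, int]],
                          max_gap_s: float = 1.0) -> bool:
    """Return True for an approach followed by movement along the second direction D2.

    events: (timestamp_s, sensor_index) pairs, one per detection, where
    sensor_index increases along the second direction D2.
    """
    for (t0, s0), (t1, s1) in zip(events, events[1:]):
        # a neighbouring sensor fires shortly after the first: treat this as the
        # predetermined motion of the object F and release the sleep mode
        if abs(s1 - s0) == 1 and 0 < t1 - t0 <= max_gap_s:
            return True
    return False
```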
  • FIG. 8 is a perspective view showing an aerial operation device 41 according to a modification.
  • FIG. 9 is a vertical cross-sectional view schematically showing the internal structure of the aerial operation device 41.
  • the aerial operation device 41 displays a display surface 50 having a plurality of buttons 51, which are numeric keys, in the air as an aerial image.
  • the aerial operation device 41 has a columnar shape.
  • the aerial operation device 41 extends in a first direction A1, a second direction A2 and a third direction A3.
  • the first direction A1 corresponds to the depth direction (front-rear direction) of the aerial operation device 41 .
  • the second direction A2 corresponds to the width direction of the aerial operation device 41 .
  • a third direction A3 corresponds to the height direction of the aerial operation device 41 .
  • the length (height) of the aerial operation device 41 in the third direction A3 is longer than the length (depth) of the aerial operation device 41 in the first direction A1 and the length (width) of the aerial operation device 41 in the second direction A2. This makes it possible to further improve the visibility of the display surface 50 for the user.
  • the aerial operation device 41 includes a housing 42 that constitutes the outer surface of the aerial operation device 41 .
  • the housing 42 has, for example, an upper surface 42b, a pair of side surfaces 42c, a front surface 42d, a rear surface 42f, and a lower surface 42g.
  • the upper surface 42b faces vertically upward, and the pair of side surfaces 42c are arranged along the second direction A2.
  • the front surface 42d faces one side in the first direction A1, and the rear surface 42f faces the other side in the first direction A1.
  • the lower surface 42g faces vertically downward.
  • the housing 42 has an inclined surface 42h located above the front surface 42d.
  • An opening through which the aerial imaging device 54 is exposed is formed in the inclined surface 42h.
  • This opening has, for example, a rectangular shape.
  • the inclined surface 42h is directed obliquely upward, for example.
  • the display surface 50 imaged by the aerial imaging device 54 is displayed at a position closer to the user than the inclined surface 42h. Therefore, the visibility of the display surface 50 can be improved.
  • a button 51 on the display surface 50 is a button for operating an automated teller machine (ATM).
  • a display surface 50 with a button 51 is a floating image displayed in the air. That is, the display surface 50 with the button 51 is displayed as an aerial image. Therefore, the user of the aerial operation device 41 does not have to directly touch a touch panel or the like, and can therefore operate the automated teller machine hygienically.
  • the display surface 50 may be illuminated, and in this case, the display surface 50 can be easily viewed regardless of the brightness of the surroundings.
  • the display surface 50 has, for example, a shape that extends vertically. As an example, the display surface 50 is displayed in a rectangular shape, and the vertical length of the display surface 50 is longer than the horizontal length of the display surface 50 .
  • the shape and size of the display surface 50 are not limited to the above examples and are not particularly limited.
  • the display surface 50 may display items other than the button 51 .
  • the display surface 50 may include a plurality of buttons 51 and character information. Character information may include at least one of date information and time information, for example.
  • buttons 51 are displayed in a grid pattern.
  • the plurality of buttons 51 are displayed as numeric keys. In this case, by pressing the button 51 of the air control device 41, it becomes possible to designate the withdrawal amount, the deposit amount, the transfer amount, or the personal identification number for the automated teller machine.
  • the shape of the button 51 is, for example, a square shape (a rectangular shape with rounded corners as an example). However, the shape of the button 51 may be circular, oval, or polygonal, and is not particularly limited.
  • the button 51 may include at least one of an enter button and a return button. In this case, the enter button corresponds to a button for deciding an operation, and the return button corresponds to a button for returning to the previous operation.
  • the type of button 51 is not limited to the above examples and is not particularly limited.
  • a display surface with buttons is displayed as an aerial image.

Abstract

According to an embodiment of the present invention, an aerial operation device comprises a display device that emits light to display information, an aerial imaging device that reflects the light from the display device multiple times to form an aerial image and display an operation screen with buttons, and a sensor that detects an object F approaching any one of the buttons. The display device has a sleep mode in which the operation screen is not displayed via the aerial imaging device and an operable mode in which the operation screen is displayed via the aerial imaging device. When the sensor detects the object, the display device switches from the sleep mode to the operable mode to display the operation screen in the air.
PCT/JP2022/036830 2021-11-04 2022-09-30 Dispositif de commande aérienne et procédé de démarrage WO2023079879A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2021180315 2021-11-04
JP2021-180315 2021-11-04

Publications (1)

Publication Number Publication Date
WO2023079879A1 (fr)

Family

ID=86241319

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2022/036830 WO2023079879A1 (fr) 2021-11-04 2022-09-30 Dispositif de commande aérienne et procédé de démarrage

Country Status (1)

Country Link
WO (1) WO2023079879A1 (fr)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150277545A1 (en) * 2014-03-31 2015-10-01 Motorola Mobility, Llc Apparatus and Method for Awakening a Primary Processor Out of Sleep Mode
JP2016006447A (ja) * 2014-06-20 2016-01-14 船井電機株式会社 画像表示装置
WO2016047173A1 (fr) * 2014-09-24 2016-03-31 オリンパス株式会社 Système médical


Similar Documents

Publication Publication Date Title
CN108496181A (zh) 采用触摸屏来发起指纹获取
US20100026723A1 (en) Image magnification system for computer interface
CN106341569B (zh) 图像形成装置及该图像形成装置的控制方法
JP2022130496A (ja) 入力装置
CN112262451A (zh) 输入装置
JP2006209279A (ja) 入力装置および触読文字記号入力方法
US20230033280A1 (en) Operation input device
JP2022008645A (ja) 非接触入力システム及び方法
JP2020064632A (ja) 入力装置
WO2023079879A1 (fr) Dispositif de commande aérienne et procédé de démarrage
JP2013186827A (ja) 操作装置
KR20170007573A (ko) 레이저를 이용한 비접촉식 버튼 입력 장치
WO2022113685A1 (fr) Dispositif d'actionnement aérien
WO2022113687A1 (fr) Appareil d'exploitation d'antenne
JP7272764B2 (ja) 情報提供システム
WO2021260989A1 (fr) Dispositif d'entrée d'affichage d'image aérienne et procédé d'entrée d'affichage d'image aérienne
JP2022539483A (ja) 非接触式タッチパネルシステム及びその制御方法、並びに既存のタッチスクリーンに装着可能な非接触式入力装置
TWI573043B (zh) The virtual two - dimensional positioning module of the input device
JP2000181602A (ja) スイッチ付処理システムおよびスイッチ装置
WO2022085346A1 (fr) Appareil d'affichage aérien
JP2003186621A (ja) タッチパネルおよびそのタッチパネルを備えた装置
WO2023095465A1 (fr) Dispositif de détection d'opération
WO2021220623A1 (fr) Dispositif d'entrée d'opération
JP2002317482A (ja) 衛生洗浄装置
WO2022181412A1 (fr) Mécanisme d'aide à la saisie et système de saisie

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22889703

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023557900

Country of ref document: JP