WO2016103522A1 - Control device, electronic device, control method, and program
- Publication number: WO2016103522A1 (international application PCT/JP2014/084716)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- display
- detection reference
- user
- aerial image
- detection
- Prior art date
Classifications
- G06F3/04164 — Connections between sensors and controllers, e.g. routing lines between electrodes and connection pads
- G06F3/017 — Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0304 — Detection arrangements using opto-electronic means
- G06F3/0325 — Detection arrangements using a plurality of light emitters, reflectors or detectors forming a reference frame from which to derive the orientation of the object, e.g. by triangulation
- G06F3/041 — Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
- G06F3/0418 — Control or interface arrangements for digitisers for error correction or compensation, e.g. based on parallax, calibration or alignment
- G06F3/0421 — Opto-electronic digitisers interrupting or reflecting a light beam, e.g. optical touch-screen
- G06F3/0425 — Opto-electronic digitisers using a single imaging device, such as a video camera, for tracking the absolute position of one or more objects with respect to an imaged reference surface
- G06F3/0428 — Opto-electronic digitisers sensing, at the edges of the touch surface, the interruption of optical paths parallel to the touch surface
- G06F3/044 — Digitisers characterised by capacitive transducing means
- G06F3/0444 — Capacitive digitisers using a single conductive element covering the whole sensing surface, e.g. sensing the electrical current flowing at the corners
- G06F3/04815 — Interaction with a metaphor-based environment or interaction object displayed as three-dimensional, e.g. changing the user viewpoint with respect to the environment or object
- G06F3/0488 — GUI interaction techniques using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04883 — Inputting data by handwriting on a touch-screen or digitiser, e.g. gesture or text
- G06F3/147 — Digital output to display device using display panels
- G09G3/003 — Control arrangements for visual indicators to produce spatial visual effects, e.g. projection systems
- G09G5/38 — Display of a graphic pattern with means for controlling the display position
- G06F2203/04101 — 2.5D digitiser: detects the X/Y position of the input means (finger or stylus) when proximate to the interaction surface and also measures its distance in the Z direction within a short range
- G06F2203/04108 — Touchless 2D digitiser: detects the X/Y position of the proximate input means without distance measurement in the Z direction
- G06F3/16 — Sound input; sound output
- G09G2320/0626 — Adjustment of display parameters for control of overall brightness
- G09G2320/066 — Adjustment of display parameters for control of contrast
- G09G2320/0693 — Calibration of display systems
- G09G2354/00 — Aspects of interface with display user
Definitions
- The present invention relates to a control device, an electronic device, a control method, and a program.
- Patent Document 1 discloses an electronic device that detects an operation on a three-dimensional object displayed in the air, using a capacitive touch sensor that calculates the distance between a finger and a touch panel.
- Although Patent Document 1 describes detecting an operation performed on a three-dimensional object (target), operability during operation on the object remains insufficient.
- According to one aspect, the control device includes a control unit that controls a display in the air and changes the positional relationship between the display and a detection device that detects the user's operation on the display.
- the positional relationship can be changed by the user.
- There are also provided a control method for controlling a display in the air and changing the positional relationship between the display and a detection reference for detecting the user's operation, and a program for causing a computer to execute processing that controls a display in the air and allows the user to change the positional relationship between the display and a detection reference for detecting a user operation.
- FIG. 1 illustrates the structure of the display device according to the first embodiment; (a) is an exploded perspective view and (b) is a sectional view. A block diagram illustrates the main-part configuration of the display device according to the first embodiment. A figure schematically shows the aerial image displayed in the first embodiment; (a) is a plan view, and (b) and (c) are sectional views showing the relationship among the operation detector, the aerial image, and the detection reference. A further figure schematically shows the aerial image for the calibration processing displayed in the first embodiment.
- FIG. A figure illustrating the calibration processing in the first embodiment; (a), (b), (c), and (d) show the relationship among the operation detector, the aerial image, the detection reference, and the finger position.
- FIG. A flowchart illustrating the calibration processing in the first calibration processing mode of the first embodiment. A figure showing the positional relationship between the aerial image and the detection reference in the second calibration processing mode of the first embodiment.
- FIG. A figure schematically showing the aerial image displayed in Modification 7 of the first embodiment; (a) is a plan view and (b) is a sectional view showing the relationship among the operation detector, the aerial image, and the detection reference. A further figure illustrates the calibration processing in Modification 7 of the first embodiment; (a), (b), and (c) show the relationship among the operation detector, the aerial image, the detection reference, and the finger position.
- FIG. A perspective view of the display device according to Modification 8 of the first embodiment. A block diagram illustrating the main-part configuration of the display device according to Modification 8 of the first embodiment.
- FIG. A block diagram illustrating the main-part configuration of the display device according to the fourth embodiment. A plan view of the operation detector provided in the display device according to the fourth embodiment. A figure illustrating the structure and function of the operation detector in the fourth embodiment. Figures showing schematic structures of the display device according to Modification 1 of the fourth embodiment. A figure illustrating the structure of the display device according to the fifth embodiment; (a) is an exploded perspective view and (b) is a sectional view.
- FIG. A figure illustrating the calibration processing in Modification 1 of the sixth embodiment; (a), (b), and (c) are sectional views showing the relationship among the aerial image, the detection reference, and the finger position.
- FIG. 20 schematically illustrates cases in the eighth embodiment where a predetermined non-contact operation is not detected within the detection reference: (a) shows a case where part of the operation is detected above or otherwise outside the detection reference while the remainder is detected within it, and (b) shows a case where part of the operation is detected below or otherwise outside the detection reference while the remainder is detected within it.
- FIG. 1A is an exploded perspective view of the display device 1
- FIG. 1B is an enlarged side view showing a part of the display device 1.
- As shown in the drawing, a coordinate system composed of the X axis, the Y axis, and the Z axis is set for the display device 1: the X axis is set along the short-side direction of the rectangular display surface of the display device 1, the Y axis along its long-side direction, and the Z axis in the direction perpendicular to the display surface.
- The coordinate system is not limited to an orthogonal coordinate system composed of the X, Y, and Z axes; a polar coordinate system or a cylindrical coordinate system may be employed instead.
- the display device 1 includes a main body 10 incorporating a control unit 20, a display 11, an imaging optical system 12, and an operation detector 13.
- the display 11, the imaging optical system 12, and the operation detector 13 are disposed in the main body 10.
- the display 11 is composed of, for example, a liquid crystal display, an organic EL display, or the like, and has a plurality of display pixel arrays arranged two-dimensionally.
- the display 11 is controlled by the control unit 20 and displays an image corresponding to the display image data.
- the imaging optical system 12 is disposed above the display 11 (Z direction + side) with a predetermined distance from the display 11.
- the imaging optical system 12 is configured by superposing two microlens arrays in which minute convex lenses 121 are arranged two-dimensionally in the Z direction.
- the imaging optical system 12 forms an aerial image 30 of the display image displayed on the display 11 in the space above the display device 1. That is, a user of the display device 1 (hereinafter referred to as a user) can observe the display image displayed on the display 11 as an aerial image 30 floating in the air above the display device 1.
- the aerial image 30 includes a plurality of icons 30A (operation buttons) corresponding to operation buttons for instructing various settings of the display device 1 and execution of various functions.
- The icons 30A are arranged in, for example, 3 rows × 5 columns. Note that a pinhole array or a slit array may be used instead of the microlens array as the imaging optical system.
- the operation detector 13 is provided above the imaging optical system 12 (Z direction + side), and is configured by, for example, a known transparent capacitance panel (hereinafter referred to as a capacitance panel).
- the operation detector 13 made of a capacitance panel forms an electric field with an electrode made of a substantially transparent member.
- The operation detector 13 detects the position of the user's finger or stylus as a capacitance value when the user moves the finger or stylus toward the aerial image 30 in order to operate its display position.
- The operation detector 13 compares the capacitance values detected at the four corners of the transparent capacitance panel and, based on those four values, detects the position of the user's finger on the X and Y axes.
- The operation detector 13 has a capacitance detection range extending a predetermined distance above its surface, and detects the interval between itself and a finger or stylus within that detection range, again based on a comparison of the capacitance values detected at the four corners of the panel.
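The four-corner position detection described above can be sketched as a weighted centroid: a finger nearer a corner raises that corner's capacitance, so each corner's reading weights its coordinate. This is a hypothetical model for illustration only; the corner layout, the linear weighting, and the function name are assumptions, not taken from the patent.

```python
def estimate_xy(c_tl, c_tr, c_bl, c_br, width=1.0, height=1.0):
    """Estimate a finger's X/Y position from four corner capacitances.

    Assumed model: each corner's capacitance grows as the finger
    approaches it, so the readings serve as centroid weights.
    """
    corners = [
        (0.0, 0.0, c_bl),       # bottom-left corner
        (width, 0.0, c_br),     # bottom-right corner
        (0.0, height, c_tl),    # top-left corner
        (width, height, c_tr),  # top-right corner
    ]
    total = sum(c for _, _, c in corners)
    if total == 0:
        return None  # no finger detected
    x = sum(cx * c for cx, _, c in corners) / total
    y = sum(cy * c for _, cy, c in corners) / total
    return x, y
```

With equal readings at all four corners the estimate falls at the panel centre, which matches the intuition that the finger is equidistant from every corner.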
- The aerial image 30 is formed by the imaging optical system 12 so as to be positioned within the predetermined detection range of the operation detector 13, preferably at the middle of that range in the vertical direction.
- Because the operation detector 13 detects the user's operation on the display position of the aerial image 30 with a finger or stylus, the user can perform operations on the aerial image 30 without directly touching the operation detector 13.
- In the following description, the case where the display position of the aerial image 30 is operated with a finger is described, but the same applies to operation with a stylus or the like.
- FIG. 2 is a block diagram showing the control unit 20 and the display 11 and the operation detector 13 controlled by the control unit 20 in the configuration of the display device 1.
- The control unit 20 includes a CPU, a ROM, a RAM, and the like, and is an arithmetic circuit that controls various components of the display device 1, including the display 11 and the operation detector 13, based on a control program, and executes various kinds of data processing.
- the control unit 20 includes an image generation unit 201, a display control unit 202, a calibration unit 203, a detection reference control unit 204, and a storage unit 205.
- the storage unit 205 includes a non-volatile memory that stores a control program, a storage medium that stores image data displayed on the display 11, and the like.
- The correspondence between the distance from the surface of the operation detector 13 to the fingertip and the capacitance detected by the operation detector 13 is stored in the storage unit 205 in advance. Therefore, when the fingertip is within the predetermined detection range of the operation detector 13, the operation detector 13 detects the capacitance at the fingertip, and the fingertip's position in the Z direction can be determined from the detected capacitance and the stored correspondence.
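The stored capacitance-to-distance correspondence can be sketched as a lookup table with linear interpolation between stored points. The table values below are invented for illustration; only the idea that capacitance falls monotonically as distance grows is taken from the description.

```python
# Illustrative (capacitance, distance_mm) pairs, stored in advance;
# capacitance decreases as the fingertip moves away from the panel.
CALIBRATION_TABLE = [
    (100.0, 0.0),
    (60.0, 20.0),
    (35.0, 40.0),
    (20.0, 60.0),
    (10.0, 80.0),
]

def z_from_capacitance(c):
    """Recover fingertip height (Z) by interpolating the stored table."""
    caps = [cap for cap, _ in CALIBRATION_TABLE]
    dists = [d for _, d in CALIBRATION_TABLE]
    if c >= caps[0]:
        return dists[0]   # finger at or touching the surface
    if c <= caps[-1]:
        return dists[-1]  # at or beyond the edge of the detection range
    # caps is descending: find the bracketing pair and interpolate.
    for (c_hi, d_lo), (c_lo, d_hi) in zip(CALIBRATION_TABLE,
                                          CALIBRATION_TABLE[1:]):
        if c_lo <= c <= c_hi:
            t = (c_hi - c) / (c_hi - c_lo)
            return d_lo + t * (d_hi - d_lo)
```

A real device would populate the table per panel (and possibly per user) during factory or calibration measurement rather than hard-coding it.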
- the image generation unit 201 generates display image data corresponding to the display image to be displayed on the display 11 based on the image data stored in the storage medium.
- The display control unit 202 causes the display 11 to display an image corresponding to the display image data generated by the image generation unit 201. When the user performs an operation on the display position of an icon 30A of the aerial image 30, the display control unit 202 switches the display image on the display 11 according to the type of the operated icon 30A. The display control unit 202 is not, however, limited to performing display-image switching control in response to such an operation.
- For example, when the display 11 displays a moving image and the user operates the display position of an icon 30A of the aerial image 30, the display control unit 202 may perform control to play the moving image displayed on the display 11, or control to stop it.
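The icon-dependent control just described amounts to dispatching on the type of the operated icon. The class, method, and icon-type names below are invented for illustration; the description only states that the response (image switching, play/stop of a moving image) depends on which icon was operated.

```python
class DisplayControl:
    """Hypothetical sketch of the display control unit's dispatch."""

    def __init__(self):
        self.current_screen = "home"
        self.video_playing = False

    def on_icon_operated(self, icon_type):
        if icon_type == "open_settings":
            # Switch the display image according to the icon type.
            self.current_screen = "settings"
        elif icon_type == "play_pause":
            # Toggle playback of the moving image shown on the display.
            self.video_playing = not self.video_playing
        return self.current_screen, self.video_playing
```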
- the calibration unit 203 executes calibration processing in first and second calibration processing modes, which will be described in detail later.
- The detection reference control unit 204 sets a detection surface, that is, a detection reference, in the space above the display device 1, specifically within the predetermined detection range of the operation detector 13 and at (or near) the position of the aerial image 30.
- The detection reference control unit 204 further determines, based on the capacitance value detected by the operation detector 13, that the user's finger has reached the detection reference. That is, when the finger position (its position on each of the X, Y, and Z axes) corresponding to the detected capacitance value matches the position of the detection reference, the detection reference control unit 204 determines that the user has operated the display position of the icon 30A.
- the detection reference control unit 204 sets the detection reference to a predetermined initial position, and changes or corrects the position of the detection reference based on the result of calibration processing described later.
- the initial position of this detection reference is stored in the storage unit 205 in advance.
- the initial position of the detection reference may be, for example, a position common to all users, or a different position may be set for each user based on that user's usage history of the display device 1.
- the initial position of the detection reference and the changed position of the detection reference may be set over the entire plane (X and Y axes) of the operation detector 13, or over only a part of that plane.
- the detection reference set at the previous use of the display device 1 may be stored in the storage unit 205 and read out for use.
- the detection reference control unit 204 is not limited to determining an operation only when the finger position corresponding to the capacitance value detected by the operation detector 13 exactly matches the detection reference position; it may also determine that the user has operated the display position of the icon 30A when the finger position substantially matches the detection reference position. The range regarded as a substantial match may be set in advance.
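- the match determination described above can be sketched as follows; this is an illustrative sketch only, and the function name, the millimeter heights, and the tolerance value are assumptions for illustration rather than values from this description.

```python
# Hypothetical sketch of the check performed by the detection reference
# control unit 204: the fingertip height derived from the capacitance value
# is compared with the detection reference, allowing a preset range within
# which the two positions are treated as substantially coincident.
# All names and numbers here are illustrative assumptions.

TOLERANCE_MM = 2.0  # preset range treated as a "substantial" match

def finger_reached_detection_reference(finger_z_mm, reference_z_mm,
                                       tolerance_mm=TOLERANCE_MM):
    """Return True when the finger position matches, or substantially
    matches, the detection reference position."""
    return abs(finger_z_mm - reference_z_mm) <= tolerance_mm
```

With a 2 mm tolerance, a fingertip detected at 101 mm matches a detection reference at 100 mm, while one at 105 mm does not.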
- FIG. 3A shows an example of an aerial image 30 displayed by the display device 1, and FIG. 3B schematically shows a positional relationship between the main body 10 or the operation detector 13, the aerial image 30, and the detection reference 40.
- the aerial image 30 includes 15 icons 30A arranged in 3 rows × 5 columns as described above.
- the detection reference 40 is set by the detection reference control unit 204 in the vicinity of the position of the aerial image 30, specifically, a position slightly above the aerial image 30 in the illustrated example.
- the icon in the aerial image 30 is indicated by a thick dotted line 30A.
- the icon 30A is a part of the aerial image 30 and is therefore located at the same height as the aerial image 30; in FIG. 3B, however, the thick dotted line representing the icon 30A is drawn shifted from the solid line representing the aerial image 30 in order to distinguish the two.
- the aerial image 30 is formed at a position above the operation detector 13 of the display device 1 by a distance H1, and the detection reference 40 is set at a position above the operation detector 13 by a distance H2 (H1 < H2).
- the operation detector 13 has a capacitance detection range 13A upward from the surface thereof.
- the capacitance detection limit above the operation detector 13 is indicated by a dotted line 13a, and the interval between the capacitance detection limit 13a and the operation detector 13 is the capacitance detection range 13A.
- the aerial image 30 and the detection reference 40 are set so as to be positioned within the capacitance detection range 13A.
- the detection reference 40 is set above the aerial image 30 in FIG. 3B.
- however, the detection reference 40 may instead be set below the aerial image 30, or made to coincide with the position of the aerial image 30.
- a range of the detection range 13A other than the region set as the detection reference 40 is set as a non-detection reference 41.
- the aerial image 30 and the detection reference 40 are represented as planes parallel to the XY plane; however, they are not limited to planes, and both may be composed of curved surfaces, for example.
- the detection reference 40 may have a step for each icon 30A instead of a plane. In other words, the interval between an icon 30A and the detection criterion 40 for that icon may be different from the interval between another icon 30A and the detection criterion 40 for that other icon. In this manner, providing a step in the detection reference 40 is particularly effective when the aerial image 30 is a three-dimensional image and the positions of the plurality of icons 30A are shifted from each other in the Z direction, that is, the vertical direction.
- by shifting the position of the detection reference 40 corresponding to each icon 30A according to the vertical displacement of the plurality of icons 30A of the stereoscopic aerial image 30, the distance between each icon 30A and its corresponding detection reference 40 can be made constant.
- the operation detector 13 outputs a detection output corresponding to the distance H2 when the user's fingertip reaches the distance H2 from the operation detector 13. Based on the detection output from the operation detector 13, the detection reference control unit 204 determines that the user's fingertip matches the detection reference 40, and determines that the fingertip has operated the display position of the icon 30A. In this way, the display device 1 detects that the user has operated the display position of the icon 30A of the aerial image 30, and executes a function corresponding to the operated icon 30A. For example, the display image on the display 11 is switched.
- the icon 30A is located at the distance H1 from the operation detector 13, but because it is displayed as the aerial image 30, its display position, that is, the height H1, may be perceived visually differently by one user and another. Even the same user may perceive the display position of the icon 30A differently depending on the environment in which the display device 1 is operated. Suppose, for example, that the detection reference 40 is set to coincide with the position of the aerial image 30, and a user moves a finger toward the icon 30A of the aerial image 30 in order to operate its display position.
- if that user still feels that the finger is short of the icon 30A although it has actually reached the icon 30A, that is, the detection reference 40, the icon operation is executed contrary to the user's intention. Conversely, when another user moves a finger toward the icon 30A for an icon operation and feels that the finger has reached the icon 30A and operated its display position while the finger is actually still short of the icon 30A, that is, the detection reference 40, the icon operation is not executed, again contrary to the user's intention. In either case, the user feels a sense of incongruity in the icon operation.
- the display device 1 is therefore provided with a calibration processing mode for reducing the above-described sense of incongruity in icon operation, in addition to the aerial image operation mode described above in which operations are performed on the aerial image 30.
- in the calibration processing mode, the relative positional relationship between the aerial image 30 and the detection reference 40 is set to a relationship suited to the user's operational feel and operation characteristics, the usage environment of the display device, and the like.
- the display device 1 of the present embodiment has the first and second calibration processing modes.
- the first calibration processing mode is different from the aerial image operation mode, that is, the calibration processing is executed when the aerial image operation mode is not executed.
- in the second calibration processing mode, the calibration processing is executed during the execution of the aerial image operation mode.
- these first and second calibration processing modes are executed by the calibration unit 203 shown in FIG. 2. Which of the first and second calibration processing modes is executed is selected by operating a calibration processing mode selection operation button (not shown) provided on the display device 1; when no calibration processing mode is selected, the control unit 20 may select and execute the aerial image operation mode. The operation button for selecting the calibration processing mode may also be omitted, and the second calibration processing mode always performed.
- the first and second calibration processing modes will be described in order below. Note that the first and second calibration processing modes may also be selected from an aerial image icon instead of the operation button.
- first, the first calibration processing mode will be described. The calibration unit 203 in FIG. 2 starts the first calibration processing mode. The image generation unit 201 generates display image data, and the display 11 displays a display image used for the calibration processing based on that display image data.
- FIGS. 4 and 5 show an aerial image 300 of the display image for the calibration processing.
- the aerial image 300 includes a calibration icon 300A, and the calibration icon 300A is superimposed with the message "Perform calibration. Touch this icon."
- the detection reference control unit 204 initially sets the detection reference 40 at a position near the aerial image 300, for example, a position slightly above the aerial image 300.
- the initial position of the detection reference 40 may coincide with the aerial image 300 or may be set at a position slightly lower than the aerial image 300.
- the above message “Calibrate. Touch this icon” does not necessarily have to be displayed during the calibration process. For example, when the user who has selected the calibration processing mode recognizes in advance what to operate in the calibration processing mode, the above message display is unnecessary.
- the operation detector 13 detects the approaching movement of the user's fingertip F toward the icon 300A, that is, the downward movement, as a change in capacitance.
- when the fingertip F moves downward and reaches the dotted-line position 50 slightly above the detection reference 40, the user feels that the fingertip has reached the display position of the icon 300A and has pressed the icon 300A, and then moves the fingertip F upward by a predetermined distance.
- the operation detector 13 detects the downward movement of the fingertip F, that is, the press, and the subsequent upward movement by the predetermined distance, as a change in capacitance.
- the detection reference control unit 204 determines that an operation has been performed on the display position of the icon 300A.
- the lowest position reached by the user's fingertip F when pressing down to operate the display position of the icon 300A, that is, the dotted-line position 50, is referred to as the arrival position.
- the detection reference control unit 204 moves the detection reference 40 to the position of the arrival position 50 as shown in FIG.
- the position data of the changed detection reference 40 is stored in the storage unit 205 of FIG.
- alternatively, the detection reference control unit 204 may move, that is, change, the detection reference 40 to a position above the arrival position 50 by a predetermined distance d1, and store the position data of the changed detection reference 40 in the storage unit 205 of FIG. 2.
- the predetermined distance d1 is set to about 1 mm to 10 mm, for example.
- the predetermined distance d1 may be changed for each user who uses the apparatus, and may be set based on the length from the tip of the user's finger to the first joint.
- the predetermined distance d1 may be set to a value within about 1/4 to 1/2 of the length from the tip of the user's finger to the first joint.
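- the per-user setting of d1 described above can be sketched as a short computation; the function name, the default fraction, and the way the 1 mm to 10 mm range interacts with the finger-length fraction are illustrative assumptions.

```python
# Illustrative sketch of choosing the predetermined distance d1 for a user
# from the length between the fingertip and the first joint, as described
# above. Clamping the result to the 1 mm - 10 mm range combines the two
# statements above and is an assumption about how they interact.

def choose_d1(finger_to_first_joint_mm, fraction=0.25):
    """fraction is assumed to lie within the 1/4 - 1/2 range given above."""
    d1 = finger_to_first_joint_mm * fraction
    return min(max(d1, 1.0), 10.0)  # keep within about 1 mm to 10 mm
```

For a 24 mm fingertip-to-first-joint length and a 1/4 fraction, this yields d1 = 6 mm.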
- the case where the arrival position 50 of the finger is located above the detection reference 40 has been described above; when the arrival position 50 is located below the detection reference 40, the arrival position 50 is determined in the same manner, and the detection reference 40 is likewise changed based on the arrival position 50.
- even when the arrival position 50 matches the detection reference 40, the arrival position 50 is determined in the same manner as described above; however, since the arrival position 50 matches the detection reference 40, the detection reference 40 is not changed.
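- the calibration update described above can be summarized in a short sketch: the detection reference is moved to the position a predetermined distance d1 above the arrival position, and is left unchanged when the arrival position already matches it. The function name and the numeric values are illustrative assumptions, not part of this description.

```python
# Hedged sketch of the detection reference change in the first calibration
# processing mode. Heights are measured upward from the operation detector
# 13 (larger z = higher); the d1 default of 5 mm is one value within the
# 1 mm - 10 mm range given above, chosen here only for illustration.

def update_detection_reference(reference_z_mm, arrival_z_mm, d1_mm=5.0):
    """Return the new detection reference height based on the arrival
    position, as in the first calibration processing mode."""
    if arrival_z_mm == reference_z_mm:
        return reference_z_mm          # arrival matches: no change
    return arrival_z_mm + d1_mm        # move to d1 above the arrival position
```

For an arrival position of 100 mm and d1 = 5 mm, the detection reference is changed to 105 mm regardless of where it was before.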
- in the first calibration processing mode, the detection reference control unit 204 determines, based on the detection output of the operation detector 13, that the finger has reached the detection reference 40, but the display image on the display 11 is not switched.
- instead, a highlight display such as blinking the icon 300A may be performed to notify the user that the finger has reached the detection reference.
- an operation of pushing down the icon 300A by the user is given as an example, but the present invention is not limited to this.
- the detection reference 40 may be changed based on a place where the predetermined non-contact operation is performed.
- the predetermined non-contact operation is, for example, a gesture operation that touches the icon 300A.
- the detection reference 40 may be changed based on the position where the operation for touching the icon 300A is performed.
- examples of the touching operation include a gesture of sweeping over the icon 300A with the user's hand.
- the position on which the change of the detection reference 40 is based may be the position where the sweeping gesture with the user's hand ends, the position where the user's hand is determined to have stopped, or the position where the sweeping gesture with the user's hand is started.
- in this way, in the first calibration processing mode, the position at which the user feels that the finger has operated the display position of the icon 300A of the aerial image 300 is detected from the finger's pressing movement and subsequent upward movement by a predetermined distance.
- the detection reference 40 is then changed for that user to a position above the finger's arrival position 50 by the predetermined distance d1, thereby changing the positional relationship between the detection reference 40 and the display position of the aerial image 300. That is, the positional relationship between the detection reference 40 and the display position of the aerial image 300 is changed based on the user's operation, which is one piece of information about the user who uses the apparatus. When changing the detection reference 40 for a user, it is not always necessary to identify the user; the positional relationship between the detection reference 40 and the display position of the aerial image 300 may simply be changed based on the detection of the above-described operation by the operation detector 13.
- when the detection reference control unit 204 changes the position of the detection reference 40, it may move the entire detection reference 40, or it may move only the part of the detection reference 40 corresponding to the icon 300A operated by the user's finger.
- the reason the detection reference 40 is changed to a position above the arrival position 50 by the predetermined distance d1 is as follows: when operating a touch panel, the user generally brings the finger into contact with the panel and then pushes down somewhat, so when operating the display position of the icon 300A of the aerial image 300, the fingertip likewise tends not to move upward immediately but to push down somewhat before moving upward by the predetermined distance.
- because the finger's arrival position 50 thus tends to be slightly lower than the position at which the user feels the display position of the icon 300A was operated, the detection reference 40 is changed to the position the predetermined distance d1 above the arrival position 50.
- however, the amount of pushing down may be small or absent depending on the user, so the position of the detection reference 40 may instead be changed to substantially coincide with the arrival position 50.
- the relative positional relationship between the aerial image 300 for calibration and the detection reference 40 is set to a relationship suitable for the operation characteristics of the user.
- the detection reference 40 may be substantially coincident with the arrival position 50 as described above instead of above the arrival position 50, or may be changed to a position below the arrival position 50.
- for example, suppose a certain user's arrival position 50 is located above the aerial image 300 and close to the upper limit 13a of the capacitance detection range 13A. If the detection reference 40 were changed to a position above the arrival position 50, it would be too close to the upper limit 13a of the capacitance detection range 13A; in such a case, the detection reference 40 can be changed to match the arrival position 50, or to a position below the arrival position 50.
- the method of determining the arrival position 50 is not limited to the method based on the change from the push-down movement to the upward movement by a predetermined distance described above; it can also be determined by various other methods, as described below.
- for example, the user may stop the downward movement of the finger, that is, the pressing, upon feeling that the finger has reached the display position of the icon 300A and has pressed the icon 300A.
- in that case, the detection reference control unit 204 determines that the pressing of the finger has stopped when the capacitance value detected by the operation detector 13 hardly changes, and determines the stop position of the pressing to be the arrival position 50.
- the stop of the downward movement is determined by the fact that the capacitance value detected by the operation detector 13 does not change for a short time, for example, about 0.1 to 1 second.
- alternatively, the velocity vector of the user's finger movement, that is, the finger's moving speed and moving direction, may be detected from the change in capacitance, and based on the change of the velocity vector from the downward direction to the reverse direction and on the reverse-direction velocity vector reaching a predetermined magnitude, the position of the finger when a velocity vector of the predetermined magnitude in the reverse direction is detected may be determined as the arrival position.
- if the predetermined magnitude is set to substantially zero, the finger position at the moment the direction of the velocity vector changes from downward to the reverse direction, that is, the lowest position, is determined as the arrival position; if the predetermined magnitude is set to a value other than zero, a position a predetermined distance above the lowest position of the finger is determined as the arrival position. Thus, the arrival position is the lowest position of the finger, or a position in its vicinity, when the detection reference control unit 204 determines that the finger has operated the icon display position.
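- the velocity-vector method just described can be sketched as follows; the sampled (position, velocity) pairs and the threshold value are assumptions for illustration (negative velocity denotes downward movement).

```python
# Illustrative sketch of arrival position determination from the finger's
# velocity vector: the arrival position is the finger position at which the
# velocity changes from downward (negative) to the reverse direction with at
# least a predetermined magnitude. With the threshold at substantially zero,
# this yields the lowest position of the finger, as described above.

def find_arrival_position(samples, min_upward_speed=0.0):
    """samples: list of (z_mm, velocity_mm_per_s) in time order.
    Returns the z at which the reversal is detected, or None."""
    for i in range(1, len(samples)):
        _, v_prev = samples[i - 1]
        z, v = samples[i]
        if v_prev <= 0 and v > min_upward_speed:
            return z
    return None
```

For a finger descending through 120, 110, 105 mm and then rebounding to 106 mm, the reversal is detected at 106 mm; with no reversal in the samples, no arrival position is reported.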
- in the above description, the arrival position is determined with reference to the portion of the finger or stylus that contacts the icon 300A of the aerial image 300, that is, the position of the fingertip or the lowest position of the stylus; instead, the arrival position may be determined based on the position of the nail of the user's finger or the position of the first joint of the finger. Furthermore, not only the user's finger but also the user's foot, elbow, or the like may be used, with the arrival position determined based on that foot, elbow, or the like.
- when a stylus is used, a mark may be attached to a predetermined position of the stylus and the arrival position determined based on the position of the mark. When the arrival position is determined using the first joint of the finger, the stylus mark, or the like, it is desirable to use, instead of the capacitive panel as the operation detector 13, an imaging device or the like as described in Modification 8 below.
- in step S1, when it is recognized that the user has selected the first calibration processing mode with the operation button for selecting the calibration processing mode, the process proceeds to step S2.
- step S2 the calibration unit 203 shown in FIG. 2 starts the first calibration processing mode, and proceeds to step S3.
- in step S3, the image generation unit 201 generates display image data for calibration, the display control unit 202 displays a calibration image based on the display image data on the display 11, and the detection reference control unit 204 sets the detection reference to the initial position.
- the display image on the display 11 becomes the aerial image 300 for calibration shown in FIG. 4, generated by the imaging optical system 12; the aerial image 300 includes the icon 300A and the message "Perform calibration. Touch this icon."
- in step S4, the operation detector 13 detects the downward movement of the user's fingertip F, and the process proceeds to step S5.
- in step S5, the detection reference control unit 204 shown in FIG. 2 determines whether the finger has reached the arrival position based on the detection output of the operation detector 13. If an affirmative determination is made in step S5, that is, if it is determined that the finger has reached the arrival position, the process proceeds to step S6; if a negative determination is made, that is, if it is determined that the finger has not reached the arrival position, the process waits until an affirmative determination is made.
- in step S6, the detection reference control unit 204 changes the position of the detection reference 40 based on the arrival position 50, stores the position data of the changed detection reference 40 in the storage unit 205 shown in FIG. 2, and proceeds to step S7. In step S7, the first calibration processing mode is terminated, and the process proceeds to step S8.
- step S8 the aerial image operation mode is started, and the process proceeds to step S9.
- in step S9, the aerial image 30 including the icons 30A for the aerial image operation mode shown in FIG. 3 is displayed, the position data of the detection reference 40 changed in step S6 of the first calibration processing mode is read from the storage unit 205, and the detection reference 40 is set, based on that position data, to a position near the aerial image 30 as shown in FIG. 3. In this way, the aerial image operation mode uses a detection reference 40 suited to the user's operation characteristics, as set in the first calibration processing mode.
- in step S11, the detection reference control unit 204 determines whether the finger has reached the detection reference 40 based on the detection output of the operation detector 13. If an affirmative determination is made in step S11, that is, if it is determined that the finger has reached the detection reference 40, the process proceeds to step S12; if a negative determination is made, that is, if it is determined that the finger has not reached the detection reference 40, the process waits until an affirmative determination is made.
- in step S12, the display control unit 202 switches the display image of the display 11 to a display image corresponding to the operated icon 30A, and proceeds to step S13.
- step S13 it is determined whether or not a stop operation of the display device 1 has been performed. When an affirmative determination is made in step S13, that is, when a stop operation of the display device 1 is performed, the display device 1 stops. If a negative determination is made in step S13, the process returns to step S10.
- in the first calibration processing mode described above, the detection reference is changed based on an operation by the user, and the positional relationship between the aerial image and the detection reference is changed accordingly. Since the detection reference of the aerial image operation mode is set at the position of the detection reference changed in the first calibration processing mode, the aerial image operation mode can be executed with a detection reference suited to the user's operation characteristics and the usage environment of the display device 1.
- in the above description, the first calibration processing mode is executed prior to the aerial image operation mode after the display device 1 is activated; however, the first calibration processing mode can also be executed after the aerial image operation mode.
- for example, when the user feels a sense of incongruity in the operation at the display position of the icon 300A, the user operates the operation button for selecting the calibration processing mode of the display device 1 to select the first calibration processing mode.
- in this case, the first calibration processing mode is executed by interrupting the aerial image operation mode in progress, and the aerial image operation mode is resumed after the first calibration processing ends.
- instead of the display device 1 selecting the first calibration processing mode in response to the user's operation of the operation button as described above, the display device 1 may detect that the user feels a sense of incongruity with the operation at the display position of the icon 300A and then execute the first calibration processing mode.
- the display device 1 can detect, for example, the user's pulse rate (biological information) and detect that the user feels uncomfortable when the detected pulse rate exceeds a predetermined value.
- FIG. 7 is a diagram showing the aerial image 30 for the aerial image operation mode, the initial position detection reference 40, and the reaching position 50 of the finger
- FIG. 8 is a flowchart showing the operation in the second calibration processing mode. Each process shown in the flowchart of FIG. 8 is performed by the control unit 20 executing a program after the display device is activated.
- step S41 it is recognized that the second calibration processing mode is selected, and the process proceeds to step S42.
- step S42 both the aerial image operation mode and the second calibration processing mode are started, and the process proceeds to step S43.
- in step S43, the aerial image 30 including the icons 30A shown in FIG. 3 is displayed, and the detection reference control unit 204 shown in FIG. 2 sets the detection reference 40 to a predetermined initial position, for example, the position of the aerial image 30 or a position slightly above the aerial image 30; the process then proceeds to step S44.
- the message “Calibration will be performed during icon operation” is displayed on the aerial image 30 for only a short time. This message may not be displayed.
- step S44 when the user moves his / her finger downward to operate the display position of the icon 30A, the operation detector 13 starts detecting the finger movement, and the process proceeds to step S45.
- step S45 the detection reference control unit 204 determines whether the finger has passed the detection reference 40 during the downward movement based on the detection output of the operation detector 13. If an affirmative determination is made in step S45, that is, if the finger passes the detection reference 40 during the downward movement and further moves downward, the process proceeds to step S46.
- a finger F1 in FIG. 7 indicates a finger that has passed the detection reference 40 during the downward movement.
- in step S46, the detection reference control unit 204 determines that the finger F1 has reached, that is, passed, the detection reference 40, switches the icon display, that is, switches the aerial image 30 according to the operated icon 30A, and proceeds to step S47.
- in step S47, the detection reference control unit 204 determines whether the finger F1 has reached the arrival position 50. If the determination is affirmative, the process proceeds to step S48; if negative, the determination is repeated until it becomes affirmative.
- in step S48, the detection reference control unit 204 changes the position of the detection reference 40 based on the arrival position 50.
- the changed detection reference 40 may coincide with the user's fingertip or may be positioned above the user's fingertip. In either case, since the icon display has already been switched once in step S46, the icon display is not switched again.
- in step S49, the detection reference control unit 204 determines whether the finger has reached the arrival position 50 based on the detection output of the operation detector 13. If the determination is affirmative, the process proceeds to step S50; if negative, the process waits until an affirmative determination is made.
- the finger F2 in FIG. 7 indicates a case where the arrival position 50 matches the detection reference 40.
- in step S50, the detection reference control unit 204 determines whether the arrival position 50 matches the detection reference 40 based on the detection output of the operation detector 13; the process proceeds to step S51 in the case of an affirmative determination, and to step S52 in the case of a negative determination. In step S51, since the arrival position 50 matches the detection reference 40, the icon display is switched, but the detection reference 40 is not changed.
- in step S52, since the arrival position 50 is positioned above the detection reference 40 as indicated by the finger F3 in FIG. 7, the detection reference control unit 204 changes the position of the detection reference 40 based on the arrival position 50; specifically, the position of the detection reference 40 is changed to the vicinity of the arrival position 50, and the process proceeds to step S55. At this time, if the changed detection reference 40 matches the user's fingertip or has moved above the position of the user's fingertip, the icon display is switched; if the changed detection reference 40 does not reach the position of the user's fingertip, the icon display is not switched.
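- the three cases handled in steps S45 to S52 (finger F1 passing the detection reference, finger F2 stopping exactly at it, and finger F3 stopping above it) can be condensed into one sketch. The function, the heights, and the d1 value are illustrative assumptions; moving the reference to d1 above the arrival position in the F1 case mirrors the first calibration processing mode and is an assumption here.

```python
# Hedged sketch of one pass of the second calibration processing mode.
# Heights are measured upward from the operation detector 13; larger z is
# higher. Returns whether the icon display is switched on this pass and
# the new detection reference height.

def second_calibration_step(arrival_z, reference_z, d1=5.0):
    if arrival_z < reference_z:
        # Finger F1 (S45-S48): passed the reference while descending; the
        # icon display switches, and the reference is changed based on the
        # arrival position (assumed here: d1 above it).
        return True, arrival_z + d1
    if arrival_z == reference_z:
        # Finger F2 (S50-S51): arrival matches the reference; the icon
        # display switches but the reference is not changed.
        return True, reference_z
    # Finger F3 (S52): finger stopped above the reference; the reference is
    # changed to the vicinity of the arrival position, and the icon display
    # switches once the changed reference reaches the fingertip.
    return True, arrival_z
```

Each call handles one detected press; in all three cases the icon display is switched at most once per press, matching the note in step S48 above.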
- in step S55, it is determined whether an operation to end the second calibration processing mode has been performed. If the determination is affirmative, the second calibration processing mode ends; if negative, the process returns to step S44.
- since the second calibration processing mode is performed in parallel with the execution of the aerial image operation mode, the user can operate the display position of the aerial image with a detection reference suited to that user without being aware that the calibration processing is being executed.
- the first and second calibration processing modes do not necessarily have to be selected by the user.
- the display device 1 may automatically select one of the first and second calibration processing modes. Also, instead of providing both the first and second calibration processing modes, only one of them may be provided.
- the display device 1 according to the first embodiment described above can be modified as follows.
- the display device 1 of Modification 1 calculates the speed or acceleration of the user's fingertip based on the detection output of the operation detector 13, predicts the finger's arrival position based on the calculated speed or acceleration, and may change the detection reference based on the predicted arrival position.
- FIG. 9 is a block diagram showing the control unit 20, the display 11 and the operation detector 13 controlled by the control unit 20 in the configuration of the display device 1 of the first modification.
- the display device 1 of the first modification will be described with respect to a configuration that is different from the display device of the first embodiment.
- the speed / acceleration detection unit 206 reads out the capacitance value detected by the operation detector 13 every predetermined time, and calculates the finger moving speed from the change in the capacitance value per predetermined time. At the same time, the movement acceleration of the finger is calculated from the calculated speed.
- the arrival position prediction unit 207 predicts the arrival position of the finger based on the movement speed or acceleration of the finger output from the speed / acceleration detection unit 206.
- the arrival position predicting unit 207 can detect, for example, that the movement of the finger has shifted from an accelerating or substantially constant-speed state to a decelerating state, and can predict the arrival position of the finger from the degree of deceleration.
- the detection reference control unit 204 changes the detection reference based on the arrival position predicted by the arrival position prediction unit 207.
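The prediction described above can be sketched in code. The following is a minimal illustration, not the patented implementation: the fixed sampling interval, the constant-deceleration kinematic model, and all function and variable names are assumptions introduced here.

```python
def predict_arrival(heights, dt):
    """Predict where a decelerating fingertip will stop.

    heights: recent fingertip heights above the operation detector,
             sampled every dt seconds (a stand-in for the heights the
             speed/acceleration detection unit derives from capacitance).
    Returns the predicted arrival height, or None when the finger is
    not in a decelerating downward movement.
    """
    if len(heights) < 3:
        return None
    v_prev = (heights[-2] - heights[-3]) / dt   # earlier velocity (negative = downward)
    v = (heights[-1] - heights[-2]) / dt        # current velocity
    a = (v - v_prev) / dt                       # acceleration
    if v < 0 and a > 0:                         # moving down and decelerating
        # constant-deceleration kinematics: the finger travels
        # v^2 / (2a) further before coming to rest
        return heights[-1] - v * v / (2 * a)
    return None

# A finger descending at 10 mm/frame that slows to 8 mm/frame is
# predicted to stop 16 mm below its current height of 82 mm.
print(predict_arrival([100.0, 90.0, 82.0], dt=1.0))  # → 66.0
```

In line with the threshold condition discussed below, a caller could invoke this only when the measured speed and acceleration exceed predetermined thresholds, so that unintentional small movements do not trigger a prediction.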
- In step S105, the speed / acceleration detection unit 206 calculates the moving speed and acceleration of the fingertip F based on the detection output of the operation detector 13.
- In step S106, the arrival position prediction unit 207 calculates the arrival position of the fingertip F based on the moving speed and acceleration calculated by the speed / acceleration detection unit 206.
- the arrival position of the finger calculated by the arrival position prediction unit 207, that is, the predicted arrival position, is indicated by a dotted line 60.
- the detection reference control unit 204 changes the detection reference 40 based on the predicted arrival position 60 as shown in FIG. 10C and stores the position data of the detection reference 40 in the storage unit 205.
- the detection reference of the aerial image operation mode is set at the position of the stored detection reference 40.
- both the moving speed and acceleration of the finger may be used, or one of them may be used.
- In the above description, the speed / acceleration detection unit 206 reads the capacitance value detected by the operation detector 13 every predetermined time, calculates the moving speed of the finger from the change in the capacitance value per predetermined time, and calculates the movement acceleration of the finger from the calculated speed. However, the method is not limited to this, and an imaging device may be used as the speed / acceleration detection unit 206.
- In the above description, the movement speed or acceleration of the user's finger is calculated; however, the movement speed or acceleration of the user's foot or elbow, or of a stylus pen held by the user, may be used instead.
- In the above description, the predicted arrival position 60 of the user's finger is calculated based on the calculated movement speed and acceleration of the user's finger, and the detection reference 40 is changed based on the predicted arrival position 60. However, it is not necessary to determine the predicted arrival position 60 for every movement of the user's finger. If the user moves unintentionally before performing an operation, calculating the predicted arrival position 60 based on that movement would, for example, cause the detection reference 40 to be set at an extremely high position, making it impossible to set an appropriate detection reference.
- Therefore, the predicted arrival position 60 is calculated only when a movement speed and acceleration of the user's finger exceeding a predetermined threshold are detected, and the position of the detection reference 40 is changed based on the predicted arrival position 60.
- the calibration process can be performed quickly.
- the calibration process according to the present modification is applied to the first calibration process mode according to the first embodiment, but it can also be applied to the second calibration process mode.
- In Modification 1, the arrival position of the fingertip F is predicted in advance, before the fingertip F of the user operating the aerial image reaches the detection reference 40, and the detection reference can be changed based on the predicted arrival position. Therefore, it is possible to prevent a malfunction such as the user's fingertip F failing to reach the detection reference 40 so that the icon display is not switched, and the user can operate comfortably.
- the display device 1 according to the first embodiment and Modification 1 detects or predicts the arrival position in a single calibration process, changes the detection reference based on the arrival position, stores the position data of the detection reference in the storage unit 205, and, in the aerial image operation mode, sets or changes the detection reference to the position of the detection reference stored in the storage unit 205.
- the display device according to Modification 2 stores, in the storage unit 205, a plurality of detection reference positions respectively set in a plurality of calibration processes, and changes the detection reference in the aerial image operation mode based on the stored plurality of detection reference positions.
- In the first calibration process, the detection reference control unit 204 determines the finger arrival position 50 based on the detection output of the operation detector 13, changes the detection reference 40 based on the arrival position 50, and stores the position data of the detection reference 40 in the storage unit 205. Subsequently, the second calibration process is performed, and the similarly changed detection reference position data is stored in the storage unit 205. A third calibration process may follow. One detection reference is then calculated from the plurality of detection reference position data stored in the storage unit 205 by the successively performed calibration processes, and the position data of the calculated detection reference is stored in the storage unit 205. In the aerial image operation mode executed thereafter, the detection reference is set to the position of the calculated detection reference stored in the storage unit 205.
- one detection reference may be calculated as the arithmetic mean of the plurality of detection references 40, or as their geometric mean.
- a new detection criterion may be determined by appropriately weighting each of the plurality of detection criteria 40.
- For example, for the detection reference H_N calculated from the N-th operation and the detection reference position H_(N+1) obtained from the (N+1)-th operation, a new detection reference may be calculated by weighting them in a ratio of 3:7. Specifically, (H_N × 3 + H_(N+1) × 7) / 10 is calculated using H_N and H_(N+1), and the detection reference is set based on the result.
- the weighting is not limited to 3: 7, and the number of operations is not limited to two.
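The 3:7 weighting above can be written out as a small helper; the function name and the generalization to arbitrary weights are assumptions of this sketch, not part of the patented method.

```python
def combine_detection_references(h_n, h_n1, weights=(3, 7)):
    """Weighted combination of two detection reference positions.

    h_n:  detection reference from the N-th operation
    h_n1: detection reference from the (N+1)-th operation
    With the default weights this computes (h_n * 3 + h_n1 * 7) / 10,
    favouring the more recent operation.
    """
    w_old, w_new = weights
    return (h_n * w_old + h_n1 * w_new) / (w_old + w_new)

# (100 * 3 + 90 * 7) / 10
print(combine_detection_references(100.0, 90.0))  # → 93.0
```

With `weights=(1, 1)` this reduces to the arithmetic mean mentioned earlier, so the same helper covers both variants.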
- Alternatively, the finger arrival positions detected in each of the plurality of calibration processes may be stored in the storage unit 205, and one detection reference may be calculated based on the plurality of stored arrival positions.
- In the calibration process, the position of the detection reference does not necessarily have to be changed every time. Furthermore, instead of changing the detection reference for each calibration process, the number of times the arrival position was determined and the number of times the arrival position failed to actually reach the detection reference, that is, the number of times the operation to the icon display position failed, may be counted over multiple calibration processes, and the detection reference may be changed when it is determined that the operation has failed a predetermined number of times or more.
- Since the arrival position is determined by detecting a natural operation that the user normally performs when operating the display position of an aerial image icon, that is, an operation of pressing the icon with a finger and then moving the finger upward, or an operation of pressing the icon down and stopping briefly, the user does not notice that the arrival position is being detected and determined in the calibration process, and can therefore execute the calibration process without being aware of it.
- Modification 3 of the first embodiment: In the first embodiment, an operation on the display position of the aerial image by the user's finger is detected to determine the arrival position, and the detection reference is changed based on the arrival position.
- Instead, the user may designate the position of the finger at which the user felt that the display position of the aerial image icon was operated; the detection reference control unit determines this designated position, changes the detection reference based on it, and thereby changes the positional relationship between the detection reference and the aerial image.
- a modification example in which the position where the user operates the display position of the aerial image is designated as the designated position will be described. The following description will be given of the case where it is applied to the first calibration processing mode in the first embodiment, but it can also be applied to the second calibration processing mode and the first and second modifications.
- FIG. 12 shows an aerial image 300 of the display image for calibration.
- the aerial image 300 includes a calibration icon 300B.
- On this calibration icon 300B, the message “Calibration will be performed. Point to the position of this icon with your finger, and move your finger sideways while keeping that state.” is superimposed.
- the detection reference control unit 204 sets the detection reference 40 to an initial position in the vicinity of the aerial image 300 as shown in FIG.
- As shown in FIG. 13A, in order to operate the display position of the icon 300B in accordance with the message superimposed on the icon 300B, the user moves the fingertip F downward toward the icon 300B, and the fingertip F enters the capacitance detection range 13A of the operation detector 13.
- the operation detector 13 detects the approaching movement of the user's finger F to the icon 300B, that is, the downward movement as a change in capacitance.
- FIG. 13A shows the finger as seen from the front.
- the operation detector 13 detects the downward movement and lateral movement of the finger F.
- When the operation detector 13, which has been detecting the downward movement of the finger, detects the lateral movement of the finger, the detection reference control unit 204 determines the height position of the finger at the moment the movement changed from downward to lateral as the designated position 50A.
- the detection reference control unit 204 changes the position of the detection reference 40 based on the designated position 50A, and stores the changed position data of the detection reference 40 in the storage unit 205.
- Although the designated position 50A is illustrated as a position above the aerial image 300 in FIG. 13B, since the designated position 50A is the position at which the user felt that the finger reached the icon 300B of the aerial image 300, it may instead be designated at a position that coincides with the aerial image 300, or at a position below the aerial image 300.
- In the above description, the detection reference control unit 204 determines the finger position, that is, the height of the finger, at the moment the finger changes from the downward movement to the lateral movement as the designated position 50A. However, it is not limited to this; for example, the height of the finger when the finger changes from the downward movement to the lateral movement and the lateral movement ends may be determined as the designated position 50A. Alternatively, the detection reference control unit 204 may use the average value or median value of the finger height from the start of the lateral movement to its end as the designated position 50A.
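The decision rule of Modification 3, taking the finger height at the moment a downward movement turns into a lateral movement, can be sketched as follows. The sampled (x, height) representation and the two thresholds are assumptions introduced for illustration only.

```python
def find_designated_position(samples, dz_eps=1.0, dx_min=2.0):
    """Return the finger height at the moment a downward movement
    changes into a lateral movement, or None if no such change occurs.

    samples: list of (x, height) fingertip positions, one per frame.
    dz_eps:  max vertical change still treated as "not descending" (mm)
    dx_min:  min horizontal change treated as lateral movement (mm)
    """
    descending = False
    for (x0, z0), (x1, z1) in zip(samples, samples[1:]):
        if z1 - z0 < -dz_eps:
            descending = True          # finger is moving down
        elif descending and abs(z1 - z0) <= dz_eps and abs(x1 - x0) >= dx_min:
            return z0                  # height where motion turned lateral
    return None

# Descend from 100 mm to 80 mm, then slide sideways: designated position 80 mm.
path = [(0, 100), (0, 90), (0, 80), (5, 80), (10, 80)]
print(find_designated_position(path))  # → 80
```

The average or median variants mentioned in the text would simply collect all heights between the start and end of the lateral movement instead of returning the first one.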
- the calibration process of Modification 3 described above will be described with reference to the flowchart shown in FIG.
- the flowchart of FIG. 14 shows steps S121 to S129, and the subsequent steps are omitted.
- the processing after step S129 is the same as the processing after step S109 in the flowchart shown in FIG.
- step S126 the operation detector 13 detects the lateral movement of the user's finger.
- step S127 the detection reference control unit 204 determines that the finger has changed from the downward movement to the horizontal movement based on the detection output of the operation detector 13, and determines the finger position at the time of the change as the designated position 50A. Based on the specified position 50A, the position of the detection reference 40 is changed, the changed position data of the detection reference 40 is stored in the storage unit 205, and the process proceeds to step S128.
- step S128, the first calibration processing mode is terminated, and the process proceeds to step S129.
- step S129 the aerial image operation mode is started. In the aerial image operation mode, the detection reference is set based on the changed position data of the detection reference 40 read from the storage unit 205.
- In the calibration process of Modification 3, the user designates the position at which the finger is felt to have operated the aerial image display position by changing the finger from a downward movement to a lateral movement.
- the user designates the position perceived as having operated the display position of the icon 300B as the designated position and performs the calibration process, so that the calibration process can be performed accurately.
- designating the designated position by a change from the downward movement of the finger to the lateral movement has good operability and allows quick calibration processing.
- Modification 4 of the first embodiment: In the display device 1 according to Modification 3, the user designates the position at which the user thinks the display position of the icon was operated with the fingertip as the designated position by changing the finger from the downward movement to the lateral movement.
- In contrast, the display device 1 according to Modification 4 lets the user designate the position at which the user thinks the icon display position was operated with the fingertip by operating another icon.
- This calibration process will be described next. The description will be given of the case where the first calibration processing mode in the first embodiment is applied. However, the description can be applied to the second calibration processing mode and the first to third modifications.
- FIG. 15 shows an aerial image 300 of the display image for calibration.
- the aerial image 300 includes calibration icons 300C and 300D, and on the calibration icon 300C the message “Calibration will be performed. Point to the position of this icon with a finger of your right hand, and in that state touch the icon on the left side with a finger of your left hand.” is superimposed.
- the icons 300C and 300D are juxtaposed so that the icon 300C is positioned on the right side and the icon 300D is positioned on the left side.
- the fingertip of the right hand is moved downward toward the icon 300C, and the fingertip moves into the capacitance detection range 13A of the operation detector 13.
- the operation detector 13 detects the approach movement of the user's finger to the display position of the icon 300C, that is, the downward movement as a change in capacitance.
- When the user moves the finger further down and feels that the fingertip is operating the display position of the icon 300C of the aerial image 300, the user, following the message, moves the fingertip of the left hand toward the icon 300D in order to operate the display position of the icon 300D.
- the operation detector 13 detects the movement of the fingertip toward the icon 300D.
- the detection reference control unit 204 determines the position of the fingertip of the right hand as the designated position 50A.
- the detection reference control unit 204 changes the detection reference 40 based on the designated position 50A, and stores the changed position data of the detection reference 40 in the storage unit 205.
- Since the position of the right finger, which operates the display position of the right icon 300C, is determined as the designated position at the moment the user feels the icon has been operated, the right finger needs to move downward toward the aerial image. On the other hand, the left finger, which performs the operation on the display position of the left icon 300D, only needs to be positioned above or below the icon 300D; for example, the finger may be moved in a direction parallel to the plane of the aerial image 300, that is, horizontally, to a position above or below the icon 300D.
- two fingers of one hand may be used.
- Instead of operating the icon 300D, a determination button (not shown) provided on the display device 1 may be pressed.
- Alternatively, the user may perform a predetermined gesture with the left hand, for example, and the position of the fingertip of the right hand at the moment the gesture is detected may be determined as the designated position.
- In this case, the display device 1 includes the imaging device 18 (see FIG. 22) of Modification 8 described later, and detects, using images acquired by the imaging device 18, that the user's gesture has been performed (for example, that the hand has changed from a closed fist to an open palm).
- the calibration process described above will be described with reference to the flowchart shown in FIG.
- the flowchart of FIG. 16 shows steps S131 to S139, and the subsequent steps are omitted.
- the processing after step S139 is the same as the processing after step S109 in the flowchart shown in FIG.
- step S134 the operation detector 13 starts detecting the downward movement of the fingertip of the user's right hand.
- the user operates the display position of the icon 300D with the fingertip of the left hand.
- step S136 the position of the fingertip of the right hand when the left hand operates the display position of the icon 300D in step S135 is set as the designated position 50A, and the process proceeds to step S137.
- step S137 the detection reference control unit 204 changes the detection reference 40 based on the designated position 50A, stores the changed position data of the detection reference 40 in the storage unit 205, and proceeds to step S138.
- step S138 the first calibration processing mode is terminated, and the process proceeds to step S139.
- step S139 an aerial image operation mode is started.
- As described above, in Modification 4, the user designates the position at which the finger operated the icon in the calibration process by operating another icon or the determination button of the display device 1. Since the calibration process is performed by having the user designate the position at which the user perceived the icon to be operated, the calibration process can be performed accurately. In addition, designating the designated position by operating another icon or a button on the display device allows the calibration process to be performed quickly.
- Modification 5 of the first embodiment In the display device of the fifth modification, when the user thinks that the display position of the icon is operated with the fingertip, the user designates the designated position by stopping the finger for a predetermined time.
- Although this modification will be described with respect to the case where it is applied to the first calibration processing mode in the first embodiment, it can also be applied to the second calibration processing mode and the above-described Modifications 1 to 4.
- In Modification 5, the message “Calibration will be performed. Point to the position of this icon, and keep pointing at it for a while.” is superimposed on the icon included in the aerial image for calibration.
- the operation detector 13 detects that the downward movement of the finger has stopped for a predetermined time.
- the detection reference control unit 204 determines the stop position of the finger as the designated position based on the detection output of the operation detector 13 at this time. The designated position is determined as follows: it is determined that the operation to the display position of the icon 300A has been performed when the fingertip F, which has been moving downward, stops and stays within a relatively small predetermined stop range in the vertical direction for a predetermined time or more.
- The reason the operation of the fingertip F on the display position of the icon 300A is determined when the fingertip F remains within the predetermined stop range for a predetermined time or more is that, unlike an operation on a touch panel, an operation on the display position of the icon 300A of the aerial image 300 may not bring the fingertip F to a complete stop at the display position of the icon 300A.
- the predetermined stop range for determining the designated position is set to a value sufficiently smaller than the capacitance detection range 13A of the operation detector 13, for example 5 mm, and the predetermined time is set to, for example, about 2 seconds.
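The stop detection of Modification 5, the fingertip staying within about a 5 mm band for about 2 seconds, can be sketched as follows. The sliding-window formulation and the choice of the window mean as the designated position are assumptions of this illustration.

```python
def detect_designated_position(heights, dt, stop_range=5.0, hold_time=2.0):
    """Return the designated position once the fingertip has stayed
    within `stop_range` (mm) for at least `hold_time` (s), else None.

    heights: fingertip heights sampled every dt seconds.
    """
    need = max(2, int(round(hold_time / dt)))   # samples that must dwell
    for i in range(len(heights) - need + 1):
        window = heights[i:i + need]
        if max(window) - min(window) <= stop_range:
            return sum(window) / len(window)    # centre of the dwell band
    return None

# Rapid descent, then hovering near 49 mm for 2 s (4 samples at dt = 0.5 s).
trace = [100.0, 80.0, 60.0, 50.0, 49.0, 48.0, 50.0, 49.0]
print(detect_designated_position(trace, dt=0.5))  # → 49.25
```

The tolerance band is what accommodates the small vertical drift that, as the text notes, is unavoidable when there is no physical touch surface to stop the finger.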
- Modification 6 of the first embodiment: In the display device according to Modification 6, the user designates by voice the position at which he or she thinks the icon display position was operated with the fingertip.
- this modification will be described with respect to the case where it is applied to the first calibration processing mode in the first embodiment, it can also be applied to the second calibration processing mode and the above-described modifications 1 to 5.
- FIG. 17 is a block diagram showing the control unit 20, the display 11 and the operation detector 13 controlled by the control unit 20 in the configuration of the display device 1 of the present modification.
- the display device 1 includes the sound collector 14, and the control unit 20 includes a sound detection unit 208.
- the sound collector 14 collects sound around the display device 1 and outputs the sound as sound data to the sound detection unit 208.
- a commercially available microphone can be used as the sound collector 14.
- the voice detection unit 208 identifies the voice data from the sound collector 14 and determines whether or not the voice data corresponds to “high”.
- In FIG. 17, when the calibration unit 203 activates the first calibration processing mode, the image generation unit 201 generates display image data, and the display 11 displays the display image for calibration based on the display image data.
- FIG. 18 shows an aerial image 300 of the display image for calibration.
- the aerial image 300 includes a calibration icon 300E, and the message “Calibration will be performed. Please say “high” when you touch this icon.” is superimposed on the calibration icon 300E.
- In accordance with the message superimposed on the icon 300E, the user pushes the fingertip down toward the icon 300E in order to operate its display position, and says “high” when the user feels that the fingertip has touched the icon 300E.
- the operation detector 13 detects the downward movement of the fingertip, and the sound collector 14 collects this voice and outputs it to the voice detection unit 208 as voice data.
- When the voice detection unit 208 determines that the voice data corresponds to “high”, the detection reference control unit 204 determines the position of the fingertip detected by the operation detector 13 at that moment as the designated position 50A, changes the detection reference 40 based on the designated position 50A, and stores the position data of the changed detection reference 40 in the storage unit 205.
- the calibration process described above will be described with reference to the flowchart shown in FIG.
- the flowchart of FIG. 19 shows steps S141 to S149, and the subsequent steps are omitted.
- the processing after step S149 is the same as the processing after step S109 in the flowchart shown in FIG.
- In step S145, the voice detection unit 208 determines whether the user has uttered “high” based on the output from the sound collector 14. If the determination in step S145 is affirmative, that is, if it is determined that the user has touched the icon 300E and uttered “high”, the process proceeds to step S146. If a negative determination is made in step S145, the process waits until an affirmative determination is made. In step S146, the detection reference control unit 204 determines the position of the fingertip at the moment the voice detection unit 208 determined the voice “high” as the designated position 50A.
- step S147 the detection reference 40 is changed based on the designated position 50A, the position data of the changed detection reference 40 is stored in the storage unit 205, and the process proceeds to step S148.
- step S148 the first calibration processing mode is terminated, and the process proceeds to step S149.
- step S149 the aerial image operation mode is started.
- As described above, in Modification 6, the user designates by speaking the position of the finger at which the user thinks the icon display position was operated, that is, the designated position. Designating the designated position by utterance in this way allows the calibration process to be performed quickly.
- Note that the display device 1 need not include the sound collector 14; voice data acquired by an external sound collector may be input wirelessly or by wire, and the voice detection unit 208 may perform voice detection using the voice data input from the external sound collector.
- Modification 7 of the first embodiment: In the above description, the detection reference has been described as a single plane or a plane having a step; however, the detection reference may be a region having a thickness rather than a surface. The calibration process for a detection reference having such a region will be described below for the case where it is applied to the first calibration processing mode in the first embodiment, but it can also be applied to the second calibration processing mode and to Modifications 1 to 6 described above.
- the display device 1 of the present modification is the same device as the display device 1 described in the first embodiment, and its configuration is represented by the block diagram shown in FIG.
- FIG. 20A shows an example of the aerial image 30 displayed in the aerial image operation mode by the display device 1 according to this modification, and FIG. 20B schematically shows the positional relationship among the main body 10 or the operation detector 13, the aerial image 30, and the detection reference 40.
- the aerial image 30 illustrated in FIG. 20A is the same as the aerial image 30 illustrated in FIG.
- the detection reference 40 is set as a region sandwiched between the upper surface 401 and the lower surface 402 and having a thickness d2 in the vertical direction.
- the aerial image 30 is formed at a position separated by a distance H1 above the operation detector 13 of the display device 1, and the upper surface 401 and the lower surface 402 of the detection reference 40 are set at distances H3 and H4, respectively, where H1 < H3 < H4; the thickness is thus d2 = H4 − H3.
- the aerial image 30 and the detection reference 40 are set so as to be positioned within the capacitance detection range 13A.
- the detection reference 40 is set above the aerial image 30 in FIG. 3B, but it may be set below the aerial image 30 as long as it is within the capacitance detection range 13A of the operation detector 13, or the position of the aerial image 30 may be set to be included in the region of thickness d2.
- When the user's fingertip enters the detection reference 40, the operation detector 13 outputs a detection output corresponding to a distance between H3 and H4 according to the position of the fingertip. When the detection output of the operation detector 13 corresponds to a distance between H3 and H4, the detection reference control unit 204 determines that the user has operated the display position of the icon 30A of the aerial image 30, and the display device 1 executes a function corresponding to the operated icon 30A; for example, the display image on the display 11 is switched.
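The advantage of a detection reference with thickness can be illustrated with sampled fingertip distances: a discretely sampled descent can step over a single plane without any sample ever coinciding with it, whereas a region [H3, H4] is entered as long as one sample falls inside it. The sampling-based framing and the numeric values below are assumptions of this sketch.

```python
def entered_detection_region(distances, h3, h4):
    """True if any sampled fingertip distance falls inside the
    detection reference region between distances h3 and h4."""
    return any(h3 <= d <= h4 for d in distances)

# Fingertip distances from the operation detector, one per frame.
descent = [5.0, 4.2, 3.4, 2.6]

# A planar reference at 3.8 is skipped between the 4.2 and 3.4 samples...
print(any(d == 3.8 for d in descent))               # → False
# ...but a region of thickness d2 = 0.6 spanning [3.3, 3.9] is detected.
print(entered_detection_region(descent, 3.3, 3.9))  # → True
```

In practice a crossing detector would also compare consecutive samples, but the region test alone already shows why a thickness d2 makes the entry determination more robust.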
- Since the detection reference control unit 204 determines that the finger has operated the display position of the icon 30A regardless of where the finger is positioned within the thickness d2 of the detection reference 40, the operation to the display position can be determined more reliably. For example, when the finger is moved downward not from directly above the icon 30A but from obliquely above, and the detection reference 40 is a plane, it may happen that the finger passes not through the portion of the detection reference 40 corresponding to the icon 30A but through a portion beside it, so that the operation of the finger to the display position of the icon 30A cannot be determined.
- In contrast, since the display device of Modification 7 has the detection reference 40 with the thickness d2, it can reliably detect that the finger has entered the detection reference 40 even in such a case.
- The first calibration processing mode for the display device 1 of Modification 7 having such a detection reference 40 of thickness d2 will be described below; the same applies to the second calibration processing mode for the detection reference 40 having the thickness d2. In the following, description of the parts that are the same as in the first embodiment is omitted.
- the aerial image 300 including the icon 300A shown in FIG. 4 is displayed, and the detection reference 40 is initialized.
- the fingertip F is moved downward toward the icon 300A, and the fingertip F enters the capacitance detection range 13A of the operation detector 13.
- the operation detector 13 detects the approaching movement of the user's finger F to the icon 300A, that is, the downward movement as a change in capacitance.
- the operation detector 13 detects the downward movement of the fingertip F, that is, the depression and the subsequent upward movement of a predetermined distance as a change in capacitance. Based on the change in capacitance, the detection reference control unit 204 determines the arrival position 50 or the designated position 50A described above.
- The detection reference control unit 204 changes the detection reference 40 to a detection reference 40 that is a three-dimensional region having the thickness d2, based on the arrival position 50 or the designated position 50A. The detection reference 40 of thickness d2 is set so as to include the arrival position 50 or the designated position 50A, but it may instead be set at a position above or below the arrival position 50 or the designated position 50A.
- For example, a three-dimensional region sandwiched between the arrival position 50 or the designated position 50A and a position separated from it by a predetermined distance in the direction opposite to the pressing direction of the user may be set as the detection reference 40 of thickness d2.
- the detection reference control unit 204 stores the position data of the detection reference 40 of the thickness d2 in the storage unit 205. Thereafter, when the aerial image operation mode is executed, the detection reference 40 of the thickness d2 is set based on the position data stored in the storage unit 205.
- In the first embodiment, the detection reference control unit 204 sets the detection reference 40 as a single surface based on the arrival position 50; Modification 7 differs in that the detection reference 40 of thickness d2 is set based on the arrival position 50 or the designated position 50A.
- It is also effective to perform the calibration process so that the detection reference 40 is set with the surface containing the arrival position 50 or the designated position 50A of the user's fingertip midway between the upper surface 401 and the lower surface 402 of the detection reference 40. In addition, as long as the fingertip is positioned above the calibration processing icon, the arrival position or designated position can be determined even if the user's finger moves obliquely, that is, at an angle with respect to the Z direction.
- the display device 1 of Modification 8 includes an imaging device (for example, a digital camera) 18 as an operation detector, and the imaging device 18 is disposed on the upper surface of the display device 1. A block diagram of such a display device 1 is shown in FIG.
- the display device 1 whose block diagram is shown in FIG. 23 includes an image analysis unit 209 in the control unit 20.
- the imaging device 18 images an object positioned above the display 11, that is, a user's finger, and the captured image is input to the image analysis unit 209.
- the image analysis unit 209 analyzes the captured image input from the imaging device 18 and obtains the position of the user's fingertip. That is, the image analysis unit 209 determines from the position of the finger image in the captured image which of the plurality of icons is being operated by the user's fingertip. Further, the image analysis unit 209 compares the size of the finger image in the captured image with a reference finger size, specifically the size of the finger imaged in advance at a predetermined height, to determine the height position of the finger, that is, how far the finger has descended. The position of the user's fingertip in three-dimensional space is thereby obtained.
- the display device 1 of Modification 8 can thus obtain, by analyzing the image captured by the imaging device 18, the same information as the fingertip position information obtained by the operation detector 13 using the capacitance panel. Therefore, the display device of Modification 8 can use the imaging device 18 in place of the capacitance panel described in the above-described embodiment and various modifications, and can perform the same processing as in that embodiment and those modifications.
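The size-comparison step performed by the image analysis unit 209 can be illustrated with simple pinhole-camera proportionality: the apparent size of the finger in the image is inversely proportional to its distance from the camera. The sketch below is an illustration under that assumption; the reference values are invented, not taken from the specification.

```python
# Hedged sketch of how the image analysis unit 209 might infer the fingertip
# distance from the apparent finger size. Pinhole model: apparent size is
# inversely proportional to distance, so
#   distance = ref_distance * ref_size / observed_size.

def finger_distance_from_camera(size_px, ref_size_px, ref_distance_mm):
    """Distance of the finger from the camera, from its apparent width."""
    return ref_distance_mm * ref_size_px / size_px

# Assumed reference: a finger imaged at 100 mm from the camera appears 80 px wide.
d = finger_distance_from_camera(size_px=40.0, ref_size_px=80.0, ref_distance_mm=100.0)
# a finger appearing half as wide is twice as far from the camera: 200.0 mm
```

Combining this distance with the in-image (X, Y) position of the finger gives the three-dimensional fingertip position described in the text.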
- in Modification 8, the image analysis unit 209 calculates the finger height position from the size of the finger in the captured image; instead, the finger height position can also be detected by a phase difference focus detection device and an image recognition device of the kind mounted on a digital camera. Specifically, the image recognition device recognizes the finger, the phase difference focus detection device detects the defocus amount for the recognized finger, and the finger height position is calculated from that defocus amount.
- the height position of the finger can be detected in the same manner by using a contrast detection type focus detection device mounted on a digital camera instead of the phase difference type focus detection device.
- a camera equipped with a TOF (Time-of-Flight) device can be suitably used instead of the phase difference type focus detection device or the contrast detection type focus detection device.
- the TOF camera emits infrared light from the camera body, receives the infrared light reflected by the object and incident on the TOF camera, and calculates the distance from the TOF camera to the object based on the phase change between the emitted light and the received light. Therefore, by taking the user's fingertip as the measurement object, emitting infrared light from the TOF camera toward the fingertip, and receiving the light reflected from the fingertip, the distance from the TOF camera to the user's fingertip can be obtained.
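The phase-based ranging principle described above can be written as d = c·Δφ/(4πf), where Δφ is the measured phase shift between emitted and received light and f is the modulation frequency of the emitted infrared light (the factor 4π rather than 2π accounts for the round trip). A minimal sketch, with an assumed example modulation frequency:

```python
import math

# Illustrative sketch of the TOF ranging principle: round-trip time is
# phase_shift / (2*pi*f), and distance is half the round trip times c.
# The 10 MHz modulation frequency below is an assumed example value.

C = 299_792_458.0  # speed of light, m/s

def tof_distance(phase_shift_rad, mod_freq_hz):
    """Distance to the target from the measured phase shift."""
    return C * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# a pi/2 phase shift at 10 MHz modulation corresponds to roughly 3.75 m
d = tof_distance(math.pi / 2, 10e6)
```

Note that the measurable phase wraps at 2π, so a real TOF camera's unambiguous range is limited by its modulation frequency; this sketch ignores that wrapping.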
- the imaging device 18 preferably uses a wide-angle lens as its imaging lens so as to cover the entire aerial image 30, and may use a fish-eye lens. Further, a plurality of (for example, two) imaging devices may be mounted, and the position of the user's fingertip may be detected from their captured images.
- FIG. 24 shows only the internal configuration of the display device 1, and the main body of the display device is omitted.
- a space for disposing the TOF camera 118 ′ is provided in the center of the display 11 and the imaging optical element 12, and the TOF camera 118 ′ is disposed in the space.
- the TOF camera 118 ′ scans infrared rays within a predetermined range to irradiate the user's fingertip with infrared rays, and measures the distance from the TOF camera 118 ′ to the user's fingertip based on the phase change of the reflected light.
- the position of the user's fingertip with respect to the TOF camera 118 ′ in the three-dimensional space can be obtained.
- the same information as the detection information of the fingertip position by the capacitance panel can be obtained from the distance measurement result by the TOF camera 118 ′.
- a space for arranging the TOF camera 118 ′ is provided in the central portion of the display 11 and the imaging optical system 12, and the TOF camera 118 ′ is arranged in the space.
- the configuration is not limited thereto, and the TOF camera 118′ may be installed outside the display 11.
- as in the first embodiment, the aerial image 30 is formed at a position separated by a distance H1 above the imaging optical system 12 of the display device 1, and the detection reference 40 is set at a position above the imaging optical system 12 by a distance H2 (H1 < H2).
- the imaging device 18 has a detection range 13A for detecting the position of the user's fingertip above the surface of the imaging optical system 12.
- the limit of the range that can be imaged above the imaging device 18 is indicated by a broken line 13a, and the interval between the detection limit 13a and the surface of the imaging optical system 12 is indicated as a detection range 13A.
- the aerial image 30 and the detection reference 40 are set so as to be positioned within the detection range 13A.
- although the detection reference 40 is set above the aerial image 30 in FIG. 24, it may be set below the aerial image 30 or at the position of the aerial image 30 as long as it is within the detection range 13A.
- the range other than the region set as the detection reference 40 in the detection range 13A is outside the detection reference 41.
- the detection range 13A is not limited to the limit of the range that can be imaged by the imaging device 18; it may be set as a part of that range (for example, the range excluding predetermined end portions in the left-right direction in FIG. 24).
- the display device 1 of Modification 8 includes the imaging device 18 instead of the capacitance panel 13 as an operation detector.
- the display device 1 may include both the operation detector 13 and the imaging device 18.
- for example, the detection range 13A of the operation detector 13 shown in FIG. 3 may be divided vertically into two: a lower detection range (the detection range closer to the display 11) assigned to the capacitance panel 13, and an upper detection range (the detection range farther from the display 11) assigned to the imaging device 18.
- with this division, the imaging device 18 detects the first half of the finger's descending movement, and the capacitance panel 13 detects the second half.
- the capacitance panel 13 can detect the range immediately above itself with high accuracy, whereas it may be difficult for the imaging device 18 to image that immediate vicinity, so it is preferable to divide the detection range between the capacitance panel 13 and the imaging device 18 as described above.
- the division of the detection range 13A into two is not limited to the case where the detection range 13A is equally divided into upper and lower parts, and may be divided into unequal parts.
- the operation detector 13 is not limited to the capacitance panel 13 and the imaging device 18, and other proximity sensors can be used. Therefore, when the detection range 13A is divided, the various operation detectors 13 can share the divided detection ranges.
- the speed / acceleration detection unit 206 shown in FIG. 9 can also calculate the moving speed and acceleration of the finger based on the captured image of the imaging device 18 of FIG. Therefore, when the detection range 13A is divided, the finger movement speed and acceleration are calculated for each of the upper and lower detection ranges, and the stop position prediction unit 207 can also predict the finger arrival position.
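As a rough illustration of how the speed/acceleration detection unit 206 and the stop position prediction unit 207 could cooperate, the sketch below estimates the finger's descent velocity from two height samples and extrapolates the stop position under a constant-deceleration assumption. The model and all numbers are illustrative, not the method claimed in the specification.

```python
# Hedged sketch of stop-position prediction: velocity from successive height
# samples, then extrapolation assuming the finger decelerates at a constant
# rate until it stops. Heights in mm, time in seconds (assumed units).

def finger_velocity(z_prev, z_now, dt):
    """Downward velocity (positive while descending) from two height samples."""
    return (z_prev - z_now) / dt

def predict_stop_height(z_now, v, decel):
    """With constant deceleration, remaining travel is v**2 / (2 * decel)."""
    return z_now - v * v / (2.0 * decel)

v = finger_velocity(z_prev=120.0, z_now=110.0, dt=0.05)        # 200 mm/s
z_stop = predict_stop_height(z_now=110.0, v=v, decel=2000.0)   # decel in mm/s^2
# predicted arrival position: 110 - 200**2 / (2 * 2000) = 100.0 mm
```

When the detection range 13A is divided as described above, such an estimate could be produced separately from the samples of each detector.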
- the display device 1 including at least the control unit 20, the display unit 11, and the operation detector 13 has been described as an example.
- however, a detection device constituted only by the control unit 20, or a detection device constituted by the control unit 20 and the operation detector 13, may also be used.
- the control unit 20 may include at least the calibration unit 203 and the detection reference control unit 204.
- a configuration may be appropriately added from the above configuration as necessary.
- a display device 1 according to a second embodiment will be described with reference to the drawings.
- the display device 1 of the present embodiment is incorporated in a mobile phone will be described as an example.
- the display device of this embodiment is not limited to a mobile phone, but can be incorporated in electronic devices such as a portable information terminal device such as a tablet terminal and a wristwatch terminal, a personal computer, a music player, a fixed phone, and a wearable device. Is possible.
- the display device 1 of the present embodiment is the same as the display device 1 shown in FIG. 1, and the configuration of the main part is represented by the block diagram shown in FIG. That is, the control unit 20, the display 11 and the operation detector 13 controlled by the control unit 20 are illustrated.
- the control unit 20 includes an image generation unit 201, a display control unit 202, a calibration unit 203, a detection reference control unit 204, a storage unit 205, and a user information analysis unit 210.
- the main configuration of the control unit 20 is the same as that of the display device 1 according to the first embodiment except that the user information analysis unit 210 is provided.
- the detection reference control unit 204 initializes the detection reference, and changes the detection reference based on the result of calibration processing described later.
- the user information analysis unit 210 analyzes information regarding the input user.
- the detection reference control unit 204 changes the detection reference based on information input from the user information analysis unit 210 during the calibration process.
- the flowchart in FIG. 26 shows steps S201 to S207, and the subsequent steps are omitted.
- the processing after step S207 is the same as the processing after step S109 in the flowchart shown in FIG.
- Each process shown in the flowchart of FIG. 26 is performed by executing a program by the control unit 20 after the display device 1 is activated. This program is stored in the storage unit 205.
- step S201 it is determined whether or not the user has operated the operation button in the user information input mode. If an affirmative determination is made in step S201, that is, if it is determined that the user has selected the user information input mode, the process proceeds to step S202. If a negative determination is made in step S201, that is, if it is determined that the user has not selected the user information input mode, the process proceeds to step S206.
- step S202 the user information input mode is started, and the process proceeds to step S203.
- step S203 it is determined whether the input of user information has been completed. This determination is made based on, for example, whether or not the user has operated a button for instructing the end of user information input. If an affirmative determination is made in step S203, that is, if the user instructs the end of user information input, the process proceeds to step S204. If a negative determination is made in step S203, the process waits until an affirmative determination is made.
- in step S204, the user information analysis unit 210 changes the detection reference 40 initially set for the aerial image operation mode based on the input user information, stores the changed position data of the detection reference 40 in the storage unit 205, and the process proceeds to step S205.
- the detection reference 40 is changed to a position above the arrival position 50 by a predetermined distance d1.
- step S205 the user information input mode is terminated, and the process proceeds to step S206.
- step S206 the aerial image operation mode is started.
- the user information includes, for example, at least one of, or a combination of, the user's sex, age, body type (height and arm length), and visual acuity.
- the storage unit 205 stores in advance a plurality of tables of arrival positions 50, using one or a combination of the factors of sex, age, body type (height), and visual acuity as parameters.
- the user information analysis unit 210 selects a corresponding table based on the type and contents of the input user information, and selects a corresponding arrival position 50 from the table.
- the detection reference control unit 204 determines the detection reference 40 based on the selected arrival position 50.
- the arrival positions 50 stored in the tables are set such that, for example, a woman reaches a position closer to the operation detector 13 than a man, a younger person closer than an older person, and a shorter person closer than a taller person.
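The table lookup described above might be organized as follows. The attribute keys, the age threshold, and the millimeter values are invented for illustration; only the ordering (women, younger users, and shorter users reaching closer to the operation detector 13) follows the text.

```python
# Hypothetical sketch of the arrival-position tables held in the storage
# unit 205 and the selection performed by the user information analysis
# unit 210. All numeric values and keys are illustrative assumptions.

ARRIVAL_TABLE_MM = {
    # (sex, age_group) -> expected arrival height above the operation detector
    ("female", "young"): 48.0,
    ("female", "senior"): 44.0,
    ("male", "young"): 52.0,
    ("male", "senior"): 46.0,
}

def select_arrival_position(sex, age):
    """Pick the expected arrival position 50 from the input user information."""
    group = "young" if age < 60 else "senior"
    return ARRIVAL_TABLE_MM[(sex, group)]

def detection_reference_from_arrival(arrival_mm, d1_mm=5.0):
    """As in the text, set the detection reference a distance d1 above the
    expected arrival position."""
    return arrival_mm + d1_mm

ref = detection_reference_from_arrival(select_arrival_position("female", 35))
```

The detection reference control unit 204 would then store this position in place of the initial setting, exactly as in the measured-arrival-position calibration of the first embodiment.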
- in step S201 of the flowchart of FIG. 26, it is determined whether or not the user has operated the operation button for the user information input mode; however, this process is not necessarily required, and if the apparatus has already acquired the user information, the process may proceed directly to step S204.
- information such as the user's sex, age, body type (height and arm length), and visual acuity may be stored in the storage unit 205 in association with an ID (identification code) or a password that identifies the user.
- as described above, the user information, which is one piece of information related to the user, is used to change the detection reference 40 for that user, thereby changing the positional relationship between the detection reference 40 and the display position of the aerial image 300.
- the user may be identified by photographing the user with the imaging device 18 described in the modification 8 of the first embodiment and analyzing the captured image. For example, the user's age and sex are specified from the captured image using a known face recognition technique.
- the detection reference control unit 204 sets the detection reference 40 based on information such as the user's sex and age. In this way, the user can be spared from entering an ID or a password.
- in this case, the information that identifies the user, which is one piece of information related to the user, is used to change the detection reference 40 for that user, thereby changing the positional relationship between the detection reference 40 and the display position of the aerial image 300.
- the second embodiment can be modified as follows.
- the user information may be input to an information input device separate from the display device 1 instead of to the display device 1 itself, and transferred to the display device 1 via an interface.
- the user information may be recorded in advance on the IC card.
- the display device 1 or the information input device is provided with a card information reading function.
- in the second embodiment, the display device 1 including at least the control unit 20, the display 11, and the operation detector 13 has been described as an example.
- the control unit 20 may include at least the calibration unit 203, the detection reference control unit 204, and the user information analysis unit 210.
- a display device 1 according to a third embodiment will be described with reference to the drawings.
- the display device 1 of the present embodiment is incorporated in a mobile phone will be described as an example.
- the display device of this embodiment is not limited to a mobile phone, but can be incorporated in an electronic device such as a portable information terminal device such as a tablet terminal or a wristwatch terminal, a personal computer, a music player, a fixed phone, or a wearable device. Is possible.
- the display device 1 of the present embodiment is the same as the display device 1 shown in FIG. 1, and the configuration of the main part is represented by the block diagram shown in FIG.
- the display device 1 includes a control unit 20, a display 11 that is controlled by the control unit 20, an operation detector 13, and an environment detection unit 19.
- the environment detection unit 19 detects the usage environment around the display device 1.
- the control unit 20 includes an image generation unit 201, a display control unit 202, a calibration unit 203, a detection reference control unit 204, a storage unit 205, and an environment analysis unit 211.
- the environment analysis unit 211 analyzes the environment information input from the environment detection unit 19, determines whether there has been an environment change, and outputs the environment change information to the detection reference control unit 204 when the environment has changed.
- the detection reference control unit 204 executes calibration processing of the detection reference based on the environment change information input from the environment analysis unit 211.
- the calibration process of the present embodiment is executed in parallel with the execution of the aerial image operation mode.
- the calibration process of the present embodiment will be described with reference to the flowchart shown in FIG.
- Each process shown in the flowchart of FIG. 28 is performed by executing a program by the control unit 20 after the display device 1 is activated. This program is stored in the storage unit 205.
- step S211 the aerial image operation mode is started, and the process proceeds to step S212.
- step S212 the aerial image 30 including the aerial image operation mode icon 30A shown in FIG. 3 is displayed, and the detection reference control unit 204 sets the detection reference 40 to a predetermined initial position. Proceed to step S213.
- in step S213, the environment analysis unit 211 determines whether there has been an environmental change based on the environment information regarding the usage environment detected by the environment detection unit 19. If an affirmative determination is made in step S213, that is, if it is determined that there has been an environmental change, the process proceeds to step S214. If a negative determination is made in step S213, that is, if it is determined that there has been no environmental change, the process proceeds to step S216.
- step S214 the environmental change information is output to detection criterion control unit 204, and the process proceeds to step S215.
- in step S215, the detection reference control unit 204 changes the detection reference for the aerial image operation mode based on the environment change information, and the process proceeds to step S216, in which the aerial image operation mode is continued.
- Environmental information includes temperature, humidity, brightness, etc.
- the reason for calibrating the detection reference based on these environmental changes is as follows.
- when the display device 1 is used and the temperature of the display device 1 or its surroundings rises, a fixing member (not shown) that fixes the display 11, the imaging optical system 12, and the like inside the display device 1 expands, and accordingly the distance between the display 11 and the imaging optical system 12 increases.
- one cause of a temperature change in the vicinity of the aerial image 30 attributable to the user is that the temperature of the display device rises while the user holds it.
- as a result, the position where the aerial image 30 is generated becomes closer to the user than before the temperature rise.
- likewise, when the brightness in the vicinity of the aerial image 30 changes, the appearance of the aerial image 30 to the user changes.
- for example, the aerial image 30 may be perceived as being farther away than it was before the brightness changed.
- such environmental changes readily occur when the display device is a device that can be held in the user's hand, such as a mobile phone.
- Examples of the environment detection unit 19 include a temperature sensor, a humidity sensor, and a brightness sensor provided in the main body 10 of the display device 1.
- as the brightness sensor, the photometric function of a camera can also be used.
- the storage unit 205 stores in advance a plurality of tables of correction values for the detection reference, using one or a combination of the factors of temperature change and humidity change inside or near the display device 1 and brightness change near the aerial image 30 as parameters.
- when the environment analysis unit 211 determines that the environment has changed, it selects the corresponding table according to the changed factor, and selects from that table the correction value for the detection reference that corresponds to the amount of change.
- the detection reference control unit 204 changes the detection reference 40 based on the selected correction value.
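The environment-driven calibration loop (steps S213 to S215) could be sketched as below. The factors mirror those named in the text (temperature, humidity, brightness), while the thresholds and correction offsets are illustrative assumptions, not values from the specification.

```python
# Hedged sketch of the environment-based calibration: correction values for
# the detection reference are kept per environmental factor, and the one
# matching the detected change is applied. All numbers are assumptions.

CORRECTION_TABLE_MM = {
    "temperature_rise": +2.0,   # aerial image forms closer to the user
    "brightness_drop": -1.5,    # image is perceived as farther away
    "humidity_rise": +0.5,
}

def detect_changes(prev, now, thresholds):
    """Compare two environment readings and report which factors changed
    (the role of the environment analysis unit 211)."""
    changes = []
    if now["temp_c"] - prev["temp_c"] > thresholds["temp_c"]:
        changes.append("temperature_rise")
    if prev["lux"] - now["lux"] > thresholds["lux"]:
        changes.append("brightness_drop")
    if now["rh"] - prev["rh"] > thresholds["rh"]:
        changes.append("humidity_rise")
    return changes

def calibrate(reference_mm, changes):
    """Apply the stored correction values (the role of the detection
    reference control unit 204)."""
    for c in changes:
        reference_mm += CORRECTION_TABLE_MM[c]
    return reference_mm

prev = {"temp_c": 22.0, "lux": 300.0, "rh": 40.0}
now = {"temp_c": 27.0, "lux": 290.0, "rh": 41.0}
thresholds = {"temp_c": 3.0, "lux": 50.0, "rh": 5.0}
ref = calibrate(50.0, detect_changes(prev, now, thresholds))
# only the temperature threshold is exceeded, so the reference moves to 52.0
```

Because this runs in parallel with the aerial image operation mode, the reference would be re-checked each time a new environment reading arrives.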
- as described above, the detection reference 40 is changed for the user based on information on changes in the surrounding environment caused by the user, which is one piece of information related to the user, thereby changing the positional relationship between the detection reference 40 and the display position of the aerial image 300.
- the detection output of the environment detection unit 19 can also be used when the detection reference is changed in the first or second calibration processing mode of the first embodiment or of Modifications 1 to 8 of the first embodiment. That is, in the above-described first or second calibration processing mode, instead of changing the detection reference based only on the arrival position of the user's finger, the detection reference may be changed based on both the arrival position or designated position of the user's finger and the detection output of the environment detection unit 19.
- the environmental changes described above are not limited to ambient environmental changes caused by the user, but changes due to sunlight entering through windows, changes in humidity due to weather, and temperature rise of the device itself as the display device continues to be activated. It is also possible to detect various environmental changes such as the above, change the detection reference based on the detection result, and change the positional relationship between the detection reference and the aerial image.
- in the third embodiment, the display device 1 including at least the control unit 20, the display 11, and the operation detector 13 has been described as an example.
- the control unit 20 may include at least the calibration unit 203, the detection reference control unit 204, and the environment analysis unit 211.
- a configuration may be appropriately added as necessary from the above-described configuration.
- FIG. 29 is a diagram illustrating the display device according to the present embodiment
- FIG. 29A is a cross-sectional view illustrating a schematic configuration of the display device 100 according to the fourth embodiment.
- FIG. 29B is a perspective view of an automated teller machine (ATM) 200 as an example of an electronic device in which the display device 100 is incorporated.
- in the automated teller machine 200, the display device 100 is mounted on the front panel, on which the user inputs a personal identification number, an amount of money, and the like.
- the display device 100 is not limited to automated teller machines, and can be widely incorporated in ticket vending machines that issue train and bus tickets and commuter passes, and in various information search terminal devices in libraries, art museums, and the like.
- a coordinate system composed of the X axis, the Y axis, and the Z axis is set for the display device 100 as shown.
- the display device 100 includes a display 111, an imaging optical system 112, and an operation detector 113 in a main body (not shown).
- the display 111 provided inside the main body is composed of, for example, a liquid crystal element, an organic EL element, or the like, and has a plurality of display pixels arranged in a two-dimensional shape.
- the display 111 is controlled by a control unit (not shown) similar to the control unit 20 of the display device 1 according to the first embodiment, and displays an image corresponding to the display image data.
- the imaging optical system 112 is arranged in a predetermined positional relationship with the display 111.
- the imaging optical system 112 can have a configuration in which two elements in which two types of band-like reflecting portions are arranged in parallel at a constant interval are stacked inside a transparent substrate.
- FIG. 30 is a block diagram showing the control unit 20, the display unit 111 and the operation detector 113 controlled by the control unit 20 in the configuration of the display device 100.
- the display device 100 is substantially the same as the display device 1 shown in the block diagram of FIG. 2, except that the configurations of the display 111 and the operation detector 113 differ from those of the display 11 and the operation detector 13.
- the control unit 20 includes a CPU, a ROM, a RAM, and the like, and includes an arithmetic circuit that controls the various components of the display device 100, including the display 111 and the operation detector 113, based on a control program, and executes various kinds of data processing.
- the control unit 20 includes an image generation unit 201, a display control unit 202, a calibration unit 203, a detection reference control unit 204, and a storage unit 205.
- the storage unit 205 includes a non-volatile memory that stores the control program, a storage medium that stores image data displayed on the display 111, and the like.
- the imaging optical system 112 deflects the light beam emitted from the image displayed on the display unit 111 corresponding to the display image data, and generates an aerial image 30 including icons as shown in FIG.
- the operation detector 113 is provided in the vicinity of the aerial image 30 so as to surround the aerial image 30.
- FIG. 31 shows a plan view of the operation detector 113.
- the operation detector 113 includes a frame-shaped housing 115 having a rectangular cross section parallel to the XY plane. Of the four surfaces constituting the inner surface of the housing 115, a plurality of light projecting elements 116 are disposed on two adjacent surfaces, and a plurality of light receiving elements 117 are disposed on the remaining two adjacent surfaces.
- as shown in FIG. 31, of the pair of inner surfaces of the frame-shaped housing 115 parallel to the ZX plane, only the light projecting elements 116 are provided on the inner surface ZX1, and only the light receiving elements 117 on the inner surface ZX2. Similarly, of the pair of inner surfaces parallel to the YZ plane, only the light projecting elements 116 are provided on the inner surface YZ1, and only the light receiving elements 117 on the inner surface YZ2. That is, the light projecting elements 116 and the light receiving elements 117 are provided so as to face each other.
- as the light projecting element 116, a commercially available laser or LED element can be used; as the light receiving element 117, a commercially available photodiode or phototransistor can be used. It is also possible to use a light projecting/receiving element in which the light projecting element and the light receiving element are integrated; in this case, light projecting/receiving elements are disposed in place of the light projecting elements, and mirrors are disposed in place of the light receiving elements.
- the light projecting element 116 and the light receiving element 117 are regularly arranged in an array so as to have a one-to-one correspondence relationship, and light emitted from one light projecting element 116 corresponds to one corresponding light receiving element. It is configured to be incident only on 117.
- the light beam emitted from each light projecting element 116 travels in a plane parallel to the aerial image 30 (that is, in a plane parallel to the XY plane) and enters the corresponding light receiving element 117.
- the detection state of light by the light receiving element 117 is sent to the control unit, and the control unit grasps the detection state of the light receiving element 117 in correspondence with the position of the light receiving element 117.
- a plurality of two-dimensional lattice-like optical path groups parallel to the XY plane are formed inside the housing 115.
- the wavelength of light emitted from the light projecting element 116 is preferably in the infrared region.
- FIG. 32 shows a configuration in which six light projecting elements 116 and six light receiving elements 117 are arranged in the Z direction for the sake of simplicity.
- the six-stage light projecting elements 116 are 116a, 116b, 116c, 116d, 116e, and 116f, respectively, from the Z direction + side.
- the six-stage light receiving elements 117 are respectively referred to as 117a, 117b, 117c, 117d, 117e, and 117f from the Z direction + side.
- the operation detector 113 has a detection range between the light projecting elements 116a and 116f, and this detection range corresponds to the detection range 13A of FIG. 3 in the first embodiment.
- the display control unit 202 generates the aerial image 30 at a position separated from the display device 100.
- the aerial image 30 includes an icon 30A, and its position in the up-down direction coincides with the plane including the light projecting element 116d and the light receiving element 117d.
- the detection reference control unit 204 sets the detection reference 40 to a predetermined initial position.
- the detection reference 40 is set to a surface including the light projecting element 116c and the light receiving element 117c.
- in order to detect the fingertip position finely, it is preferable that a large number of light projecting elements 116 and light receiving elements 117 be arranged in the vertical direction.
- when the user moves the fingertip downward toward the icon 30A and the fingertip reaches the detection limit of the operation detector 113 (here, the plane including the light projecting element 116a and the light receiving element 117a), the operation detector 113 detects the approach of the user's fingertip based on the output of the light receiving elements 117.
- the detection reference control unit 204 determines that the downward movement of the fingertip F has stopped, that is, that an operation on the display position of the icon 30A has been performed, when light is not detected by the light receiving elements 117a and 117b and light is detected by the light receiving element 117c for a predetermined time or longer. At this time, the plane including the light projecting element 116b and the light receiving element 117b is set as the arrival position 50 at which the fingertip F stopped moving while operating the display position of the icon 30A.
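The arrival determination described above, in which the lowest interrupted light-receiving row marks the fingertip and a row that stays unchanged for a predetermined time is taken as the arrival position, can be sketched as follows. Row labels follow the text (a at the Z+ top, f at the bottom); the hold count standing in for "a predetermined time or longer" is an assumed parameter.

```python
# Illustrative sketch of the fourth embodiment's arrival determination from
# the blocked-beam pattern of the light receiving elements 117a-117f.

ROWS = ["a", "b", "c", "d", "e", "f"]  # 117a (Z+ side, top) .. 117f (bottom)

def fingertip_row(blocked):
    """The lowest interrupted row is where the fingertip currently is."""
    hit = [r for r in ROWS if r in blocked]
    return hit[-1] if hit else None

def arrival_row(samples, hold_count=3):
    """Arrival: the fingertip row is unchanged for hold_count samples."""
    rows = [fingertip_row(s) for s in samples]
    if len(rows) >= hold_count and len(set(rows[-hold_count:])) == 1:
        return rows[-1]
    return None

# Light is blocked at rows a and b for three successive samples: the
# fingertip has stopped at row b, which becomes the arrival position 50.
samples = [{"a"}, {"a", "b"}, {"a", "b"}, {"a", "b"}]
```

With a denser stack of projector/receiver pairs, the same logic yields a finer-grained arrival position, which is why the text recommends a large number of elements in the vertical direction.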
- the detection reference control unit 204 changes the position of the initially set detection reference 40 to, for example, the arrival position 50, and stores the changed position data of the detection reference 40 in the storage unit 205 of FIG. 30.
- in this manner, the detection reference 40 suitable for the user is set based on the arrival position 50. That is, among the plurality of light projecting elements 116 and light receiving elements 117 installed in advance, the plane including the light projecting element 116c and the light receiving element 117c is initially set as the detection reference 40, and the detection reference 40 is changed based on the arrival position by the calibration process, whereby the positional relationship between the detection reference 40 and the display position of the aerial image 30 is changed.
- for example, the position of the detection reference 40 is changed by the calibration process to the position of the arrival position 50; that is, among the plurality of light projecting elements 116 and light receiving elements 117, it may be changed to a plane including the light projecting element 116b and the light receiving element 117b.
- the initially set detection reference can be changed to a position above the arrival position based on the arrival position, or can be changed to a position below the arrival position.
- the detection reference can also be changed to a detection reference having a width d2.
- for example, the light projecting element 116a and the light receiving element 117a above the arrival position 50 are selected, and the surface including them is set as the upper surface 401; the light projecting element 116c and the light receiving element 117c below the arrival position 50 are selected, and the surface including them is set as the lower surface 402. That is, among the plurality of light projecting elements 116 and light receiving elements 117, the upper surface 401 may be set by a surface including one set of the light projecting element 116 and the light receiving element 117, and the lower surface 402 may be set by a surface including another set of the light projecting element 116 and the light receiving element 117.
- in other words, one set or a plurality of sets of the light projecting elements 116 and light receiving elements 117 are selected based on the arrival position from among the plurality of detection references that can be set by the plurality of light projecting elements 116 and light receiving elements 117, and the position of the detection reference is thereby changed.
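Because the detection reference can only be placed on the discretely arranged element rows, changing it amounts to selecting the row plane(s) nearest the desired position. A minimal sketch of that selection follows; the function name, the row heights, and the arrival position are illustrative assumptions, not values from the embodiment.

```python
def select_plane(element_z, target_z):
    """Return the element-row height (candidate detection reference
    plane) closest to the desired position target_z."""
    return min(element_z, key=lambda z: abs(z - target_z))

# Six element rows, as in the six-stage detector example (heights in mm).
rows = [0, 10, 20, 30, 40, 50]

# Arrival position determined by calibration at z = 23 mm: move the
# detection reference to the nearest selectable plane.
arrival = 23
nearest = select_plane(rows, arrival)

# A detection reference with width d2: one plane above and one below
# the arrival position (upper surface 401 / lower surface 402).
upper = select_plane([z for z in rows if z > arrival], arrival)
lower = select_plane([z for z in rows if z < arrival], arrival)
print(nearest, upper, lower)
```

The same selection can of course be restricted to planes strictly above or below the arrival position, as in the upper-surface/lower-surface example above.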
- although the display 111 and the operation detector 113 of the display device 100 differ in configuration from the display 11 and the operation detector 13 of the display device 1, the detection reference 40 can be set by a similar procedure. Therefore, the first and second calibration processes by the display device 100 according to the fourth embodiment can be performed by the same procedures as those shown in the flowcharts of FIGS. 6 and 8, respectively.
- the various modifications described above can also be applied to the display device 100 according to the fourth embodiment.
- the display device 100 may include an actuator and an encoder, and the light projecting element 116 and the light receiving element 117 may be moved by a minute distance in the Z direction.
- for example, when the detection reference 42 is changed from the arrival position 50 to the position at the distance d1, the light projecting element 116 and the light receiving element 117 closest to the position at the distance d1 from the arrival position 50 are selected. Based on the difference between the position where the selected light projecting element 116 and light receiving element 117 are arranged and the position at the distance d1 from the arrival position 50, the light projecting element 116 and the light receiving element 117 are moved by the actuator, and their positions are finely adjusted. That is, by finely adjusting the positions of the light projecting element 116 and the light receiving element 117, the detection reference 42 can be set closer to the position at the distance d1 from the arrival position 50.
- the fourth embodiment can be modified as follows.
- the operation detector 113 has been described as having a light projecting element 116 and a light receiving element 117 arranged in a two-dimensional manner in a plurality of stages in the Z direction.
- the light projecting elements 116 and the light receiving elements 117 arranged two-dimensionally may be only one stage.
- a display device 100 having such a configuration equipped with the operation detector 113 ′ is shown in FIG.
- the operation detector 113′ includes a frame-shaped housing 115′; of the four surfaces constituting the inner surface of the frame-shaped housing 115′, a plurality of light projecting elements 116 are arranged in a row parallel to the XY plane on two adjacent surfaces, and a plurality of light receiving elements 117 are arranged in a row parallel to the XY plane on the remaining two adjacent surfaces. That is, the operation detector 113′ is configured by only one stage of the six-stage operation detector 113 described with reference to FIG.
- An actuator 119 is connected to the housing 115′ and reciprocates the housing 115′ in the Z direction at a predetermined cycle (for example, 10 cycles per second).
- the position of the housing 115 ′ is detected by a sensor that detects a position incorporated in the actuator 119, for example, an encoder (not shown). In this case, a predetermined position included in a range in which the housing 115 ′ can reciprocate is set as the detection reference 40.
- the user pushes the fingertip toward the aerial image 30.
- when the fingertip enters the moving range of the housing 115′, the light emitted from the light projecting element 116 is blocked by the user's finger and does not reach the light receiving element 117. The position of the housing 115′ at the moment the light is blocked is detected by the encoder, whereby the position of the user's fingertip is detected.
- in order to operate the display position of the icon 30A included in the aerial image 30, the user pushes the fingertip further toward the aerial image 30; when the user feels that the fingertip has reached the icon 30A and that an operation is being performed on the display position of the icon 30A, the user stops the fingertip.
- the control unit 20 determines that the user's fingertip is stationary. At this time, the position where the user's fingertip is stationary is set as the arrival position, and the position of the detection reference 40 is set based on the determined arrival position.
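A rough sketch of this arrival-position determination with the reciprocating single-stage detector: each reciprocation cycle yields one encoder reading (the housing height at which the beam is first blocked by the finger), and the fingertip is judged stationary once consecutive readings stay within a small tolerance. The function name, tolerance, and hold count are illustrative assumptions.

```python
def arrival_position(blocked_z, tol=2.0, hold=3):
    """blocked_z: per-cycle encoder readings (mm) at which the beam
    was blocked.  Return the arrival position: the average of the
    first `hold` consecutive readings whose spread is within `tol`,
    or None if the fingertip never becomes stationary."""
    for i in range(len(blocked_z) - hold + 1):
        window = blocked_z[i:i + hold]
        if max(window) - min(window) <= tol:
            return sum(window) / hold
    return None

# Fingertip descends, then stops: hypothetical per-cycle readings.
samples = [90, 70, 52, 41, 40, 41]
print(arrival_position(samples))
```

The detection reference 40 would then be set based on the returned arrival position, as in the embodiments above.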
- the flowchart showing each process is substantially the same as that shown in FIG.
- in the above description, the display device including at least the control unit 20, the display 111, and the operation detector 113 has been described as an example. However, it may be a control device including only the control unit 20, or a detection device including the control unit 20 and the operation detector 113.
- the control unit 20 may include at least the calibration unit 203 and the detection reference control unit 204.
- a configuration may be appropriately added as necessary from the above-described configuration.
- the position of the user's fingertip is detected using an operation detector using a light projecting element and a light receiving element.
- an imaging unit can be provided as the operation detection unit.
- the display device may be equipped with a camera as an imaging unit, the movement of the user's fingertip may be detected by this camera, and the calibration process of the detection reference 40 may be performed based on the information.
- FIG. 34 shows a display device 100 ′ having a display 111 and an imaging optical system 112 similar to the display device 100 described in the fourth embodiment.
- the display device 100 ′ is different from the display device 100 in that the operation detector 113 is not provided, but an imaging device (for example, a digital camera) 118 is provided instead.
- the position of the user's finger is grasped by the imaging device 118.
- the lens attached to the imaging device 118 is preferably a wide-angle lens to cover the entire aerial image 30, and may be a fish-eye lens. Further, a plurality of (for example, two) imaging devices may be provided, and the position of the user's fingertip may be further detected from the captured images.
- in the above description, control was performed to change the positional relationship between the detection reference 40 and the aerial image 30 (or the icon 300A or the like) by moving the position of the detection reference 40. However, in order to change the relative positional relationship between the detection reference 40 and the aerial image 30, the aerial image 30 may be moved, or both the detection reference 40 and the aerial image 30 may be moved.
- in the above description, the detection reference is controlled or changed by the calibration process to change the positional relationship between the detection reference and the display position of the aerial image. Next, a fifth embodiment, in which the display position of the aerial image is changed by the calibration process to change the positional relationship between the detection reference and the display position of the aerial image, will be described.
- FIGS. 35 and 36 show a display device according to a fifth embodiment. Similar to the display device of the first embodiment, the display device 1 of the fifth embodiment includes, as shown in FIG. 35, a main body 10 incorporating a control unit 20, a display 11, an imaging optical system 12, and an operation detector 13, and, as shown in FIG. 36, further includes a display position changing unit 500 and a display position control unit 220 in addition to the above-described configuration.
- the display position changing unit 500 includes a driving unit such as a motor or an actuator, and moves the imaging optical system 12 in the optical axis direction of the imaging optical system 12 as indicated by an arrow.
- the display position of the formed aerial image 30 is moved and changed in the Z-axis direction, that is, the optical axis direction.
- in order to move the aerial image 30 upward, that is, in the direction away from the display 11, the imaging optical system 12 is moved downward, that is, in the direction approaching the display 11; in order to move the aerial image 30 downward, that is, in the direction approaching the display 11, the imaging optical system 12 is moved upward, that is, in the direction away from the display 11.
- the display position changing unit 500 may move and change the display position of the aerial image 30 by moving the display 11 in the optical axis direction of the imaging optical system 12 instead of moving the imaging optical system 12.
- the display 11, the imaging optical system 12, and the operation detector 13 have the same configurations as the display 11, the imaging optical system 12, and the operation detector 13 of the first embodiment shown in FIG.
- in the present embodiment, either the imaging optical system 12 or the display 11 can thus be moved in the optical axis direction of the imaging optical system 12 as described above. In the following description, a drive unit such as a motor or an actuator is provided, the imaging optical system 12 is moved in the optical axis direction of the imaging optical system 12 as indicated by the arrow, and the display position of the aerial image 30 formed by the imaging optical system 12 is thereby changed in the Z-axis direction, that is, the optical axis direction.
- without being limited thereto, the display position control unit 220 may control the display 11 so as to display, as display images, an image for viewing with the right eye and an image for viewing with the left eye having parallax with respect to the right-eye image, thereby changing the depth display position of the aerial image 30.
- the image generation unit 201, the display control unit 202, the calibration unit 203, the detection reference control unit 204, and the storage unit 205 have the same functions as the image generation unit 201, the display control unit 202, the calibration unit 203, the detection reference control unit 204, and the storage unit 205 in the first embodiment shown in FIG.
- the control unit 20 includes the display position control unit 220 as described above; the display position control unit 220 calculates or determines the movement amount of the aerial image 30 based on the finger arrival position or designated position detected or determined in the calibration processing mode, and the display position changing unit 500 changes the display position of the aerial image 30 accordingly.
- the aerial image operation mode is the same as that of the display device of the first embodiment.
- the aerial image 30 for the aerial image operation mode shown in FIG. is displayed by the imaging optical system 12, and the detection reference control unit 204 sets the detection reference 40 at a predetermined initial position.
- the operation detector 13 detects the downward movement of the finger. Based on the detection output of the operation detector 13, the detection reference control unit 204 determines that the finger has reached the position of the detection reference 40, and a function corresponding to the operated icon 30A is executed by this determination. For example, the display content of the aerial image 30 is switched.
- the display control unit 202, the display 11, and the imaging optical system 12 form the aerial image 300 for calibration processing shown in FIG.
- the detection reference control unit 204 sets the detection reference 40 at an initial position in the vicinity of the aerial image 300.
- the operation detector 13 detects the downward movement.
- the detection reference control unit 204 determines the reaching position 50 of the finger based on the detection output of the operation detector 13. For this determination, the method described in the first embodiment or the method described in the first and second modifications of the first embodiment is used.
- the display position control unit 220 causes the display position changing unit 500 to move the position of the aerial image 300 in the optical axis direction of the imaging optical system 12 based on the finger arrival position 50.
- the display control unit 202, the display unit 11, and the imaging optical system 12 may form the aerial image 300 for calibration processing shown in FIGS. 12, 15, and 18, and in this case,
- the detection reference control unit 204 determines the designated position 50A by the finger based on the detection output of the operation detector 13.
- the display position control unit 220 causes the display position changing unit 500 to move the position of the aerial image 300 in the optical axis direction of the imaging optical system 12 based on the designated position 50A.
- the first calibration process is ended by the movement of the display position of the aerial image 300.
- the aerial image 30 for the aerial image operation mode is displayed at the moved display position.
- the display position of the aerial image 300 is moved by the display position control unit 220 and the display position changing unit 500 as follows. That is, when the finger arrival position 50 or the designated position 50A is located above the detection reference 40 as shown in FIG., the display position control unit 220 and the display position changing unit 500 calculate the distance ΔH between the arrival position 50 or the designated position 50A and the detection reference 40, and move the display position of the aerial image 300 downward, for example by the distance ΔH, to the display position 300 indicated by the dotted line. Since the aerial image 300 is displayed in space, its visibility is poor, and the display position of the aerial image 300 may look different depending on the user. In the example shown in FIG., the user feels that the aerial image 300 is at a position higher than its actual display position, so the arrival position 50 or the designated position 50A by the user is above the display position of the aerial image 300. Therefore, in order to detect the user's operation to the display position of the aerial image 300, the display position of the aerial image 300 is moved downward by the interval ΔH. The user then operates the display position of the aerial image 300 moved downward, and the arrival position 50 or the designated position 50A can be expected to move further downward; as a result, the arrival position 50 or the designated position 50A reaches the detection reference, and the operation by the user to the display position of the aerial image 300 can be detected by the detection reference 40.
- conversely, when the finger arrival position 50 or the designated position 50A is located below the detection reference 40 as shown in FIG., the display position control unit 220 and the display position changing unit 500 calculate the distance ΔH between the arrival position 50 or the designated position 50A and the detection reference 40, and move the display position of the aerial image 300 upward, for example by the distance ΔH, to the display position 300 indicated by the dotted line.
- the display position control unit 220 and the display position changing unit 500 do not move the display position of the aerial image 300 when the finger arrival position 50 or the designated position 50A coincides with the detection reference 40 or exists near the detection reference 40.
- as described above, the display position control unit 220 and the display position changing unit 500 move the display position of the aerial image 300 downward when the arrival position 50 or the designated position 50A is above the detection reference 40, and move it upward when the arrival position 50 or the designated position 50A is below the detection reference 40. The amount of movement need not coincide with the interval ΔH between the arrival position 50 or the designated position 50A and the detection reference 40, and may be larger or smaller than the interval ΔH, as described in the first embodiment.
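The movement rule just described (move the image down when the finger stops above the detection reference, up when it stops below, and not at all near it) can be condensed into a small sketch. The function, the dead zone, and the use of exactly ΔH as the movement amount are illustrative assumptions; as noted above, the amount may also be larger or smaller than ΔH.

```python
def image_move(arrival_z, reference_z, dead_zone=1.0):
    """Movement to apply to the aerial-image display position
    (positive = upward) so that subsequent operations reach the
    detection reference.

    arrival above the reference -> move the image down by dH;
    arrival below the reference -> move it up by dH;
    within +/- dead_zone of the reference -> no movement."""
    dH = arrival_z - reference_z
    if abs(dH) <= dead_zone:
        return 0.0
    return -dH  # move opposite to the offset

print(image_move(arrival_z=55, reference_z=50))    # finger stopped high
print(image_move(arrival_z=46, reference_z=50))    # finger stopped low
print(image_move(arrival_z=50.5, reference_z=50))  # near enough: no move
```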
- the movement amount of the aerial image 300 with respect to the distance ΔH between the arrival position or the designated position and the detection reference may also be determined by statistical processing: a large number of users execute the calibration process in advance, the movement amount of the aerial image 300 that causes the least sense of incongruity in operating the aerial image when its display position is moved in various ways is determined, and statistical processing is performed on these movement amounts to obtain the movement amount of the aerial image 300 for the distance ΔH. The aerial image movement amount obtained by such statistical processing may be, for example, a value common to all users, a different value for each age group of users, or a different value for each gender.
- the method of determining the aerial image movement amount by this statistical processing can also be applied to determining the movement amount of the detection reference when the detection reference is changed based on the arrival position or the designated position in the first embodiment.
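One possible shape for this statistical processing is sketched below, assuming trial records (age group, gender, measured ΔH, preferred movement) collected from many users in advance; normalizing each preferred movement by its ΔH and averaging per group yields a per-group scale factor that the calibration can multiply by the measured ΔH. All names, data, and the normalization scheme are assumptions.

```python
from statistics import mean

def movement_table(trials):
    """trials: iterable of (age_group, gender, dH, preferred_move).
    Returns the mean preferred aerial-image movement per unit dH for
    each (age_group, gender) pair."""
    groups = {}
    for age, gender, dH, move in trials:
        groups.setdefault((age, gender), []).append(move / dH)
    return {k: mean(v) for k, v in groups.items()}

trials = [
    ("20s", "F", 10.0, 9.0),   # hypothetical pre-collected records
    ("20s", "F", 5.0, 4.0),
    ("60s", "M", 10.0, 12.0),
]
table = movement_table(trials)
print(table)
```

A value common to all users would simply average over every record instead of grouping.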
- the calibration process described above will be described with reference to the flowchart shown in FIG. 38 taking the first calibration process mode as an example.
- the flowchart of FIG. 38 shows steps S301 to S308, and the subsequent steps are omitted.
- the processing after step S308 is the same as the processing after step S9 in FIG.
- in step S306, the display position control unit 220 changes the display position of the aerial image 300. That is, the display position control unit 220 causes the display position changing unit 500 to move the display position of the aerial image 300 in the optical axis direction of the imaging optical system 12 based on the finger arrival position 50, and the process proceeds to step S307.
- step S307 the first calibration processing mode is terminated, and the process proceeds to step S308.
- in step S308, the aerial image operation mode is started. If the aerial image 300 shown in FIGS. 12, 15, and 18 is displayed in step S303, the designated position 50A may be determined in step S305, and the display position of the aerial image 300 may be changed based on the designated position 50A in step S306.
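The S301–S308 flow can be condensed into a small control sketch. The classes and method names below are hypothetical stand-ins for the operation detector and the display position changing unit 500; the point is only the order of the steps and that, in this embodiment, it is the image rather than the detection reference that moves.

```python
class Detector:
    """Hypothetical stand-in for the operation detector."""
    def __init__(self, reference, arrival):
        self.reference, self.arrival = reference, arrival
    def initial_reference(self):
        return self.reference      # initial detection reference position
    def wait_for_arrival(self):
        return self.arrival        # S304-S305: arrival/designated position

class Changer:
    """Hypothetical stand-in for the display position changing unit 500."""
    def __init__(self):
        self.moved = 0.0
    def move_image(self, delta):
        self.moved += delta

def first_calibration(detector, changer):
    ref = detector.initial_reference()
    arrival = detector.wait_for_arrival()
    changer.move_image(ref - arrival)  # S306: move the image, not the reference
    return changer.moved               # S307-S308: end, start operation mode

print(first_calibration(Detector(reference=50, arrival=55), Changer()))
```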
- the first calibration process mode has been described as an example of the calibration process, but the present invention can also be applied to the second calibration process mode.
- as described above, the display device according to the fifth embodiment changes the display position of the aerial image based on the arrival position or the designated position in the calibration process, thereby changing the positional relationship between the display position of the aerial image and the detection reference; a positional relationship between the aerial image display position and the detection reference suited to the user's operation characteristics can thus be obtained. Moving the aerial image in the calibration process is also useful when changing the detection reference would cause the changed detection reference to go outside the detection range 13A of the operation detector 13 shown in FIG., or to fall near the upper or lower limit of the detection range 13A; such a situation can be avoided by moving the aerial image instead of changing the detection reference.
- in the fifth embodiment, the display position of the aerial image is changed based on the arrival position or the designated position in the calibration process. In Modification 1 of the display device of the fifth embodiment, in the calibration process, the display position control unit 220 and the display position changing unit 500 change the display position of the aerial image based on the arrival position or the designated position, and the detection reference control unit 204 also changes the position of the detection reference. For example, the display position changing unit 500 coarsely adjusts the display position of the aerial image and the detection reference control unit 204 finely adjusts the detection reference, whereby the positional relationship between the display position of the aerial image and the detection reference can be appropriately set.
- Modification 2 of the fifth embodiment: Modification 2 of the display device according to the fifth embodiment will be described below.
- when the display position control unit 220 and the display position changing unit 500 move the display position of the aerial image, the display device of Modification 2 fades out the display of the aerial image between the start of the movement and the end of the movement, and then fades it in. That is, the display luminance is gradually decreased as the aerial image starts moving, and then gradually increased toward the end of the movement of the aerial image.
- since the aerial image is moved as part of the calibration process, the user may feel uncomfortable when the aerial image visible to the user moves. Therefore, by gradually reducing the display luminance as the aerial image starts moving, the movement of the aerial image becomes difficult for the user to visually recognize, and the user's sense of discomfort can be reduced.
- while the aerial image is moving, the display control unit 202 can reduce the display luminance and contrast of the aerial image, blink the display of the aerial image with the luminance and contrast lowered, or even turn off the display of the aerial image.
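The fade-out/fade-in described above amounts to a luminance ramp tied to the progress of the movement. A minimal sketch follows; the step count and the linear ramp shape are assumptions.

```python
def fade_profile(steps=5):
    """Luminance multipliers applied over the course of the movement:
    ramp down to zero toward the midpoint (fade out), then ramp back
    up to full luminance by the end of the movement (fade in)."""
    down = [1 - i / steps for i in range(steps + 1)]  # 1.0 -> 0.0
    up = down[::-1][1:]                               # back to 1.0
    return down + up

profile = fade_profile(4)
print(profile)
```

Applying each multiplier to the display luminance at equal intervals during the movement reproduces the fade-out/fade-in behavior of Modification 2.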
- a display mode that makes the aerial image stand out as the aerial image moves may be performed.
- the display mode that makes the aerial image stand out includes raising the display brightness and contrast of the aerial image and blinking the aerial image display while the aerial image is moving.
- by making the aerial image itself stand out, the movement of the aerial image becomes inconspicuous, and the user pays attention to the aerial image itself rather than to its movement; the user therefore does not mind the movement of the aerial image, and the user's sense of discomfort can be reduced.
- the change of the display mode of the aerial image when the aerial image is moved is performed during the process of step S306 in the flowchart of FIG.
- the processing that makes the display of the aerial image inconspicuous or conspicuous during its movement need not be performed on the entire aerial image; it may be performed on a part of the aerial image, for example on the icon for the calibration process. In addition, whether the movement of the aerial image is made conspicuous may be selected according to the user's preference.
- the movement of the aerial image by the display position changing unit 500 may be conspicuous so that the user can confirm the movement of the aerial image during the execution of the calibration process.
- the display control unit 202 may increase the display brightness and contrast of the aerial image or blink the display of the aerial image while the aerial image is moving.
- in the above, the movement of the aerial image is made inconspicuous; conversely, by making the movement of the aerial image conspicuous to the user, the position after the movement of the aerial image can be made clear.
- the change in the display brightness of the aerial image during the movement of the aerial image is performed during the process of step S306 in the flowchart of FIG.
- Modification 3 of the fifth embodiment: Modification 3 of the display device 1 according to the fifth embodiment will be described below.
- the display device 1 of Modification 3 starts changing the display position of the aerial image by a user operation during the calibration process.
- the display position control unit 220 controls the display position changing unit 500 to start changing the display position of the aerial image.
- the calibration process described above will be described with reference to the flowchart shown in FIG. 39, taking the first calibration process mode as an example.
- the flowchart of FIG. 39 shows steps S311 to S319, and the subsequent steps are omitted.
- the processing after step S319 is the same as the processing after step S9 in FIG.
- steps S311 to S315 are the same as the processes in steps S301 to S305 in the flowchart shown in FIG.
- step S316 it is determined whether or not the operation by the user is finished. If an affirmative determination is made in step S316, that is, if it is determined that the user's operation has ended, the process proceeds to step S317. If a negative determination is made in step S316, the process returns to step S314.
- the processes in steps S317 to S319 are the same as the processes in steps S306 to S308 in the flowchart shown in FIG.
- the determination in step S316 of whether the user's operation is completed determines whether to change the display position of the aerial image. The end of the operation by the user may therefore be determined by the determination of the arrival position or the designated position; it may be determined by detecting, after the determination of the arrival position or the designated position, a specific gesture for changing the display position (for example, a gesture of changing the shape of the user's hand from an open hand to a fist); or a button for changing the display position may be displayed in the aerial image after the determination of the arrival position or the designated position, and the operation by the user may be determined to be completed when it is detected that the user has pressed the button. Note that although the first calibration processing mode has been described as an example of the calibration process, the present invention can also be applied to the second calibration processing mode.
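The three alternative triggers for the S316 decision can be written as one small dispatch; the event dictionary and its keys are hypothetical names for whatever signals the device actually produces.

```python
def operation_finished(event):
    """Hypothetical dispatch for the S316 decision: any of the three
    triggers described above ends the user's operation and starts the
    change of the display position."""
    return (
        event.get("arrival_determined", False)      # position decided
        or event.get("gesture") == "open_to_fist"   # specific gesture
        or event.get("button_pressed", False)       # on-image button
    )

print(operation_finished({"gesture": "open_to_fist"}))
print(operation_finished({}))
```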
- FIG. 40 is a block diagram showing the display 11 and the operation detector 13 controlled by the control unit 20 in the configuration of the display device 1 according to the fourth modification.
- this display device 1 includes the sound collector 14 as in the sixth modification of the first embodiment shown in FIG. 17, and the control unit 20 is provided with a sound detection unit 208.
- the display device 1 determines the arrival position 50 in the same manner as in the first embodiment. For example, as shown in FIG. 41, the display control unit 202 superimposes and displays the message "Calibration is performed. Please say 'high'." on the calibration icon 300F of the aerial image 300. When the user speaks "high" in accordance with the message of the calibration icon 300F, the sound collector 14 collects this sound and outputs it to the sound detection unit 208 as sound data. When the sound detection unit 208 determines that the sound data corresponds to "high", the display position control unit 220 controls the display position changing unit 500 to change the display position of the aerial image 300.
- the change in the display position of the aerial image described above is performed by determining whether the user has uttered "high", in place of the determination of whether the operation by the user is completed in step S316 of the flowchart of Modification 3 of the fifth embodiment shown in FIG. 39. If it is determined that "high" has been uttered, an affirmative determination is made in step S316 and the process proceeds to step S317.
- the first calibration process mode has been described as an example of the calibration process, but the present invention can also be applied to the second calibration process mode.
- the display device 1 need not include the sound collector 14; sound data acquired by an external sound collector may be input via a wireless or wired connection, and the sound detection unit 208 may perform voice detection using the sound data input from the external sound collector.
- in Modification 4, the aerial image is not moved until the user utters "high", even though the detection reference control unit 204 has determined the arrival position or the designated position of the finger; since the aerial image is moved after the utterance of "high" is detected, the operation on the display position of the aerial image may be repeated a plurality of times before the user utters "high". In this case, when the user utters "high", the aerial image may be moved based on, for example, an average value such as the arithmetic mean or geometric mean of the plurality of arrival positions or designated positions, based on the median of the plurality of arrival positions 50, or based on the last arrival position or designated position among the plurality of arrival positions or designated positions.
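The three ways of combining the accumulated positions (average, median, last) can be sketched directly with the standard statistics functions; the function name and mode strings are assumptions.

```python
from statistics import mean, median

def aggregate(positions, mode="mean"):
    """Combine the arrival/designated positions accumulated before the
    user utters "high" into the single position used to move the image."""
    if mode == "mean":
        return mean(positions)
    if mode == "median":
        return median(positions)
    if mode == "last":
        return positions[-1]
    raise ValueError(mode)

p = [48.0, 50.0, 55.0]  # hypothetical accumulated positions (mm)
print(aggregate(p, "mean"), aggregate(p, "median"), aggregate(p, "last"))
```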
- the display device 1 of Modification 5 stops the movement of the aerial image when the user is viewing the aerial image, and moves the aerial image when the user turns away from the aerial image.
- the display device 1 includes an imaging device such as a camera, as in Modification 8 of the first embodiment; the imaging device captures the user who is executing the calibration process, and the control unit 20 analyzes the captured image data. Based on the analysis result, the orientation of the user's face or the orientation of the user's body is determined to judge whether the user is viewing the aerial image, and the display position control unit 220 and the display position changing unit 500 move the aerial image when the user is not viewing it.
- since the aerial image is moved as part of the calibration process, the user may feel uncomfortable when the aerial image visible to the user moves. Therefore, the aerial image is moved when the user turns away from it, so that the movement of the aerial image is not visually recognized by the user and the user's sense of discomfort can be reduced.
- the display device 1 includes a line-of-sight detector that detects a user's line of sight instead of or in addition to the image pickup apparatus. Based on the detection output of the line-of-sight detector, the display position control unit 220 and the display position change unit 500 moves the aerial image when the user is not visually recognizing the aerial image and the user's line of sight is not facing the aerial image. For example, the change in the display position of the aerial image as described above is performed instead of determining whether or not the operation by the user in step S316 in the flowchart of the third modification of the fifth embodiment illustrated in FIG. It is determined whether or not the user is viewing the aerial image.
- When it is determined that the user is viewing the aerial image, an affirmative determination is made in step S316 and the process proceeds to step S317.
- the first calibration process mode has been described as an example of the calibration process, but the present invention can also be applied to the second calibration process mode.
- the above-described line-of-sight detector or imaging device may not be provided in the display device 1.
- the line-of-sight detector may be installed outside the display device 1 and may transmit the line-of-sight detection result to the display device 1 via wireless communication or a cable.
- the imaging device may be installed outside the display device 1 and may transmit imaging data to the display device 1 via wireless communication or a cable.
- In the above description, control is performed to change the display position of the aerial image when it is determined that the user is not visually recognizing it. However, the display position control unit 220 and the display position changing unit 500 may instead perform control to change the display position of the aerial image while the user is viewing it.
- Alternatively, the user's biological information may be acquired, and control to change the display position of the aerial image may be performed based on its value. For example, the user's pulse rate is acquired as the biological information.
- In this case, a device for acquiring the pulse rate is attached to the user before the user operates the display device. The display position control unit 220 and the display position changing unit 500 may then perform control to change the display position of the aerial image when the user's pulse rate rises. When the pulse rate rises, the user may be unable to operate the device well and may be frustrated; in such a case, changing the display position of the aerial image allows the user to use the device comfortably.
- While the user is visually recognizing the aerial image, the detection reference control unit determines the arrival position or designated position of the finger but does not move the aerial image. If the aerial image is moved upon detecting that the user is no longer viewing it, the user may have repeated the operation on the display position of the aerial image a plurality of times before looking away. In such a case, when the user stops viewing the aerial image, the aerial image is moved based on, for example, an average value such as the arithmetic mean or geometric mean of the plurality of arrival positions 50 or designated positions, based on their median, or based on the last arrival position or designated position.
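The reduction of several recorded positions to a single representative value can be sketched as below. The function and its parameter names are illustrative, not from the patent; the four methods mirror the alternatives named in the text (arithmetic mean, geometric mean, median, last recorded position).

```python
import statistics
from math import prod

def representative_position(positions, method="mean"):
    """Reduce several recorded arrival/designated positions (e.g. heights
    along the Z axis) to the single value used to move the aerial image."""
    if method == "mean":            # arithmetic mean
        return statistics.fmean(positions)
    if method == "geometric":       # geometric mean
        return prod(positions) ** (1.0 / len(positions))
    if method == "median":          # median of the recorded positions
        return statistics.median(positions)
    if method == "last":            # last arrival/designated position
        return positions[-1]
    raise ValueError(method)
```

Once the user looks away, the display position changing unit would move the aerial image by the offset between this representative position and the detection reference.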
- Modification 6 of the display device according to the fifth embodiment will be described below.
- The display device 1 according to Modification 6 can change the moving speed of the aerial image in the calibration process.
- the display device 1 can move the aerial image at a very high speed or move the aerial image at a low speed.
- The display position control unit 220 and the display position changing unit 500 move the aerial image either at an extremely high speed equal to or higher than a first predetermined value, or at a low speed equal to or lower than a second predetermined value that is lower than the first predetermined value.
- By moving the aerial image at an extremely high speed or at a low speed, it becomes difficult for the user to visually recognize the movement of the aerial image.
- the user may be able to select whether the aerial image is moved at a very high speed or at a low speed by a selection switch or the like.
- When the display position control unit 220 and the display position changing unit 500 perform such control, it becomes difficult for the user to visually recognize the movement of the display position of the aerial image, and the user's discomfort can therefore be reduced.
- When the distance over which the display position of the aerial image is changed is large, the change may be conspicuous to the user; in such a case, the moving speed may be changed according to the distance.
- For example, when the display position of the aerial image is moved by more than a predetermined distance, the first predetermined value may be made higher and the second predetermined value made lower than when it is moved by the predetermined distance or less.
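The speed selection above can be sketched as follows. All numeric values (threshold, speeds, scaling factors) are illustrative assumptions; the patent only specifies the relationships between them.

```python
def movement_speed(distance, mode, threshold=5.0, fast=100.0, slow=1.0):
    """Pick the aerial-image moving speed (units are illustrative).
    mode 'fast' corresponds to a speed >= the first predetermined value,
    mode 'slow' to a speed <= the second predetermined value. For moves
    longer than `threshold`, the fast speed is raised and the slow speed
    lowered, as the text suggests, so a larger change stays hard to see."""
    if distance > threshold:
        fast, slow = fast * 2, slow / 2
    return fast if mode == "fast" else slow
```

The user's selection switch would map directly onto the `mode` argument.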
- In the above, the display device 1 including at least the control unit 20, the display 11, and the operation detector 13 has been described as an example; however, a control device including only the control unit 20, or a control device configured by the control unit 20 and the display 11, may also be used.
- the control unit 20 may include at least the calibration unit 203 and the display position control unit 220.
- Further configurations may be added as necessary from the configurations described above.
- A feature of this embodiment, shown in FIG. 42, is that the first detection reference 40a and the second detection reference 40b are initially set on either side of the aerial image 30. In FIG. 42, the aerial image 30 is positioned midway between the first and second detection references 40a and 40b; that is, the distance between the aerial image 30 and the first detection reference 40a is set equal to the distance between the aerial image 30 and the second detection reference 40b.
- the distance between the aerial image 30 and the first detection reference 40a is not necessarily set equal to the distance between the aerial image 30 and the second detection reference 40b.
- An aerial image 30 is displayed with an icon 30A.
- the operation detector 13 detects the downward movement of the finger F.
- The detection reference control unit 204 determines that the finger F has reached the first detection reference 40a based on the detection output of the operation detector 13, and in response to this determination the display control unit 202 changes the display mode of the icon 30A.
- the change in the display mode may be a highlight display such as an increase in display brightness or a blinking display, or the display color may be changed. By such a change in the display mode of the icon 30A, the user can confirm that the finger has selected the icon 30A.
- Next, the detection reference control unit 204 determines that the finger F has reached the second detection reference 40b based on the detection output of the operation detector 13, and in response to this determination the display control unit 202 switches the display content of the aerial image 30. That is, the second detection reference 40b performs the same function as the detection reference 40 described in the fifth embodiment. Although it has been described that the display control unit 202 switches the display content of the aerial image 30 when the finger F reaches the second detection reference 40b, the present invention is not limited to this.
- For example, when it is determined that the finger F has reached the second detection reference 40b, the display control unit 202 may display a moving image as the aerial image 30 and perform control to reproduce it. Alternatively, upon this determination, the display control unit 202 may perform scroll control of the aerial image 30.
- The display position control unit 220 moves the display position of the aerial image 30 downward by the distance ΔH, that is, to the position 30 indicated by the dotted line.
- As shown in FIG. (b), the detection reference control unit 204 moves the second detection reference 40b downward to the position 40b indicated by the dotted line, so that the distance between the aerial image 30 and the second detection reference 40b becomes equal to the distance between the aerial image 30 and the first detection reference 40a.
- the downward movement of the aerial image 30 shortens the distance from the position of the aerial image 30 to the position of the second detection reference 40b.
- If the finger F reached the second detection reference 40b immediately after the user felt that the aerial image 30 had been touched, the operation might be difficult.
- the user can easily operate by setting the distance between the aerial image 30 and the second detection reference 40b to an appropriate distance. Further, if the distance between the aerial image 30 and the second detection reference 40b is equal to the distance between the aerial image 30 and the first detection reference 40a, based on the distance between the aerial image 30 and the first detection reference 40a, The user can easily grasp the distance between the aerial image 30 and the second detection reference 40b.
- The display position control unit 220 and the display position changing unit 500 move the display position of the aerial image 30 upward by a distance ΔH as shown in FIG.
- the detection reference control unit 204 moves the second detection reference 40b upward to a position 40b indicated by a dotted line as shown in FIG. This makes the distance between the aerial image 30 and the second detection reference 40b equal to the distance between the aerial image 30 and the first detection reference 40a.
- the upward movement of the aerial image 30 increases the distance from the position of the aerial image 30 to the position of the second detection reference 40b.
- In this way, the user can easily operate because the distance between the aerial image 30 and the second detection reference 40b is set to an appropriate distance. Further, since the distance between the aerial image 30 and the second detection reference 40b is equal to the distance between the aerial image 30 and the first detection reference 40a, the user can easily grasp the former based on the latter.
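The position bookkeeping of this embodiment, moving the aerial image and then re-centering the second detection reference so the two image-to-reference distances stay equal, can be sketched as below. The function name and coordinate convention (Z coordinates increase upward, first reference above the image, second below) are assumptions for illustration.

```python
def recalibrate(image_z, ref1_z, ref2_z, delta_h):
    """Move the aerial image by delta_h (negative = downward), then move
    the second detection reference 40b so that its distance from the
    image again equals the image-to-first-reference distance. The first
    detection reference 40a stays fixed."""
    image_z += delta_h
    gap = abs(image_z - ref1_z)   # image <-> first reference distance
    ref2_z = image_z - gap        # second reference sits below the image
    return image_z, ref1_z, ref2_z
```

After every move, `abs(ref1_z - image_z) == abs(image_z - ref2_z)` holds, which is the equal-distance property the text relies on.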
- the description of the second calibration processing mode is as described above, but the same applies to the first calibration processing mode.
- As described above, in the sixth embodiment, the positional relationship between the first detection reference 40a and the aerial image 30 is changed by the movement of the aerial image 30, and as the aerial image 30 moves, the second detection reference 40b is moved so that the distance between the aerial image 30 and the second detection reference 40b becomes substantially equal to the distance between the aerial image 30 and the first detection reference 40a.
- Modification 1 will be described.
- In Modification 1, the positional relationship between the first detection reference 40a and the aerial image 30 is changed by the movement of the aerial image 30, as in the display device of the sixth embodiment, while the positional relationship with the second detection reference 40b is changed by moving the second detection reference 40b based on the arrival position or designated position of the finger with respect to it.
- the display device of the first modification displays the aerial image 30 shown in FIG. 42 and initializes the first and second detection references 40a and 40b.
- the calibration process will be described with reference to FIGS. 45 (a), (b), and (c).
- FIG. 45 (a) is the same as FIG. 39 (a).
- In FIG. 45(a), for example, when the user feels that the finger F has been moved downward toward the icon 30A of the aerial image 30 and that the selection operation on the icon 30A has been performed, that is, that the finger F has reached the icon 30A, the user stops lowering the finger F.
- The detection reference control unit 204 determines the finger arrival position 50 or the designated position 50A based on the detection output of the operation detector 13. Since the arrival position 50 or the designated position 50A is positioned above the first detection reference 40a by the distance ΔH, the display position control unit 220 moves the display position of the aerial image 30 downward by, for example, approximately the distance ΔH, that is, to the position 30 indicated by the dotted line. By this downward movement of the aerial image 30, the positional relationship between the aerial image 30 and the first detection reference 40a is changed, that is, a calibration process is performed.
- Similarly, the detection reference control unit 204 determines the finger arrival position 50 or the designated position 50A with respect to the second detection reference 40b based on the detection output of the operation detector 13. Since the finger arrival position 50 or the designated position 50A is located above the second detection reference 40b by the distance ΔH, the detection reference control unit 204 moves the second detection reference 40b upward by, for example, approximately the distance ΔH, to the position indicated by the dotted line. As a result, the positional relationship between the aerial image 30 and the second detection reference 40b is changed, that is, calibrated.
- Conversely, the display position control unit 220 may move the aerial image 30 upward based on the finger arrival position 50 or the designated position 50A, while the detection reference control unit 204 moves the second detection reference 40b downward based on the finger arrival position 50 or the designated position 50A. As a result, the positional relationship between the aerial image 30 and the second detection reference 40b is changed, that is, calibrated.
- In this case, the positional relationship between the aerial image 30 and the second detection reference 40b is changed by the movement of the second detection reference 40b.
- The aerial image 30 is moved based on the finger arrival position 50 or the designated position 50A with respect to the first detection reference 40a, changing the positional relationship between the aerial image 30 and the first detection reference 40a; the aerial image 30 may then be further moved based on the finger arrival position 50 or the designated position 50A with respect to the second detection reference 40b, changing the positional relationship between the aerial image 30 and the second detection reference 40b.
- In this way, the positional relationship can be changed with simple control.
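The Modification-1 calibration step can be sketched as follows. Names and the upward-positive Z convention are assumptions; the two offsets correspond to the distances ΔH measured against the first and second detection references in the text.

```python
def calibrate_mod1(reach_z, ref1_z, ref2_z, image_z):
    """Modification-1 style calibration sketch. The finger stopped at
    reach_z, a distance dH above the first detection reference 40a: the
    aerial image is lowered by roughly that dH, while the second
    detection reference 40b is raised by its own offset toward the
    arrival position. Z increases upward."""
    d_h1 = reach_z - ref1_z   # finger stopped this far above ref 40a
    image_z -= d_h1           # aerial image follows downward
    d_h2 = reach_z - ref2_z   # offset of the reach above ref 40b
    ref2_z += d_h2            # ref 40b moves up toward the reach
    return image_z, ref2_z
```

The first detection reference itself is left untouched, matching the text: only the image and the second reference move.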
- The display position of the aerial image 30 may also be moved in accordance with the movement of the finger.
- the display position control unit 220 controls the display position of the aerial image 30 so that the display position of the aerial image and the position of the finger that moves downward are included in a predetermined range. By controlling in this way, the display position of the aerial image 30 can be moved downward so as to follow the descending finger. Further, the display position control unit 220 sets the display position of the aerial image 30 so that it is always positioned below the lowering finger, and moves the display position of the aerial image 30 downward according to the lowering finger. Controlling the display position of the aerial image 30 prevents the user's finger from penetrating the aerial image 30.
- When the detection reference control unit 204 determines that the finger has reached the second detection reference 40b, the display control unit 202 displays a reproduced image. In this way, when the finger reaches the aerial image 30, the aerial image 30 moves downward following the finger's downward movement, so that the user feels as if the downward movement of the finger is being guided toward the second detection reference 40b by the aerial image 30, and can reliably reach the second detection reference 40b.
- the display device 1 including at least the control unit 20, the display 11, and the operation detector 13 has been described as an example.
- A control device including only the control unit 20, or a control device configured by the control unit 20 and the display 11, may also be used.
- the control unit 20 may include at least a calibration unit 203, a display position control unit 220, and a detection reference control unit 204.
- In the sixth embodiment, the first modification, or the second modification, a configuration may be appropriately added as necessary from the configurations described above.
- The display device according to the present embodiment has the same configuration as the display device 100 of the fourth embodiment shown in FIGS. 29, 31, and 32, or as the display device 100 of the first modification of the fourth embodiment shown in FIGS. In the display device 100 according to the seventh embodiment as well, the display position of the aerial image can be changed in the same manner as in the display device 1 according to the fifth embodiment and its Modifications 1 to 4 and the sixth embodiment and its Modification 1. As shown in FIG. 46, the display device 100 according to the present embodiment includes a display position changing unit 500 and a display position control unit 220 in addition to the configuration of the display device 100 of the fourth embodiment shown in FIG.
- The detection reference control unit 204 determines the arrival position 50 based on the detection output of the operation detector 13, in the same manner as in the fifth embodiment and its Modifications 1 to 4 and the sixth embodiment and its Modification 1.
- the display position control unit 206 causes the display position changing unit 500 to move the position of the aerial image 300 in the optical axis direction of the imaging optical system 12 based on the finger arrival position 50.
- the display position changing unit 500 moves the aerial image 30 in the Z direction by moving the display device 111 in the X direction.
- The aerial image 30 is moved toward the Z direction + side by moving the display 111 toward the X direction + side, and toward the Z direction − side by moving the display 111 toward the X direction − side.
- the display position changing unit 500 may move the imaging optical system 112 in parallel without moving the display 111, or may move both the imaging optical system 112 and the display 111 together.
- In the above description, the calibration processing was performed so as to adjust the relative positional relationship between the detection reference 40 and the aerial image 30 by moving the position of the detection reference 40 or by moving the display position of the aerial image 30. However, in order to adjust the relative positional relationship between the detection reference 40 and the aerial image 30, both the detection reference 40 and the aerial image 30 may be moved.
- the display device 100 including at least the control unit 20, the display device 111, and the operation detector 113 has been described as an example.
- A control device including only the control unit 20, or a control device including the control unit 20 and the display 111, may also be used.
- the control unit 20 may include at least the calibration unit 203 and the display position control unit 220.
- a configuration may be appropriately added as necessary from the configuration described above.
- the positional relationship between the detection reference and the display position of the aerial image is controlled or changed based on the arrival position or the specified position of the fingertip in the calibration process.
- the display device 1 of the present embodiment has the same configuration as the display device 1 of the first embodiment shown in FIGS.
- In the aerial image operation mode shown in FIGS. 47(a) and 47(b), the display control unit 202, the display 11, and the imaging optical system 12 display the aerial image 30 in the air. As shown in FIG. 47(a), the aerial image 30 includes, for example, two rectangular icons 30D and 30E.
- The detection reference control unit 204 initially sets a rectangular parallelepiped detection reference 42 for each of the two icons 30D and 30E included in the aerial image 30.
- The size of the cross section of the detection reference 42 corresponding to the icon 30D corresponds to the size of the icon 30D, as clearly shown in FIG., and its height in the vertical direction, that is, the Z direction, is D1. That is, in the rectangular parallelepiped detection reference 42, the length W1 of one side of the cross section is set equal to the length W1 of one side of the icon 30D, and the length W2 of the other side of the cross section is set equal to the length W2 of the other side of the icon 30D.
- In the detection reference 42, the upper surface is referred to as the upper reference surface 42a, the lower surface as the lower reference surface 42b, the side surface defined by the lengths W2 and D1 as the side reference surface 42c, and the side surface defined by the lengths W1 and D1 as the side reference surface 42d.
- the outside of the detection reference 42 is referred to as a non-detection reference 41.
- the detection reference 42 is described as a rectangular parallelepiped shape, but the present invention is not limited to this.
- a spherical shape, a cylindrical shape, a prismatic shape, or the like may be used.
- The aerial image 30 is positioned between the upper reference surface 42a and the lower reference surface 42b of the detection reference 42; that is, the distance between the aerial image 30 and the upper reference surface 42a is set equal to the distance between the aerial image 30 and the lower reference surface 42b. Note that the aerial image 30 is not limited to being located between the upper reference surface 42a and the lower reference surface 42b: the distance between the aerial image 30 and the upper reference surface 42a may differ from the distance between the aerial image 30 and the lower reference surface 42b, the aerial image 30 may be positioned above the upper reference surface 42a, or it may be positioned below the lower reference surface 42b.
- the detection reference 42 corresponding to the icon 30E is also a rectangular parallelepiped shape having a predetermined height corresponding to the shape of the icon 30E, similarly to the detection reference 42 corresponding to the icon 30D.
- FIGS. 48A to 48C show examples of predetermined non-contact operations 600A to 600C in this embodiment (referred to as a predetermined non-contact operation 600 when generically referred to).
- predetermined non-contact operations 600A to 600C are schematically shown using arrows as trajectories when the finger F moves.
- the predetermined non-contact operation 600A shown in FIG. 48A is an operation in which the user makes a U-turn after moving the finger F downward by the distance L1, and moves upward by the distance L1.
- That is, the predetermined non-contact operation 600A is a U-turn trajectory in which the descending movement distance and the ascending movement distance are equal. The predetermined non-contact operation 600A may also be a V-shaped trajectory instead of a U-shaped one, or an operation in which, after the finger F moves downward by the distance L1, it moves upward by the distance L1 along the descending trajectory. Furthermore, in the predetermined non-contact operation 600A, the downward movement distance L1 and the upward movement distance need not be equal.
- In short, the predetermined non-contact operation 600A in the present embodiment may be any operation in which an upward movement of the finger follows a downward movement of the finger.
- the predetermined non-contact operation 600B in FIG. 48B is to stop the finger F for a predetermined time after the user moves the finger F downward by the distance L1.
- the predetermined non-contact operation 600C in FIG. 48C is an operation in which the user moves the finger F downward by a distance L1 and then moves the finger F at least a predetermined distance L2 in the lateral direction.
- The predetermined non-contact operation 600 is not limited to those represented by the movement trajectories of the finger F described above; other movement trajectories may be used, as long as the movement trajectory (of the finger F or the hand) can be detected by the operation detector 13.
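A very rough classifier for the three example gestures can be sketched as follows. All thresholds, the sampling format `(z, x, t)`, and the function name are assumptions for illustration; 600A is descend ~L1 then ascend ~L1, 600B is descend ~L1 then hold, 600C is descend ~L1 then move at least L2 laterally.

```python
def classify_gesture(points, l1=3.0, l2=2.0, tol=0.5):
    """Classify a fingertip trajectory, given as (z, x, t) samples with z
    the height, x the lateral position, and t the time in seconds.
    Returns '600A', '600B', '600C', or None. Thresholds are illustrative."""
    z0, x0, t0 = points[0]
    zs = [p[0] for p in points]
    low = min(zs)
    if z0 - low < l1 - tol:
        return None                       # never descended by ~L1
    i = zs.index(low)
    tail = points[i:]                     # samples after the lowest point
    if zs[-1] - low >= l1 - tol:
        return "600A"                     # came back up by ~L1 (U-turn)
    if abs(tail[-1][1] - tail[0][1]) >= l2:
        return "600C"                     # moved sideways by >= L2
    if tail[-1][2] - tail[0][2] >= 1.0:
        return "600B"                     # held still for >= 1 s
    return None
```

A production detector would of course match shapes more robustly, but this captures the three trajectory templates of FIGS. 48(a) to 48(c).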
- The detection reference control unit 204 determines, based on the detection output of the operation detector 13 that detects the movement of the user's finger F, whether the finger F has operated the display position of the icon.
- FIG. 49 illustrates a case where the detection reference control unit 204 determines, from among the predetermined non-contact operations 600 described above, that the predetermined non-contact operation 600A has been performed within the detection reference 42.
- the predetermined non-contact operation 600A1 shows a case where the finger F moves the distance L1 downward from the upper reference plane 42a, continues to make a U-turn and moves the distance L1 upward, and the finger F reaches the upper reference plane 42a.
- the predetermined non-contact operation 600A2 shows a case where the finger F moves a distance L1 downward between the upper reference surface 42a and the lower reference surface 42b, and then continues to make a U-turn and move a distance L1 upward.
- The predetermined non-contact operation 600A3 shows a case where the finger F moves downward by the distance L1, makes a U-turn at the lower reference surface 42b, and moves upward by the distance L1.
- When all of the predetermined non-contact operation 600A, namely the downward movement by the distance L1, the U-turn, and the upward movement by the distance L1, is performed within the detection reference 42, the detection reference control unit 204 determines that the predetermined non-contact operation 600A has been performed within the detection reference 42. That is, the detection reference control unit 204 detects the predetermined non-contact operation 600A with the detection reference 42.
- In other words, the detection reference control unit 204 determines that the predetermined non-contact operation 600 has been performed within the detection reference 42 only when the entire predetermined non-contact operation 600 is performed within the detection reference 42; if even a part of the predetermined non-contact operation 600 is performed outside the detection reference 41, it does not determine that the predetermined non-contact operation 600 has been performed within the detection reference 42.
- For the predetermined non-contact operation 600A to be detectable by the detection reference 42, the width D1 of the detection reference 42, that is, the distance between the upper reference surface 42a and the lower reference surface 42b (the length in the Z direction), must be at least the distance L1, and is set to, for example, about 1.5 to 3 times the distance L1.
- The outside detection reference 41 is the external space outside the detection reference 42; specifically, in FIG. 47(c), it is the outer space other than the space surrounded by the upper reference surface 42a, the lower reference surface 42b, the side reference surface 42c, and the side reference surface 42d of the detection reference 42.
- FIG. 50 shows an example in which the entire predetermined non-contact operation 600A is detected outside the detection reference 41.
- The entire predetermined non-contact operation 600A with the finger F is performed at a position above the upper reference surface 42a of the detection reference 42; in this case, the entire predetermined non-contact operation 600A is detected outside the detection reference 41 by the operation detector 13 and the detection reference control unit 204.
- Alternatively, the entire predetermined non-contact operation 600Aa with the finger F is performed at a position below the lower reference surface 42b of the detection reference 42, or the entire predetermined non-contact operation 600Ab with the finger F is performed at a position outside the side reference surface 42c of the detection reference 42. In these cases, the entirety of the predetermined non-contact operations 600Aa and 600Ab is detected outside the detection reference 41 by the operation detector 13 and the detection reference control unit 204.
- a method of detecting a predetermined non-contact operation 600 outside the detection reference 41 by the operation detector 13 and the detection reference control unit 204 will be described. First, the operation detector 13 sequentially detects the movement of the finger F.
- Next, based on the detection output of the operation detector 13, the detection reference control unit 204 determines whether the movement trajectory of the finger F corresponds to a predetermined non-contact operation 600, and where that movement trajectory lies (within the detection reference 42, outside the detection reference 41, or partly in both). Based on these determination results, the predetermined non-contact operation 600 can be detected outside the detection reference 41.
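The location test can be sketched as an all/any check over the sampled trajectory against the rectangular parallelepiped detection reference. The box representation and function name are assumptions; the "inside only if every sample is inside" rule is exactly the requirement stated above.

```python
def locate_trajectory(points, box):
    """Decide where a detected trajectory lies relative to the
    rectangular parallelepiped detection reference 42. `box` is
    ((xmin, xmax), (ymin, ymax), (zmin, zmax)); points are (x, y, z)
    samples. Returns 'inside' only when EVERY sample is within the box,
    'outside' when none is, and 'partial' otherwise."""
    def inside(p):
        return all(lo <= c <= hi for c, (lo, hi) in zip(p, box))
    flags = [inside(p) for p in points]
    if all(flags):
        return "inside"
    if not any(flags):
        return "outside"
    return "partial"
```

Only an 'inside' result would make the operation effective; 'outside' triggers the ΔH10-based calibration described next, and 'partial' is treated as not performed within the detection reference.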
- FIG. FIG. 51 shows a case where the predetermined non-contact operation 600 ⁇ / b> A is outside the detection reference 41 and is detected at a position above the upper reference surface 42 a of the detection reference 42.
- the following calibration processing is performed using the aerial image 30 for manipulating the aerial image, but the aerial image 300 for calibration processing shown in FIG. 4 and the like may be used.
- In order to operate the display position of the icon 30D of the aerial image 30, the user moves the finger F downward; when the finger F reaches the upper limit 13a of the detection range 13A of the operation detector 13, the operation detector 13 sequentially detects the downward movement of the finger and sequentially stores the detection output associated with the movement of the finger in the storage unit 205.
- Based on the detection output stored in the storage unit 205, the detection reference control unit 204 determines whether the movement trajectory of the finger F corresponds to the predetermined non-contact operation 600A, and whether the entire movement trajectory of the finger F lies within the detection reference 42.
- When the detection reference control unit 204 determines that the predetermined non-contact operation 600A has been performed and that all of it was performed outside the detection reference 41, it calculates, based on the detection output from the operation detector 13 stored in the storage unit 205, the interval ΔH10 between the operation start position of the predetermined non-contact operation 600A and the upper reference surface 42a.
- the interval ⁇ H10 can be calculated from the operation start position of the predetermined non-contact operation 600A and the position of the upper reference surface 42a as described above, but can also be calculated by the following method.
- That is, the lowest position of the predetermined non-contact operation 600A, namely its arrival position, is obtained based on the detection output from the operation detector 13 stored in the storage unit 205; the interval ΔH10 can then be calculated by computing the interval between this arrival position and the position of the upper reference surface 42a and adding to it the distance L1 of the predetermined non-contact operation 600A.
- The detection reference control unit 204 moves the entire detection reference 42 upward in the drawing based on the interval ΔH10, as shown in FIG. 51(b).
- The moved detection reference 42 is indicated by a one-dot chain line.
- The amount of upward movement of the detection reference 42 may be substantially equal to the interval ΔH10, as shown in FIG. 51(b), or may be larger or smaller than it.
- In this way, the entire detection reference 42 is moved upward so as to approach the position where the predetermined non-contact operation was performed, thereby changing the detection reference 42. As a result, when the user's operation does not reach the detection reference 42 and is therefore not effective, the detection reference 42 is changed in accordance with the position of the user's operation, which alleviates the user's sense of discomfort during operation.
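- The calculation above (ΔH10 from the operation start position, or equivalently from the arrival position plus the operation distance L1, followed by a shift of the whole detection reference) can be sketched in Python. This is an illustrative model only, not the patent's implementation; the class and function names are hypothetical, and the detection reference is reduced to a one-dimensional vertical interval.

```python
from dataclasses import dataclass

@dataclass
class DetectionReference:
    """Simplified 1-D model of the detection reference 42 (vertical axis only)."""
    upper: float  # height of the upper reference surface 42a
    lower: float  # height of the lower reference surface 42b

    def shift(self, dz: float) -> None:
        """Move the whole reference up (dz > 0) or down (dz < 0), keeping its width D1."""
        self.upper += dz
        self.lower += dz

def interval_above(start_pos: float, ref: DetectionReference) -> float:
    """Interval DeltaH10 when the operation starts above the upper reference surface 42a."""
    return start_pos - ref.upper

def interval_above_from_arrival(arrival_pos: float, l1: float, ref: DetectionReference) -> float:
    """Alternative DeltaH10: the gap between the arrival (lowest) position and 42a,
    plus the descent distance L1 of the predetermined non-contact operation."""
    return (arrival_pos - ref.upper) + l1

# Example: reference spans heights [10, 14]; the gesture starts at 20 and descends 5.
ref = DetectionReference(upper=14.0, lower=10.0)
dh10 = interval_above(20.0, ref)   # 6.0
ref.shift(dh10)                    # move the entire detection reference up by DeltaH10
```

Both formulas give the same ΔH10 for a gesture that starts at height 20 and descends L1 = 5 (arrival at height 15), matching the equivalence stated above.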
- FIG. 52 is a diagram for explaining a calibration process in a case where the predetermined non-contact operation 600A is outside the detection reference 41 and is detected at a position below the lower reference surface 42b of the detection reference 42.
- When the detection reference control unit 204 determines, based on the detection output of the operation detector 13 stored in the storage unit 205, that the predetermined non-contact operation 600A has been performed and that it was performed outside the detection reference, the detection reference control unit 204 calculates the interval ΔH10 between the lowest position of the movement trajectory of the predetermined non-contact operation 600A, that is, its arrival position, and the lower reference surface 42b of the detection reference 42.
- The detection reference control unit 204 moves the entire detection reference 42 downward in the drawing based on the interval ΔH10, as shown in FIG. 52(b).
- The moved detection reference 42 is indicated by a one-dot chain line.
- The amount of downward movement of the detection reference 42 may be substantially equal to the interval ΔH10, as shown in FIG. 52(b), or may be larger or smaller than it.
- In this way, the entire detection reference 42 is moved downward so as to approach the position where the predetermined non-contact operation was performed, thereby changing the detection reference 42. As a result, when the user's operation passes through the detection reference 42 and is not effective, the detection reference 42 is changed in accordance with the position of the user's operation, which alleviates the user's sense of discomfort during operation.
- FIG. 53 is a diagram for explaining a calibration process in a case where the predetermined non-contact operation 600A is detected outside the detection reference 41 and outside the side reference surface 42c of the detection reference 42.
- When the detection reference control unit 204 determines, based on the detection output of the operation detector 13 stored in the storage unit 205, that the predetermined non-contact operation 600A has been performed and that it was performed outside the detection reference, the detection reference control unit 204 calculates the distance ΔH10 between the side reference surface 42c of the detection reference 42 and the portion of the movement trajectory of the predetermined non-contact operation 600A farthest from the side reference surface 42c.
- The detection reference control unit 204 moves the entire detection reference 42 in the horizontal direction in the drawing, that is, closer to the predetermined non-contact operation 600A, based on the distance ΔH10, as shown in FIG. 53(b).
- The moved detection reference 42 is indicated by a one-dot chain line.
- The amount of horizontal movement of the detection reference 42 may be substantially equal to the distance ΔH10, as shown in FIG. 53(b), or may be larger or smaller than it.
- In this way, the entire detection reference 42 is moved so as to approach the position where the predetermined non-contact operation was performed, thereby changing the detection reference 42. As a result, when the user's operation misses the detection reference 42 and is not effective, the detection reference 42 is changed in accordance with the position of the user's operation, which alleviates the user's sense of discomfort during operation.
- In the above description, the detection reference 42 is changed by the calculated change amount ΔH10; however, the detection reference 42 may instead be changed using, as the change amount, a value obtained by adding a predetermined amount h to the interval ΔH10.
- The predetermined amount h is, for example, a value obtained by averaging the differences in the arrival positions of a plurality of predetermined non-contact operations 600 (the differences between each arrival position and the reference surface of the detection reference 42 closest to it), or a value obtained by averaging the differences in the start positions of a plurality of non-contact operations 600 (the differences between each start position and the reference surface of the detection reference 42 closest to it).
- The predetermined amount h may also be a fixed value set in advance.
- In this case, the detection reference 42 moves by an amount corresponding to the interval ΔH10 plus the predetermined amount h as a margin. Therefore, even if the user cannot perform the non-contact operation at exactly the same position as the non-contact operation performed during the calibration process, the user's non-contact operation can be detected within the detection reference 42 as long as the error is within the range of the predetermined amount h. Even if the start position or arrival position of the user's non-contact operation varies from operation to operation, the user's non-contact operation can still be detected within the detection reference 42.
- As a result, the rate at which the non-contact operation is detected within the detection reference 42 can be made higher than when the value of the interval ΔH10 alone is used as the change amount.
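- The margin computation just described can be illustrated with a short sketch; hypothetical Python, with the averaging of past position differences as described above (a fixed preset h could be used instead).

```python
def margin_h(past_differences: list) -> float:
    """Predetermined amount h: the average of past differences between the
    operation's start (or arrival) position and the nearest reference surface
    of the detection reference 42. Returns 0.0 when no history exists."""
    if not past_differences:
        return 0.0
    return sum(past_differences) / len(past_differences)

def change_amount(dh10: float, h: float) -> float:
    """Change amount used to move the detection reference: DeltaH10 plus margin h."""
    return dh10 + h
```

For example, with past differences of 1, 2, and 3, h is 2, so a measured ΔH10 of 6 yields a change amount of 8.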
- FIG. 54 shows an example in which a part of the predetermined non-contact operation 600A is detected outside the detection reference 41. As shown in FIG. 54(a), a part of the predetermined non-contact operation 600A by the finger F, that is, a part corresponding to the distance ΔH10, is performed at a position above the upper reference surface 42a of the detection reference 42, and the remaining part is performed within the detection reference 42.
- In other words, combining the part of the predetermined non-contact operation 600A detected within the detection reference 42 and the part detected outside the detection reference 41 yields the predetermined non-contact operation 600A. In this case, a part of the predetermined non-contact operation 600A is detected outside the detection reference 41 by the operation detector 13 and the detection reference control unit 204.
- Similarly, a part of the predetermined non-contact operation 600Aa by the finger F, that is, a part corresponding to the distance ΔH10, is performed at a position below the lower reference surface 42b of the detection reference 42, and the remaining part is performed within the detection reference 42. In other words, combining the part of the predetermined non-contact operation 600Aa detected within the detection reference 42 and the part detected outside the detection reference 41 yields the predetermined non-contact operation 600Aa. Further, a part of the predetermined non-contact operation 600Ab by the finger F, that is, a part corresponding to the distance ΔH10, is performed outside the side reference surface 42c of the detection reference 42, and the remaining part is performed within the detection reference 42. In other words, combining the part of the predetermined non-contact operation 600Ab detected within the detection reference 42 and the part detected outside the detection reference 41 yields the predetermined non-contact operation 600Ab. Also in these cases, a part of the predetermined non-contact operation 600Aa or 600Ab is detected outside the detection reference 41 by the operation detector 13 and the detection reference control unit 204.
- The calibration process for that case is the same as in FIG. 52; that is, the entire detection reference 42 is moved downward in the drawing based on the distance ΔH10. Like the predetermined non-contact operation 600Ab shown in FIG. 54(b), a part of the predetermined non-contact operation 600Ab may be performed within the detection reference 42 and the remaining part at a position outside the side reference surface 42c.
- The calibration process for that case is the same as in FIG. 53; that is, the entire detection reference 42 is moved horizontally based on the distance ΔH10.
- In step S704, based on the detection output from the operation detector 13, it is determined whether or not the operation by the user (more specifically, the user's operation toward the display position of the icon 300A of the aerial image 300) is the predetermined non-contact operation. If it is, an affirmative determination is made in step S704 and the process proceeds to step S705; if it is not, the process waits until an affirmative determination is made in step S704.
- In step S705, it is determined whether or not the predetermined non-contact operation was performed within the detection reference 42. When it was, as shown in FIG. 49, an affirmative determination is made in step S705 and the process proceeds to step S708 described later.
- When the predetermined non-contact operation is not detected within the detection reference 42, that is, (1) when the entire predetermined non-contact operation is detected outside the detection reference 41, or (2) when a part of the predetermined non-contact operation is detected within the detection reference 42 and the remaining part is detected outside the detection reference 41, a negative determination is made in step S705 and the process proceeds to step S706.
- In step S706, the change amount of the detection reference 42 is calculated based on the positional relationship between the predetermined non-contact operation and the detection reference 42, and the process proceeds to step S707.
- In step S707, the position of the detection reference 42 is changed based on the change amount calculated in step S706, and the process proceeds to step S708.
- In step S708, the first calibration processing mode is terminated, and the process proceeds to step S709.
- In step S709, the aerial image operation mode is started.
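- The branch structure of steps S704 to S707 can be summarized in a short sketch; hypothetical Python in which the gesture recognition and hardware interactions are abstracted into parameters and callables, not the patent's implementation.

```python
def calibration_step(is_predetermined_op, detected_within_reference, compute_change, apply_change):
    """One pass of the first calibration processing mode (steps S704 to S707, sketched).

    is_predetermined_op: whether the user's gesture matched the predetermined
        non-contact operation (the step S704 decision).
    detected_within_reference: whether the whole gesture lay within the
        detection reference 42 (the step S705 decision).
    compute_change: callable returning the change amount (step S706).
    apply_change: callable that moves the detection reference (step S707).
    """
    if not is_predetermined_op:
        return "wait"        # S704 negative: keep waiting for a valid gesture
    if detected_within_reference:
        return "done"        # S705 affirmative: proceed directly to S708
    delta = compute_change() # S706: change amount from the positional relation
    apply_change(delta)      # S707: change the position of the detection reference
    return "changed"
```

The function returns a label for the branch taken, which makes the control flow easy to test in isolation.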
- As described above, the position of the detection reference is changed when the user's predetermined non-contact operation is not detected within the detection reference. That is, the vertical center position and/or the horizontal center position of the detection reference 42 is changed.
- This allows the operation to be performed at a location suitable for the user, and the positional relationship between the detection reference and the aerial image can be changed to one suitable for the user's operation.
- Although the first calibration processing mode has been described as an example, the flowchart of FIG. 55 can also be applied to the second calibration processing mode.
- The detection reference 42 has been described as being set corresponding to each of the icons 30D and 30E, but this is not limiting: the detection reference 42 may be set in common for a plurality of icons, or one detection reference 42 may be set over the entire area of the aerial image 30.
- In the above description, the detection reference 42 is changed in the vertical direction and/or the horizontal direction based on the positional relationship between the position in space where the predetermined non-contact operation 600 is detected and the detection reference 42; that is, the vertical center position and/or the horizontal center position of the detection reference 42 is changed.
- The display device 1 of Modification 1 may change the size of the width D1 of the detection reference 42 when changing the positional relationship between the detection reference 42 and the predetermined non-contact operation 600 in space. For example, as shown in FIG. 50(a), when the predetermined non-contact operation 600A is detected outside the detection reference 41 above the detection reference 42, the width D1 of the detection reference 42 may be changed without changing the position of the lower reference surface 42b, thereby changing the vertical center position of the detection reference 42.
- Alternatively, the upper reference surface 42a may be changed upward by the change amount ΔH10 and the lower reference surface 42b changed downward by the same change amount ΔH10. That is, the detection reference 42 may be changed by enlarging its width D1 in the vertical direction by the same change amount ΔH10 on each side, without changing the vertical center position of the detection reference 42.
- Similarly, the position of the lower reference surface 42b may be changed downward by the change amount ΔH10, or the position of the lower reference surface 42b may be changed downward by the change amount ΔH10 while the position of the upper reference surface 42a is changed upward by the change amount ΔH10.
- The position of the side reference surface 42c may be changed in the left-right direction in the same way. That is, the detection reference 42 may be changed by changing its horizontal center position, or its width may be changed without changing the center position.
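- The width-change variants of Modification 1 can be sketched as follows; a hypothetical Python illustration in which the detection reference is reduced to a vertical interval, not the patent's implementation.

```python
def widen_upward(upper: float, lower: float, dh10: float) -> tuple:
    """Move only the upper reference surface 42a up by DeltaH10: the width D1
    grows and the vertical center shifts upward; the lower surface 42b is fixed."""
    return upper + dh10, lower

def widen_symmetric(upper: float, lower: float, dh10: float) -> tuple:
    """Move 42a up and 42b down by the same DeltaH10: the width grows by
    2 * DeltaH10 while the vertical center position stays unchanged."""
    return upper + dh10, lower - dh10
```

For example, widen_upward(14, 10, 2) yields (16, 10), moving the center from 12 to 13, while widen_symmetric(14, 10, 2) yields (16, 8), keeping the center at 12.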
- Modification 2 of the eighth embodiment: The display device 1 of Modification 2 will be described.
- The display device 1 of Modification 2 changes the detection reference 42 when the distance between the predetermined non-contact operation 600 and the detection reference 42 is equal to or less than a predetermined value. As an example, consider a case where the predetermined non-contact operation 600A is detected outside the detection reference 41 above the detection reference 42, as shown in FIG. 56. When the distance is equal to or less than the predetermined value, the display device 1 regards the user as having intended to operate the display position of the aerial image, and changes the detection reference 42. When the distance exceeds the predetermined value, the display device 1 regards the operation as not having been intended for the display position of the aerial image, as an erroneous operation, or as an interrupted operation, and does not change the detection reference 42.
- In this case, the position of the detection reference 42 may be changed based on the distance between the operation position and the detection reference 42. For example, as shown in FIG. 54(a), it is determined whether or not the part of the non-contact operation 600 detected above the upper reference surface 42a of the detection reference 42, that is, the distance ΔH10, is equal to or less than a predetermined threshold value. When the distance ΔH10 is equal to or less than the predetermined threshold, although the entire predetermined non-contact operation 600A of the finger F is not performed within the detection reference 42, most of the predetermined non-contact operation 600A is performed within the detection reference 42. In this case, the display device 1 regards the user as having intended to operate the display position of the aerial image, and changes the position of the detection reference 42.
- On the other hand, when the distance ΔH10 exceeds the predetermined threshold value, most of the predetermined non-contact operation 600A is performed outside the detection reference 41, as shown in FIG. 56(b). In this case, the display device 1 regards the operation as not having been intended for the display position of the aerial image, as an erroneous operation, or as an interrupted operation, and does not change the position of the detection reference 42.
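- The threshold decision of Modification 2 amounts to a single comparison, sketched below in hypothetical Python (the function name and signature are illustrative assumptions).

```python
def should_calibrate(dh10: float, threshold: float) -> bool:
    """Modification 2 (sketch): calibrate the detection reference only when the
    gesture's distance DeltaH10 from the detection reference is at or below a
    predetermined threshold. A larger distance is treated as an unintended,
    erroneous, or aborted operation, and no calibration is performed."""
    return dh10 <= threshold
```

With a threshold of 2 units, a gesture 1 unit above the reference triggers calibration, while one 3 units above does not.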
- In Modification 3, the speed or acceleration of the user's fingertip is calculated based on the detection output of the operation detector 13, as in Modification 1 of the first embodiment, and the position of the detection reference 42 may be changed based on the calculated speed or acceleration. That is, the detection reference 42 is changed based on the speed of at least a part of the predetermined non-contact operation 600, in particular when the speed of a part of the predetermined non-contact operation 600 is smaller than a predetermined value.
- FIG. 57 is a block diagram showing the control unit 20, the display 11 and the operation detector 13 controlled by the control unit 20 in the display device 1 of the third modification.
- Here, the speed of at least a part of the predetermined non-contact operation 600 means the speed during at least a part of the motion constituting the predetermined non-contact operation 600. For example, when the predetermined non-contact operation 600 is performed from a position outside the detection reference 41 toward the detection reference 42 (such as the predetermined non-contact operation 600A), it means the speed of the part of the motion directed from outside the detection reference 41 toward the detection reference 42.
- When the predetermined non-contact operation 600 is an operation that moves toward one end of the detection reference 42 and then turns back (such as the predetermined non-contact operation 600A), the speed (acceleration) over the entire predetermined non-contact operation 600 (for example, from the start of the descent in the predetermined non-contact operation 600A until the completion of the subsequent ascent) may be monitored and the average value of the speed (acceleration) calculated. The strength of the operation may then be determined based on the average value, and the detection reference 42 changed when the operation is next detected. For example, when the operation speed is high on average, the operation may penetrate the detection reference 42, so the width of the detection reference 42 may be increased from the next time onward.
- The speed/acceleration detection unit 206 shown in FIG. 57 reads out the capacitance value detected by the operation detector 13 at predetermined time intervals, calculates the moving speed of the finger from the change in the capacitance value per predetermined time, calculates the moving acceleration of the finger from the calculated speed, and determines whether or not a predetermined value is exceeded.
- The operation prediction unit 211 calculates, that is, predicts, the movement trajectory of the finger F based on the finger movement speed or acceleration output from the speed/acceleration detection unit 206.
- The detection reference control unit 204 changes the detection reference 42 based on the movement trajectory of the finger F predicted by the operation prediction unit 211. That is, when the predicted movement trajectory of the finger F does not fall within the detection reference 42, it is determined that the predetermined non-contact operation 600 will not be detected within the detection reference 42; in that case, as in the eighth embodiment, the detection reference 42 is changed by the calculated change amount ΔH10. When the predicted movement trajectory of the finger F falls within the detection reference 42, it is determined that the predetermined non-contact operation 600 will be detected within the detection reference 42, and the detection reference 42 is not changed.
- The operation prediction unit 211 may predict the movement trajectory of the finger F and change the detection reference 42 only when the movement speed and/or movement acceleration calculated by the speed/acceleration detection unit 206 is equal to or greater than a predetermined value. That is, when the movement trajectory of the finger F predicted while its movement speed and/or movement acceleration is equal to or greater than the predetermined value does not fall within the detection reference 42, it is determined that the predetermined non-contact operation 600 will not be detected within the detection reference 42; in that case, as in the eighth embodiment, the detection reference 42 is changed by the calculated change amount ΔH10.
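- The prediction step can be illustrated with a simple kinematic sketch. The patent does not specify the prediction model; the constant-deceleration assumption below, the function names, and the parameters are all hypothetical, chosen only to show how a measured downward speed can be turned into a predicted arrival position that is then compared with the upper reference surface 42a.

```python
def predict_arrival(position: float, velocity: float, deceleration: float) -> float:
    """Predict the fingertip's lowest (arrival) height from its current height
    and downward speed, assuming constant deceleration (illustrative model).
    position: current height; velocity: downward speed (> 0);
    deceleration: rate at which the descent slows (> 0)."""
    if deceleration <= 0:
        raise ValueError("deceleration must be positive")
    travel = velocity ** 2 / (2.0 * deceleration)  # v^2 / (2a): remaining descent
    return position - travel

def reaches_reference(position: float, velocity: float, deceleration: float,
                      upper_surface: float) -> bool:
    """True when the predicted arrival lies at or below the upper reference
    surface 42a, i.e. the predicted trajectory enters the detection reference."""
    return predict_arrival(position, velocity, deceleration) <= upper_surface
```

For instance, a finger at height 20 descending at speed 4 with deceleration 2 is predicted to stop at height 16, so it would not reach a reference whose upper surface is at 14, and the reference would be moved.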
- Steps S764 to S766 are the same as those in the flowchart of FIG. In step S764, the operation detector 13 detects the movement of the finger F as a change in capacitance value, and the speed/acceleration detection unit 206 calculates the moving speed and acceleration of the fingertip F based on the detection output of the operation detector 13.
- In step S765, the operation prediction unit 211 determines whether or not the moving speed and acceleration calculated by the speed/acceleration detection unit 206 are not less than a first predetermined value and not more than a second predetermined value. The first predetermined value is determined in correspondence with the speed and acceleration at which the downward movement of the finger F from above the detection reference 42 is predicted not to reach the upper reference surface 42a. The second predetermined value is larger than the first predetermined value and is determined in correspondence with the speed and acceleration at which the downward movement of the finger F is predicted to pass through the lower reference surface 42b. When the moving speed and acceleration of the fingertip F are not less than the first predetermined value and not more than the second predetermined value, an affirmative determination is made in step S765 and the process proceeds to step S770. When the moving speed and acceleration of the finger F are smaller than the first predetermined value or larger than the second predetermined value, a negative determination is made in step S765 and the process proceeds to step S767.
- In step S767, the operation prediction unit 211 calculates the movement trajectory of the fingertip F based on the movement speed and acceleration calculated by the speed/acceleration detection unit 206. The movement trajectory of the finger F calculated by the operation prediction unit 211 when the movement speed and movement acceleration are equal to or less than the first predetermined value, that is, the predicted movement path of the finger F, is indicated by a broken line 600Ac. The detection reference control unit 204 then calculates the change amount ΔH10 of the detection reference 42 and changes the detection reference 42 in the same manner as described above. To predict the arrival position of the finger, both the moving speed and acceleration of the finger may be used, or only one of them. Note that the first calibration processing mode has been described as an example of the calibration process, but the present invention can also be applied to the second calibration processing mode.
- In the above description, the operation prediction unit 211 calculates the movement trajectory of the fingertip F, but the movement trajectory need not be calculated. That is, the control unit 20 of the display device 1 may omit the operation prediction unit 211 and change the detection reference 42 by a predetermined change amount when the movement speed and movement acceleration calculated by the speed/acceleration detection unit 206 are equal to or less than a predetermined value. For example, when the moving speed or moving acceleration is detected at a position a predetermined distance above the detection reference 42 and the detected moving speed or moving acceleration is equal to or less than a predetermined value, it is predicted that the finger F will not reach the detection reference 42, and the detection reference 42 is changed.
- In the above description, the speed/acceleration detection unit 206 reads the capacitance value detected by the operation detector 13 at predetermined time intervals, calculates the moving speed of the finger from the change in the capacitance value per predetermined time, and calculates the moving acceleration of the finger from the calculated speed.
- However, the method is not limited to this, and an imaging device may be used as the speed/acceleration detection unit 206. Further, although the movement speed or acceleration of the user's finger is calculated in the above description, the user's foot or elbow, or a stylus pen held by the user, may be used instead.
- The display device 1 according to the eighth embodiment and its Modifications 1 to 3 changed the position of the detection reference 42 in one calibration process based on the positional relationship between the position of the predetermined non-contact operation 600A in space and the detection reference 42; that is, one calibration process was performed by one user operation. In contrast, the display device 1 according to Modification 4 performs one calibration process over a plurality of user operations. That is, the detection reference 42 is changed based on the number of times the predetermined non-contact operation 600A is detected outside the detection reference 41, or the number of times it is detected within the detection reference 42.
- The detection reference control unit 204 determines, based on the detection output of the operation detector 13, whether or not the finger F has performed the predetermined non-contact operation 600A, and when the predetermined non-contact operation 600A is detected, detects the position in space where it was performed. When the predetermined non-contact operation 600A is detected within the detection reference 42, the detection reference control unit 204 determines that the first calibration process has succeeded and stores the determination result in the storage unit 205. When the predetermined non-contact operation 600A is detected outside the detection reference 41, the detection reference control unit 204 determines that the first user operation has failed, calculates the change amount ΔH10 as in the eighth embodiment, and stores the determination result and the change amount ΔH10 in the storage unit 205. Subsequently, for the second user operation, the success or failure determination result and/or the change amount ΔH10 is likewise stored in the storage unit 205, and the same processing may be continued for the third and subsequent user operations. The detection reference 42 is then changed based on the plurality of determination results and/or change amounts ΔH10 stored in the storage unit 205 over a plurality of consecutively performed user operations.
- Various methods are conceivable for determining whether or not to change the detection reference 42 based on the determination results and/or change amounts ΔH10 of the plurality of user operations.
- For example, the detection reference 42 is changed when user operations determined to have failed occur in succession. Specifically, the detection reference 42 may be changed when both the first user operation and the second user operation are determined to have failed.
- Alternatively, the detection reference 42 may be changed when the number of user operations determined to have failed among a plurality of user operations reaches a predetermined number or more. Specifically, the detection reference 42 is changed when five or more of ten user operations are determined to have failed. In this case, the detection reference 42 may be changed at the point when the fifth failure determination is accumulated, or after all ten user operations are completed. In addition, when the predetermined non-contact operation 600A is frequently detected outside the detection reference 41, the frequency of changing the detection reference 42 may be increased.
- For example, suppose that, with the detection reference 42 set to be changed when five or more of ten user operations are determined to have failed, eight of the ten user operations are determined to have failed. In that case, from the next time onward, the detection reference 42 may be changed when three or more of five user operations are determined to have failed.
- The change amount ΔH10 may be processed in the same manner as the calculation method used when determining the detection reference in Modification 2 of the first embodiment. That is, one change amount ΔH10 may be calculated by arithmetically or geometrically averaging the change amounts calculated for the user operations determined to have failed. Also in this case, as described in Modification 2 of the first embodiment, a new change amount ΔH10 can be calculated by applying appropriate weighting.
- When changing the detection reference 42 based on the results of a plurality of user operations, the detection reference 42 may be changed when the arithmetic or geometric mean of the change amounts ΔH10 calculated for the individual user operations exceeds a predetermined threshold, or when the change amounts ΔH10 calculated for the individual user operations show an increasing tendency.
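- The aggregation over several user operations can be sketched as follows; hypothetical Python using a simple arithmetic mean (weighting or a geometric mean could be substituted, as described above), not the patent's implementation.

```python
def decide_from_history(results: list, changes: list, min_failures: int):
    """Modification 4 (sketch): aggregate several user operations before
    changing the detection reference.

    results: booleans per operation, True = detected within the reference.
    changes: DeltaH10 values computed for the failed operations.
    min_failures: number of failures that triggers a change.
    Returns the averaged change amount, or None when no change should be made."""
    failures = results.count(False)
    if failures < min_failures or not changes:
        return None
    return sum(changes) / len(changes)  # arithmetic mean of the failed-operation changes
```

For example, two failures out of three operations with change amounts 4 and 6 yield an averaged change of 5; a single failure leaves the reference untouched.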
- In the above description, an example has been described in which the position of the detection reference 42 is changed based on the number of times the predetermined non-contact operation 600A is detected outside the detection reference 41; however, an operation different from the predetermined non-contact operation 600A, that is, a user operation in which a part of the predetermined non-contact operation 600A is detected outside the detection reference 41, may also be regarded as a failure. That is, the detection reference 42 may also be changed when, among a plurality of user operations, a part of the predetermined non-contact operation 600A is continuously detected outside the detection reference 41, or is detected outside the detection reference 41 a predetermined number of times or more.
- In the above description, the predetermined non-contact operation 600 is an operation in which the user pushes the finger F toward the display position and then makes a U-turn, as shown in FIG. 48(a), but the operation is not limited to this. The predetermined non-contact operation 600 may be an operation of holding out three fingers at the display position, a gesture of moving the finger F to the display position in front of the body, or an operation in which the movement of the finger F stops for a predetermined time, for example, 20 seconds.
- In the above description, the detection reference control unit 204 determines whether or not the predetermined non-contact operation 600 has been performed based on the detection output of the operation detector 13. However, some users may not perform the predetermined non-contact operation 600 accurately. For example, when the predetermined non-contact operation 600 is a 10 cm downward movement of the finger followed by a 10 cm upward movement, a user may instead move the finger down 5 cm and then up 5 cm. When the predetermined non-contact operation 600 is holding out three fingers at the display position, a user may fail to open the third finger and hold out only two. When the predetermined non-contact operation 600 is a movement of the finger F to the display position in front of the body, a user may move the finger F to the display position at the side of the body. When the predetermined non-contact operation 600 requires the finger to stop for 20 seconds, a user may move the finger after only about 15 seconds.
- If the detection value of the operation actually performed by the user (the user operation) does not match the reference value indicating the "predetermined non-contact operation 600", the user operation cannot be recognized even if the detection reference 42 (its position and width, as described above) is set or changed. In such a case, the user operation can be recognized as the predetermined non-contact operation 600 by changing the reference value indicating the predetermined non-contact operation 600 as the change of the detection reference 42.
- Specifically, the display device 1 estimates that the user intends to perform the predetermined non-contact operation 600, and changes (updates) the reference value (the definition of the predetermined non-contact operation 600) stored in the display device 1 to match the operation actually performed by the user (the user operation). For example, the detection reference control unit 204 detects the user's non-contact operation and compares the detected value of that non-contact operation with the reference value, stored in advance, indicating the predetermined non-contact operation 600.
- The reference value indicating the predetermined non-contact operation 600 is the definition or template of the predetermined non-contact operation 600 stored in advance in the display device 1.
- When the two differ, the reference value (prestored value) of the predetermined non-contact operation 600 is changed based on the detected value of the user's non-contact operation.
- As a result, the device can be operated by the non-contact operation that the user actually performs. For example, suppose that a value indicating "lower the finger F by 10 cm" is stored as the reference value indicating the predetermined non-contact operation 600, and that the user repeatedly lowers the finger F by only 5 cm. The reference value indicating the predetermined non-contact operation 600 is then changed to a value indicating that the finger F is lowered by 5 cm. By changing the reference value indicating the predetermined non-contact operation 600 in this way, an operation similar to the predetermined non-contact operation 600 can also be accepted. Further, by changing the reference value of the predetermined non-contact operation 600 so that the operation is accepted even when the operation amount of the user's non-contact operation is small, the burden of the user's operation can be reduced.
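The reference-value update just described can be sketched as follows. This is a minimal illustration, not the patent's implementation: the class name, the matching tolerance, and the three-repeat rule are assumptions made for the example.

```python
# Sketch of updating the stored reference value for the predetermined
# non-contact operation 600 (e.g. "lower the finger by 10 cm") so that a
# smaller but consistently repeated stroke is accepted. All names and the
# similarity threshold are illustrative assumptions.

class OperationReference:
    def __init__(self, stroke_cm: float, tolerance_cm: float = 2.0):
        self.stroke_cm = stroke_cm          # reference value (prestored)
        self.tolerance_cm = tolerance_cm    # how close a stroke must be to match
        self._mismatches: list[float] = []  # recent non-matching strokes

    def matches(self, detected_cm: float) -> bool:
        return abs(detected_cm - self.stroke_cm) <= self.tolerance_cm

    def observe(self, detected_cm: float, required_repeats: int = 3) -> bool:
        """Return True if the detected stroke matches the reference value.

        A stroke that does not match is remembered; once similar strokes
        have been seen `required_repeats` times, the reference value is
        changed (updated) to the user's actual stroke.
        """
        if self.matches(detected_cm):
            self._mismatches.clear()
            return True
        self._mismatches.append(detected_cm)
        if len(self._mismatches) >= required_repeats:
            # Change the reference value to the mean of the observed strokes.
            self.stroke_cm = sum(self._mismatches) / len(self._mismatches)
            self._mismatches.clear()
        return False

ref = OperationReference(stroke_cm=10.0)
for _ in range(3):
    ref.observe(5.0)    # user repeatedly lowers the finger only 5 cm
print(ref.stroke_cm)    # reference value has been updated to 5.0
print(ref.matches(5.0)) # the user's stroke is now recognized
```

The decision to update only after several consistent mismatches mirrors the passage above, where the change may be based on a plurality of similar user operations.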
- As another example, suppose that a value indicating an operation of "moving the finger 10 cm downward in front of the body and subsequently moving it 10 cm upward" is stored as the reference value indicating the predetermined non-contact operation 600. If the user instead performs this movement beside the body, the reference value indicating the predetermined non-contact operation 600 is changed to a value indicating an operation of "moving the finger 10 cm downward beside the body and subsequently moving it 10 cm upward". The display device 1 can then be operated by moving the finger down 10 cm beside the body and subsequently moving it up 10 cm.
- As in the modification described above, the detection reference 42 (the reference value indicating the predetermined non-contact operation 600) may be changed based on a plurality of user operations. That is, when a non-contact operation that differs from but resembles the predetermined non-contact operation 600 is performed a plurality of times, the reference value indicating the predetermined non-contact operation 600 may be changed. Changing the reference value indicating the predetermined non-contact operation 600 is thus included in the change of the detection reference 42.
- The detection reference control unit 204 may change the detection reference 42 based on sound. The change of the detection reference 42 in this case includes a change of the position and width of the detection reference 42 and a change of the reference value indicating the predetermined non-contact operation 600.
- The display device 1 includes a sound collector 14 similar to that of Modification 6 of the first embodiment, and the control unit 20 includes a sound detection unit 208 that detects sound data input from the sound collector 14. The sound detection unit 208 has a known voice recognition function capable of recognizing speech other than "yes". When the user makes an utterance such as one indicating that the operation is not working well, the display device 1 of Modification 7 detects this utterance using the voice recognition function and changes the detection reference 42.
- Specifically, the detection reference 42 may be moved, or its width changed, so that it includes the position of the user's finger at the time the sound (utterance) is detected. Alternatively, when a sound (utterance) is detected, the detection reference 42 may be moved toward the user by a predetermined amount, for example 1 cm, or the width of the detection reference 42 may be changed. Further, the reference value indicating the predetermined non-contact operation 600 may be changed so that the value detected when the sound (utterance) is detected is recognized as the user operation.
- Alternatively, the reference value indicating the predetermined non-contact operation 600 may be changed by a predetermined amount. For example, when a value indicating an operation of "lower by 10 cm" is stored as the reference value indicating the predetermined non-contact operation 600, the reference value indicating the predetermined non-contact operation 600 may be changed (updated) to a value indicating an operation of "lower by 9 cm" when the sound (utterance) is detected.
- The display device 1 need not include the sound collector 14; sound data acquired by an external sound collector may be input wirelessly or by wire, and the sound detection unit 208 may perform sound detection using the sound data input from the external sound collector.
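A minimal sketch of the sound-triggered change described above, assuming illustrative amounts (a 1 cm shift of the detection reference toward the user and a 1 cm relaxation of the reference value); the function and key names are not from the patent.

```python
# Illustrative sketch (not the patent's implementation): when the voice
# recognition function reports an utterance, move the detection reference
# a predetermined amount toward the user and relax the reference value
# for the predetermined non-contact operation by a predetermined amount.

SHIFT_TOWARD_USER_CM = 1.0   # predetermined amount (example: 1 cm)
RELAX_REFERENCE_CM = 1.0     # e.g. "lower 10 cm" -> "lower 9 cm"

def on_utterance_detected(state: dict) -> dict:
    """Update the detection state when a sound (utterance) is detected."""
    state = dict(state)
    # Move the centre of the detection reference toward the user.
    state["reference_center_cm"] -= SHIFT_TOWARD_USER_CM
    # Change (update) the reference value indicating the operation.
    state["required_stroke_cm"] -= RELAX_REFERENCE_CM
    return state

state = {"reference_center_cm": 12.0, "required_stroke_cm": 10.0}
state = on_utterance_detected(state)
print(state)  # {'reference_center_cm': 11.0, 'required_stroke_cm': 9.0}
```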
- The detection reference control unit 204 may change the detection reference 42 based on time. The change of the detection reference 42 in this case includes a change of the position and width of the detection reference 42 and a change of the reference value indicating the predetermined non-contact operation 600.
- When no operation is performed for a predetermined time, the display device 1 of Modification 8 changes the detection reference 42 by a predetermined amount. The control unit 20 is provided with a time measuring unit; when the power switch of the display device 1 is turned on and no operation on an icon or the like is performed for a predetermined time, the detection reference control unit 204 changes the detection reference 42 by a predetermined amount based on the output of the time measuring unit that has timed the predetermined time. Likewise, when an operation on one icon or the like is performed and no operation on the next icon or the like follows within a predetermined time, the detection reference control unit 204 changes the detection reference 42 by a predetermined amount based on the output of the time measuring unit. When the detection reference 42 is changed based on the measurement of a predetermined time in Modification 8, it is desirable to move the detection reference 42 by a predetermined amount in a direction approaching the user.
- For example, the center position (overall position) of the detection reference 42 may be moved toward the user by a predetermined amount, for example 1 cm, or the width of the detection reference 42 may be increased. The center position of the detection reference 42 may also be moved, or its width changed, so as to include the position of the user's finger at the time the predetermined time has elapsed. Further, the reference value indicating the predetermined non-contact operation 600 may be changed so that the value detected when the predetermined time has elapsed is recognized as the user operation.
- Alternatively, the reference value indicating the predetermined non-contact operation 600 may be changed by a predetermined amount. For example, when a value indicating an operation of "lower by 10 cm" is stored as the reference value indicating the predetermined non-contact operation 600, the reference value indicating the predetermined non-contact operation 600 may be changed (updated) to a value indicating an operation of "lower by 9 cm" after the predetermined time has elapsed.
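The time-based change can be sketched as below; the idle limit, shift amount, and class name are illustrative assumptions, and the injected clock merely makes the example deterministic.

```python
# Sketch of the time-based change: if no icon operation occurs for a
# predetermined time, move the detection reference a predetermined
# amount toward the user. Names and amounts are illustrative assumptions.

import time

class TimedDetectionReference:
    IDLE_LIMIT_S = 5.0          # predetermined time without any operation
    SHIFT_TOWARD_USER_CM = 1.0  # predetermined shift amount

    def __init__(self, center_cm: float, clock=time.monotonic):
        self.center_cm = center_cm
        self._clock = clock
        self._last_operation = clock()

    def note_operation(self):
        self._last_operation = self._clock()

    def tick(self):
        """Called periodically; shifts the reference after the idle limit."""
        if self._clock() - self._last_operation >= self.IDLE_LIMIT_S:
            self.center_cm -= self.SHIFT_TOWARD_USER_CM  # approach the user
            self._last_operation = self._clock()         # restart timing

# Simulated clock so the example is deterministic.
now = {"t": 0.0}
ref = TimedDetectionReference(center_cm=12.0, clock=lambda: now["t"])
now["t"] = 6.0   # 6 s elapse with no operation
ref.tick()
print(ref.center_cm)  # 11.0
```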
- The detection reference control unit 204 may change the detection reference 42 based on the user's face. The change of the detection reference 42 in this case includes a change of the position and width of the detection reference 42 and a change of the reference value indicating the predetermined non-contact operation 600.
- The user's face is imaged by a camera provided in the display device 1 of Modification 9, and the control unit 20 analyzes the captured image to detect a predetermined expression of the user's face (using a so-called face recognition function), whereupon the detection reference 42 is changed.
- The predetermined expression is, for example, the troubled face the user makes when an operation cannot be performed well; the detection reference 42 is changed when the user's troubled face is detected. Specifically, the detection reference 42 may be moved toward the user by a predetermined amount (for example, 1 cm), or the width of the detection reference 42 may be changed. Alternatively, the detected value of the operation performed by the user immediately before the troubled face is recognized may be stored, and the reference value indicating the predetermined non-contact operation 600 may be changed based on the stored detected value.
- The detection reference control unit 204 may change the detection reference 42 (the position and width of the detection reference 42 and the reference value indicating the predetermined non-contact operation) when a gesture operation by the user is not detected by the detection reference 42.
- The gesture operation by the user as the predetermined non-contact operation 600 is, for example, one of hand gestures such as rock (goo), scissors (choki), and paper (par), or a lateral movement of the finger F following a downward movement.
- The display device 1 stores the feature information (a reference value indicating the features) of each gesture in the storage unit 205 in advance.
- The display device 1 detects the user's gesture operation and compares it with one piece of feature information selected from the plurality of pieces of feature information stored in the storage unit 205, thereby determining whether or not the gesture operation corresponds to one of the predetermined non-contact operations.
- The display device 1 changes the detection reference 42 when the user's gesture operation is not detected by the detection reference 42.
- The change of the detection reference 42 in this case is a change in the selection of the reference value indicating the predetermined non-contact operation 600. For example, suppose the display device 1 initially selects the feature information indicating rock as the reference value used for detection by the detection reference 42.
- When the user's gesture is not detected by the detection reference 42, the display device 1 changes the selected reference value from the feature information indicating rock to the feature information indicating a different gesture among the plurality of gesture operations described above, for example scissors.
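The switch of the selected feature information can be sketched as follows; the toy feature vectors and distance test stand in for whatever feature information the storage unit 205 actually holds.

```python
# Sketch of switching the selected gesture template ("reference value")
# when the user's gesture is not detected by the current selection.
# The 5-element vectors are toy stand-ins for real feature information.

GESTURE_TEMPLATES = {
    "rock":     (0, 0, 0, 0, 0),   # all fingers closed
    "scissors": (1, 1, 0, 0, 0),   # two fingers extended
    "paper":    (1, 1, 1, 1, 1),   # all fingers extended
}

def matches(features, template, max_distance=1):
    return sum(abs(a - b) for a, b in zip(features, template)) <= max_distance

class GestureDetector:
    def __init__(self, selected="rock"):
        self.selected = selected  # initially selected reference value

    def detect(self, features):
        """Return True if the gesture matches the selected template.

        On failure, change the selection to whichever other stored
        template the observed gesture does match, if any.
        """
        if matches(features, GESTURE_TEMPLATES[self.selected]):
            return True
        for name, template in GESTURE_TEMPLATES.items():
            if name != self.selected and matches(features, template):
                self.selected = name  # change the detection reference
                break
        return False

det = GestureDetector(selected="rock")
det.detect((1, 1, 0, 0, 0))   # user makes scissors; not detected as rock
print(det.selected)            # selection changed to "scissors"
```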
- The predetermined non-contact operation 600 may be an operation in which the position of the finger F coincides with a predetermined position.
- For example, the predetermined position is within the detection reference 42, or the predetermined position is outside the detection reference 41.
- Alternatively, the predetermined position coincides with the display position of the icon, or the predetermined position is the position of the detection reference 42.
- When the predetermined position is within the detection reference 42, it is determined that the predetermined non-contact operation 600 has been performed when the finger is present within the detection reference 42.
- When the predetermined position is outside the detection reference 41, it is determined that the predetermined non-contact operation 600 has been performed when the finger is outside the detection reference 41.
- When the predetermined position coincides with the display position of the aerial image icon, it is determined that the predetermined non-contact operation 600 has been performed when the finger F coincides with the display position of the icon, that is, when the icon display position is operated.
- When the predetermined position is the position of the detection reference 42, it is determined that the predetermined non-contact operation 600 has been performed when the finger F passes the boundary between the detection reference 42 and the outside of the detection reference 41, or when the finger passes the boundary and then passes it again.
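The boundary-crossing variant can be sketched as below; the one-dimensional height axis and the sample trajectory are illustrative assumptions.

```python
# Sketch of the "predetermined position" variant in which the finger
# passes the boundary between the detection reference and the region
# outside it, or passes it and then passes it again (a U-turn).
# Heights are along a single illustrative axis.

def count_boundary_crossings(heights, boundary_cm):
    """Count transitions of the finger across the boundary height."""
    crossings = 0
    for prev, cur in zip(heights, heights[1:]):
        if (prev - boundary_cm) * (cur - boundary_cm) < 0:
            crossings += 1
    return crossings

# Finger descends through the boundary at 10 cm and comes back up.
trajectory = [14.0, 12.0, 9.0, 8.5, 9.5, 12.0]
crossings = count_boundary_crossings(trajectory, boundary_cm=10.0)
print(crossings)       # 2: passed the boundary and then passed it again
print(crossings >= 2)  # treated here as the pass-and-return operation
```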
- In the above description, the detection reference 42 has a width D1 in the vertical direction. However, the detection reference 42 may instead be constituted by a plane, like the detection reference 40 of the first embodiment.
- In that case, when the predetermined non-contact operation 600A that makes a U-turn downward from the detection reference 40 is performed at a position at or above the distance L1, it is detected on the basis of the detection reference 40 that the predetermined non-contact operation 600A has been performed.
- When the predetermined non-contact operation 600A is performed above the detection reference 40 (within the capacitance detection range 13A) as shown in FIG. 61(a), or when the predetermined non-contact operation 600A is performed so as to pass through the detection reference 40 as shown in FIG. 61(b), a part of the predetermined non-contact operation 600A is detected by the operation detector 13 outside the detection reference 41.
- In such cases, the displacement amount ΔH10 may be calculated based on the distance from the detection reference 40, and the position of the detection reference 40 (the position in the Z direction in FIG. 61) may be changed based on the displacement amount ΔH10.
- The eighth embodiment and its Modifications 1 to 12 can also be carried out by the display device 100 described in the fourth embodiment, its Modification 1, and the seventh embodiment.
- In the eighth embodiment and its Modifications 1 to 12, the case where the predetermined non-contact operation 600 is an operation on the display position of the aerial image has been described, but the present invention is not limited to this example.
- For example, when the predetermined non-contact operation 600 is performed in space on an image displayed on the display 11 of the display device according to the eighth embodiment and its Modifications 1 to 12, the position of the detection reference 42 may be changed based on the positional relationship between the position in space where the predetermined non-contact operation 600 is performed and the detection reference 42.
- In the description above, the display device 1 including at least the control unit 20, the display 11, and the operation detector 13 has been described as an example. However, a control device constituted by the control unit 20 alone, or a control device constituted by the control unit 20 and the display 11, may also be used. The control unit 20 may include at least the calibration unit 203 and the detection reference control unit 204.
- The display device 1 has been described as having a configuration in which the image displayed on the display 11 is formed as an aerial image by the imaging optical system 12, and a configuration in which the image displayed on the display 111 is formed as an aerial image by the imaging optical system 112 has also been described. However, the configuration for generating the aerial image is not limited to these configurations and includes the methods described below. Needless to say, the configurations described below are examples, and other methods for generating an aerial image are also included.
- For example, there is a method of displaying, on the display of the display device, an image viewed with the right eye and an image viewed with the left eye that have a parallax with respect to each other, thereby generating an image that the user perceives at a depth different from that of the displayed image. With this method, the user recognizes the image corresponding to the image displayed on the display as being displayed in the air.
- A transmissive head-mounted display (HMD) may also be used as the display device and worn by the user. Because the image displayed on the HMD is superimposed on the user's actual field of view, the user perceives the displayed image as if it were displayed in the air.
- Other examples of methods for generating an aerial image include a method of projecting a virtual image and a method of forming an image directly on the user's retina. Further, laser light may be condensed in the air so that the molecules constituting the air are turned into plasma and emit light, thereby forming an image in the air. In this case, a three-dimensional image is generated as a real image in the air by freely controlling the condensing position of the laser light in three-dimensional space. As yet another method, a display device having, in addition to a projector function, a function of generating mist in the air may be used: a screen is formed by generating mist in the air, and an image is projected onto this screen (fog display), thereby generating an image in the air.
- A program for executing the calibration using the display device 1 or 100 may be recorded on a computer-readable recording medium, and the calibration may be executed by reading this program into a computer system.
- The "computer system" here may include an OS (Operating System) and hardware such as peripheral devices.
- The "computer system" also includes a homepage providing environment (or display environment) when a WWW system is used.
- The "computer-readable recording medium" refers to a flexible disk, a magneto-optical disk, a ROM, a writable nonvolatile memory such as a flash memory, a portable medium such as a CD-ROM, or a storage device such as a hard disk built into a computer system.
- The "computer-readable recording medium" further includes a medium that holds the program for a certain period of time, such as a volatile memory (for example, DRAM (Dynamic Random Access Memory)) inside a computer system serving as a server or client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.
- The "program" may be transmitted from a computer system in which it is stored in a storage device or the like to another computer system via a transmission medium, or by transmission waves within the transmission medium.
- The "transmission medium" that transmits the program refers to a medium having a function of transmitting information, such as a network (communication network) like the Internet or a communication line like a telephone line.
- The program may realize only a part of the functions described above. Furthermore, the program may be a so-called differential file (differential program) that realizes the functions described above in combination with a program already recorded in the computer system.
- By combining the above-described embodiments and modifications, a detection device may be configured that comprises: a detection unit that detects a user's operation on a display in the air and detects a predetermined non-contact operation by a detection reference; and a control unit that changes the positional relationship between the detection reference and the display, changing the detection reference when the predetermined non-contact operation is not detected by the detection reference, the positional relationship being changeable by the user.
- The control unit of this detection device may further change the detection reference based on sound. The control unit of the detection device may further change the detection reference based on time. The control unit of the detection device may further change the detection reference based on the user's face.
- The control unit of this detection device may change the detection reference when a gesture operation as the predetermined non-contact operation is not detected by the detection reference. The control unit of the detection device may further change the detection reference when a pressing operation as the predetermined non-contact operation is not detected by the detection reference. The control unit of this detection device may change the detection reference when the movement of the non-contact operation does not match a predetermined movement as the predetermined non-contact operation. The control unit of the detection device may further change the detection reference when the shape of the operation object performing the non-contact operation does not match a predetermined shape as the predetermined non-contact operation.
- A detection device may also be configured that comprises: a detection unit having a detection reference for detecting a non-contact operation; and a control unit that changes the positional relationship between the detection reference and the display, changing the detection reference based on a non-contact operation detected outside the detection reference, the positional relationship being changeable by the user.
- A detection device may be configured that comprises: a detection unit that detects a predetermined non-contact operation by a detection reference; and a control unit that changes the detection reference when the predetermined non-contact operation is not detected by the detection reference, wherein the control unit changes the positional relationship based on the user's operation.
- A detection device may be configured that comprises: a detection unit that detects a predetermined non-contact operation by a detection reference; and a control unit that changes the detection reference when the predetermined non-contact operation is not detected by the detection reference, wherein the control unit changes the positional relationship based on user information.
- A detection device may be configured that comprises: a detection unit that detects a predetermined non-contact operation by a detection reference; and a control unit that changes the detection reference when the predetermined non-contact operation is not detected by the detection reference, wherein the control unit changes the positional relationship based on a change in the environment around the detection device caused by the user.
- A detection device that detects a user's operation on a display in the air may be configured that comprises: a detection unit that detects a predetermined non-contact operation by a detection reference; and a control unit that changes the detection reference when the operation is not detected by the detection reference, wherein the control unit controls the display to change the positional relationship.
- A control device may be configured that comprises a control unit that changes the positional relationship between a display and a detection device that detects a user's operation on the display in the air by controlling the display based on the user's operation, the positional relationship being changeable by the user.
- A control device may be configured that comprises a control unit that changes the positional relationship between a display and a detection device that detects a user's operation on the display in the air by controlling the display based on the user's information, the positional relationship being changeable by the user.
- A control device may be configured that comprises a control unit that changes the positional relationship between a display and a detection device that detects a user's operation on the display in the air by controlling the display based on a change in the environment around the detection device caused by the user, the positional relationship being changeable by the user.
- A control device may be configured that comprises a control unit that changes the detection reference when a predetermined non-contact operation is not detected by the detection reference, the positional relationship being changeable by the user.
- The control unit of this control device may further change the detection reference based on sound. The control unit of the control device may further change the detection reference based on time. The control unit of the control device may further change the detection reference based on the user's face.
- The control unit of this control device may change the detection reference when a gesture operation as the predetermined non-contact operation is not detected by the detection reference. The control unit of the control device may further change the detection reference when a pressing operation as the predetermined non-contact operation is not detected by the detection reference. The control unit of this control device may change the detection reference when the movement of the non-contact operation does not match a predetermined movement as the predetermined non-contact operation. The control unit of the control device may further change the detection reference when the shape of the operation object performing the non-contact operation does not match a predetermined shape as the predetermined non-contact operation.
- A control device may be configured that comprises: a detection unit having a detection reference for detecting a non-contact operation; and a control unit that changes the positional relationship between a display in the air and the detection reference for detecting the user's operation on the display, changing the detection reference based on a non-contact operation detected outside the detection reference, the positional relationship being changeable by the user.
- The present invention is not limited to the above-described embodiments and modifications as long as the characteristics of the present invention are not impaired; other forms conceivable within the scope of the technical idea of the present invention are also included within the scope of the present invention. The forms of the above embodiments and modifications may also be combined as appropriate.
Abstract
Description
According to a second aspect of the present invention, a control method controls a display in the air and changes, according to the user, the positional relationship between the display and a detection reference for detecting the user's operation.
According to a third aspect of the present invention, a program causes a computer to execute processing for controlling a display in the air and changing, according to the user, the positional relationship between the display and a detection reference for detecting the user's operation.
A display device according to the first embodiment will be described with reference to the drawings. In the first embodiment, a case where the display device of the present embodiment is incorporated in a mobile phone is described as an example. The display device of the present embodiment is not limited to a mobile phone and can be incorporated in electronic devices such as portable information terminal devices including tablet terminals and wristwatch-type terminals, personal computers, music players, fixed-line telephones, and wearable devices.
FIG. 1(a) is an exploded perspective view of the display device 1, and FIG. 1(b) is an enlarged side view of a part of the display device 1. For convenience of explanation, a coordinate system consisting of an X axis, a Y axis, and a Z axis is set for the display device 1 as illustrated. The coordinate system is not limited to an orthogonal coordinate system consisting of the X, Y, and Z axes; a polar coordinate system or a cylindrical coordinate system may be adopted. Specifically, the X axis is set in the short-side direction of the rectangular display surface of the display device 1, the Y axis in the long-side direction of the rectangular display surface, and the Z axis in the direction perpendicular to the display surface.
When the reach position 50 is located below the detection reference 40, the finger passes the detection reference 40 before reaching the reach position 50. The detection reference control unit 204 therefore determines, based on the detection output of the operation detector 13, that the finger has reached the detection reference 40, but in the first calibration processing mode the display on the display 11 is not switched. Similarly, when the reach position 50 coincides with the detection reference 40, the display on the display 11 is not switched. Of course, when the finger reaches the detection reference 40, the user may be notified that the finger has reached the detection reference by, for example, a highlight display such as blinking the icon 300A.
In the above description, an operation of pushing down the icon 300A by the user has been given as an example of an operation on the display position of the icon 300A, but the operation is not limited to this. When a predetermined non-contact operation on the icon 300A by the user is detected by the operation detector 13, the detection reference 40 may be changed based on the place where the predetermined non-contact operation was performed. The predetermined non-contact operation is, for example, a gesture operation such as touching the icon 300A; the detection reference 40 may be changed based on the position where such a touching operation was performed. An example of a touching operation is a gesture of brushing the icon 300A away with the user's hand. The position of the touching operation may be determined based on the position where the brushing gesture ended and the user's hand was determined to have stopped, or on the position where the brushing gesture started.
…stops. The detection reference control unit 204 determines that the push-down of the finger has stopped when the capacitance value detected by the operation detector 13 hardly changes any more, and determines, that is, decides, this push-down stop position as the reach position 50. The stop of the downward movement is determined by the capacitance value detected by the operation detector 13 not changing for a short time, for example, about 0.1 to 1 second. As yet another method, the velocity vector of the movement of the user's finger, that is, the movement speed and movement direction of the finger, may be detected from the change in capacitance, and, based on the fact that the direction of the finger's velocity vector has changed from downward to the opposite direction and that the velocity vector in the opposite direction has reached a predetermined magnitude, the position of the finger at the time the velocity vector of the predetermined magnitude in the opposite direction is detected may be determined as the reach position. In this method, if the predetermined magnitude of the velocity vector is set to approximately zero, the position of the finger at the moment the direction of the velocity vector changes from downward to the opposite direction, that is, the lowest position, is determined as the reach position; if the predetermined magnitude is set to a value other than zero, a position a predetermined distance above the lowest position is determined as the reach position. As described above, the reach position is determined to be the lowest position of the finger, or a position in its vicinity, at the time the detection reference control unit 204 determines that the finger has operated the display position of the icon.
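The velocity-vector method just described can be sketched as follows; the sampled positions and the zero speed threshold are illustrative assumptions.

```python
# Sketch of the velocity-vector method: the reach position is the finger
# position at the moment the velocity changes from downward to upward and
# the upward speed exceeds a predetermined magnitude. Sample data and the
# threshold are illustrative, not values from the patent.

def reach_position(samples, min_upward_speed=0.0):
    """samples: list of (time_s, height_cm); return the reach position.

    Downward movement means decreasing height. The reach position is the
    height at the first sample after downward movement whose upward
    velocity exceeds the predetermined magnitude; None if no reversal.
    """
    went_down = False
    for (t0, h0), (t1, h1) in zip(samples, samples[1:]):
        v = (h1 - h0) / (t1 - t0)   # cm/s; positive = moving upward
        if v < 0:
            went_down = True
        elif went_down and v > min_upward_speed:
            return h0               # position where the reversal occurred
    return None

samples = [(0.0, 12.0), (0.1, 10.0), (0.2, 8.0), (0.3, 8.0), (0.4, 9.0)]
print(reach_position(samples))  # 8.0: the lowest position of the finger
```

Setting `min_upward_speed` to a value above zero shifts the determined position upward, matching the passage's remark about a non-zero predetermined magnitude.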
(Modification 1 of the first embodiment)
The display device 1 of Modification 1 may calculate the speed or acceleration of the user's fingertip based on the detection output of the operation detector 13, predict the reach position of the finger based on the calculated speed or acceleration, and change the detection reference based on the predicted reach position. FIG. 9 is a block diagram showing, of the configuration of the display device 1 of Modification 1, the control unit 20 and the display 11 and operation detector 13 controlled by the control unit 20.
In the above description, the speed and acceleration detection unit 206 reads out the capacitance value detected by the operation detector 13 at predetermined time intervals, calculates the movement speed of the finger from the change in the capacitance value per predetermined time, and calculates the movement acceleration of the finger from the calculated speed. However, the method is not limited to this, and an imaging device may be used as the speed and acceleration detection unit 206. Further, although the movement speed or acceleration of the user's finger is calculated in the above description, the target may instead be the user's foot or elbow, or a stylus pen held by the user.
Although the predicted reach position 60 of the user's finger is calculated based on the calculated movement speed and acceleration of the finger and the detection reference 40 is changed based on the predicted reach position 60, the predicted reach position 60 need not be determined for every operation. If the user moves unintentionally before performing an operation and the predicted reach position 60 is calculated from that movement, the detection reference cannot be set at an appropriate position; for example, the detection reference 40 might be set at an extremely high position. To prevent such cases, the predicted reach position 60 may be calculated only when the detected movement speed and acceleration of the user's finger are equal to or greater than a predetermined threshold, and the position of the detection reference 40 may then be changed based on the predicted reach position 60.
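The prediction with a speed threshold can be sketched as below, under the illustrative assumption that the descent decelerates at a constant rate; all constants are examples, not values from the patent.

```python
# Sketch of Modification 1: predict where the descending finger will stop
# from its current speed and deceleration, ignoring slow unintentional
# movements below a threshold. Constant deceleration is an assumption
# made only for this example.

SPEED_THRESHOLD_CM_S = 5.0  # ignore movements slower than this

def predicted_reach_position(height_cm, speed_cm_s, decel_cm_s2):
    """Predict the lowest height the fingertip will reach.

    speed_cm_s: current downward speed (positive = descending).
    decel_cm_s2: magnitude of the deceleration of the descent.
    Returns None when the movement is too slow to be intentional.
    """
    if speed_cm_s < SPEED_THRESHOLD_CM_S:
        return None  # likely an unintentional movement; keep the reference
    # Remaining travel under constant deceleration: v^2 / (2a).
    travel = speed_cm_s ** 2 / (2.0 * decel_cm_s2)
    return height_cm - travel

# Finger at 12 cm, descending at 10 cm/s, decelerating at 25 cm/s^2.
print(predicted_reach_position(12.0, 10.0, 25.0))  # 10.0
print(predicted_reach_position(12.0, 2.0, 25.0))   # None (below threshold)
```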
The display device 1 according to the first embodiment and Modification 1 detects or predicts the reach position in a single calibration process, changes the detection reference based on that reach position, stores the position data of that detection reference in the storage unit 205, and, in the aerial image operation mode, sets or changes the detection reference of the aerial image operation mode to the position of the detection reference stored in the storage unit 205. Instead, the display device according to Modification 2 stores in the storage unit 205 the positions of a plurality of detection references, each set in one of a plurality of calibration processes, and changes the detection reference in the aerial image operation mode based on the plurality of stored detection reference positions. In the first calibration process, the detection reference control unit 204 determines the reach position 50 of the finger based on the detection output of the operation detector 13, changes the detection reference 40 based on this reach position 50, and stores the position data of that detection reference 40 in the storage unit 205. The second calibration process is then performed, and the position data of the similarly changed detection reference is stored in the storage unit 205. A third calibration process may further follow. From the position data of the plurality of detection references stored in the storage unit 205 through the plurality of calibration processes performed successively in this way, a single detection reference is calculated, and the position data of the calculated detection reference is stored in the storage unit 205. In the aerial image operation mode executed thereafter, the detection reference is set to the position of the calculated detection reference stored in the storage unit 205.
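The combination of several calibration results can be sketched as follows; the passage does not fix how the single detection reference is calculated from the stored positions, so the arithmetic mean here is an assumption.

```python
# Sketch of Modification 2: store the detection-reference positions from
# several calibration runs and derive the operating detection reference
# from their combination (here the arithmetic mean, as one possibility).

from statistics import mean

stored_positions_cm = []  # stands in for the storage unit 205

def record_calibration(position_cm):
    stored_positions_cm.append(position_cm)

def operating_reference():
    """Detection reference used in the aerial image operation mode."""
    return mean(stored_positions_cm)

record_calibration(10.0)  # first calibration process
record_calibration(11.0)  # second calibration process
record_calibration(12.0)  # third calibration process
print(operating_reference())  # 11.0
```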
Furthermore, rather than changing the detection reference in every calibration process, the number of times the operation on the icon display position failed may be calculated over a plurality of calibration processes from the number of times the reach position was determined and the number of times the reach position was determined to have actually reached the detection reference, and the detection reference may be changed when it is determined that the operation has failed a predetermined number of times or more.
In the first embodiment, the operation of the user's finger on the display position of the aerial image is detected, the reach position is determined, and the detection reference is changed based on the reach position. However, the user may instead designate the position of the finger at which the user felt that the display position of the aerial image icon was operated; the detection reference control unit determines this designated position and changes the detection reference based on it, thereby changing the positional relationship between the detection reference and the aerial image. A modification in which the user designates, as the designated position, the position at which the aerial image display position is operated is described below. The following description applies to the first calibration processing mode of the first embodiment, but it can also be applied to the second calibration processing mode and to Modifications 1 and 2 described above.
The height of the finger at the time the finger changes from downward movement to lateral movement and the lateral movement ends may be determined as the designated position 50A. Alternatively, the detection reference control unit 204 may use, as the designated position 50A, the average value or the median value of the finger height from the start of the lateral movement of the finger to the end of the lateral movement.
In the display device 1 of Modification 3, the user designates the position at which the user thought the icon display position was operated with the fingertip by changing the finger's movement from downward to lateral. In the display device 1 of Modification 4, the user designates the position at which the user thought the icon display position was operated with the fingertip by operating another icon. This calibration process is described next. The description applies to the first calibration processing mode of the first embodiment, but it can also be applied to the second calibration processing mode and to Modifications 1 to 3 described above.
Since the position at which the finger of the right hand operating the display position of the right icon 300C is felt to have operated it is determined as the designated position, a downward movement approaching the aerial image is required for the right hand. However, the finger of the left hand operating the display position of the left icon 300D only needs to be positioned above or below the icon 300D, so the left finger does not necessarily need to move downward; it may, for example, move in a direction parallel to the plane of the aerial image 300, that is, laterally, to a position above or below the icon 300D.
It is not always necessary to use a finger of the left hand and a finger of the right hand; it suffices that the above movements are detected both on the icon 300C and on the icon 300D of the calibration aerial image 300. For example, two fingers of one hand may be used. Modification 4 may also be configured so that, instead of operating the display position of the icon 300D, the user presses a determination button (not shown) provided on the display device 1.
Furthermore, instead of determining as the designated position the position of the right fingertip at the time the user operates the display position of the icon 300D or presses the determination button (not shown), the position of the right fingertip at the time a predetermined gesture made with, for example, the left hand is detected may be determined as the designated position. In this case, the display device 1 includes the imaging device 18 (see FIG. 22) of Modification 8 described later, and detects, using the images acquired by the imaging device 18, that the user has made the gesture (for example, changing the hand from rock to paper).
The display device of Modification 5 allows the user to designate the designated position by stopping the finger for a predetermined time when the user thinks the icon display position has been operated with the fingertip. This modification is described as applied to the first calibration processing mode of the first embodiment, but it can also be applied to the second calibration processing mode and to Modifications 1 to 4 described above.
In this case, a message "Calibration will be performed. Point to the position of this icon and hold the pointing state for a while." is superimposed on the icon included in the calibration aerial image. When the user feels that the icon display position has been operated and stops moving the finger for a while, the operation detector 13 detects that the downward movement of the finger has stopped for the predetermined time. The detection reference control unit 204 determines the stop position of the finger as the designated position based on the detection output of the operation detector 13 at this time.
This designated position is determined as follows. It is determined that the operation on the display position of the icon 300A has been performed when the downward-moving fingertip F stops and stays within a relatively small predetermined stop range in the vertical direction for a predetermined time or longer. The reason the fingertip F staying within the predetermined stop range for the predetermined time or longer is taken as the operation on the display position of the icon 300A is that, unlike an operation on a touch panel, the operation is on the display position of the icon 300A of the aerial image 300, so the fingertip F may not come to a complete stop at the display position of the icon 300A. The predetermined stop range for the designated position determination is set to a value sufficiently smaller than the capacitance detection range 13A of the operation detector 13, for example 5 mm, and the predetermined time is set to, for example, about 2 seconds.
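The stop-range determination can be sketched as below; the sample trajectory is illustrative, while the 5 mm range and 2 s dwell follow the values given above.

```python
# Sketch of Modification 5's determination: the fingertip is considered
# to have operated the icon when it stays within a small vertical stop
# range (example: 5 mm) for a predetermined time (example: 2 s), since a
# fingertip never stops completely in the air.

STOP_RANGE_MM = 5.0
DWELL_TIME_S = 2.0

def designated_position(samples):
    """samples: list of (time_s, height_mm); return the dwell height.

    Returns the mean height over the first window in which the fingertip
    stayed within the stop range for the dwell time, or None otherwise.
    """
    start = 0
    for end in range(len(samples)):
        heights = [h for _, h in samples[start:end + 1]]
        while max(heights) - min(heights) > STOP_RANGE_MM:
            start += 1
            heights = [h for _, h in samples[start:end + 1]]
        if samples[end][0] - samples[start][0] >= DWELL_TIME_S:
            return sum(heights) / len(heights)  # centre of the dwell
    return None

# Finger descends, then hovers near 80 mm for over two seconds.
samples = [(0.0, 120.0), (0.5, 100.0), (1.0, 81.0), (2.0, 80.0),
           (3.0, 79.0), (3.5, 80.0)]
print(designated_position(samples))  # 80.0
```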
The display device of Modification 6 allows the user to designate, by speech, the designated position at which the user thought the icon display position was operated with the fingertip. This modification is described as applied to the first calibration processing mode of the first embodiment, but it can also be applied to the second calibration processing mode and to Modifications 1 to 5 described above.
The display device 1 need not include the sound collector 14; sound data acquired by an external sound collecting device may be input wirelessly or by wire, and the sound detection unit 208 may perform sound detection using the sound data input from the external sound collecting device.
In the above, the detection reference has been described as a single plane or a plane with a step. However, the detection reference may be a region having a thickness rather than a plane. Calibration processing for a detection reference having such a region is described below. The description applies to the first calibration processing mode of the first embodiment, but it can also be applied to the second calibration processing mode and to Modifications 1 to 6 described above.
As long as the fingertip is positioned above the calibration icon, the reach position or the designated position can be determined even when the user's finger moves downward obliquely, that is, at an angle to the Z direction.
上記説明において、ユーザの指先の下方への移動を、静電容量パネルにより構成される操作検出器13によって検出したが、撮像装置によりユーザの指先の位置を検出してもよい。変形例8の表示装置1は、図22に示すように、操作検出器として撮像装置(例えばデジタルカメラ)18を備え、この撮像装置18は表示装置1の上面に配置される。このような表示装置1のブロック図を図23に示す。
変形例8の表示装置1においても、図24に示すように、空中像30が表示装置1の結像光学系12の上方に距離H1だけ離れた位置に形成され、検出基準40は、結像光学系12の上方に距離H2(H1<H2)だけ離れた位置に設定される。撮像装置18は、結像光学系12の表面から上方にユーザの指先の位置を検出するための検出範囲13Aを有する。図24では、撮像装置18の上方に撮像可能な範囲の限界を破線13aで示し、この検出限界13aと結像光学系12の表面との間隔が検出範囲13Aとして示される。変形例8においても、上記の第1の実施の形態や変形例1~7の場合と同様に、空中像30と検出基準40とが検出範囲13A内に位置するように設定される。なお、検出基準24は、図24では空中像30の上方に設定されているが、検出範囲13A内であれば、空中像30の下方でも、または空中像30の位置に一致させてもよい。また、変形例8においても、検出範囲13A内のうち検出基準40に設定された領域以外の範囲は検出基準外41である。なお、検出範囲13Aは、撮像装置18の撮像可能な範囲の限界として設定されるものに限定されず、撮像可能な範囲のうち一部の範囲(たとえば図24中の左右方向の端部の所定範囲)を除いた範囲として設定されてもよい。
図面を参照しながら、第2の実施の形態に係る表示装置1について説明する。第2の実施の形態においては、本実施の形態の表示装置1が携帯電話に組み込まれた場合を一例に挙げて説明を行う。なお、本実施の形態の表示装置は、携帯電話に限らず、タブレット端末、腕時計型端末等の携帯型情報端末装置、パーソナルコンピュータ、音楽プレイヤ、固定電話、ウエアラブル装置等の電子機器に組み込むことが可能である。
Although step S201 of the flowchart in FIG. 26 judges whether the user has operated the user-information input mode operation button, this processing is not strictly necessary, and the flow may move to step S204 once the device has acquired the user information.
The second embodiment can be modified as follows. Instead of entering user information into the display device 1, the user may enter it into an information input device separate from the display device 1, with the information transferred to the display device 1 via an interface. The user information may also be recorded in advance on an IC card, in which case the display device 1 or the information input device preferably has a card-reading function.
The display device 1 according to the third embodiment is described with reference to the drawings. The third embodiment is described taking as an example the case where the display device 1 of this embodiment is incorporated in a mobile phone. The display device of this embodiment is not limited to mobile phones and can be incorporated in electronic apparatuses such as portable information terminal devices including tablet terminals and wristwatch-type terminals, personal computers, music players, fixed telephones, and wearable devices.
The display device 1 of this embodiment is similar to the display device 1 shown in FIG. 1, and the configuration of its main part is represented by the block diagram in FIG. 27. The display device 1 of this embodiment includes a control unit 20, and a display 11, an operation detector 13, and an environment detection unit 19 controlled by the control unit 20. The environment detection unit 19 detects the usage environment around the display device 1. The control unit 20 includes an image generation unit 201, a display control unit 202, a calibration unit 203, a detection reference control unit 204, a storage unit 205, and an environment analysis unit 211.
The display device 100 according to the fourth embodiment is described with reference to the drawings. The display device 100 of this embodiment differs from the first embodiment in the configuration of the operation detector. FIG. 29 shows the display device according to this embodiment: FIG. 29(a) is a sectional view explaining the schematic configuration of the operation detector 113 of the display device 100 according to the fourth embodiment, and FIG. 29(b) is a perspective view of an automated teller machine (ATM) 200 as an example of an electronic apparatus incorporating the display device 100. In the automated teller machine 200, the display device 100 is mounted on the front panel on which the user enters a PIN, amounts, and the like. The display device 100 is not limited to automated teller machines and can be widely incorporated into various automatic ticket vending machines for train or bus tickets and commuter passes, and various information search terminals in libraries, museums, and the like. For convenience of description, a coordinate system with X, Y, and Z axes is set for the display device 100 as illustrated.
Furthermore, as shown in FIG. 21(c), the detection reference can also be changed to a detection reference having the width d2. For example, a light-emitting element 116a and a light-receiving element 117a above the reach position 50 are selected, and the plane containing the light-emitting element 116a and light-receiving element 117a is taken as the upper surface 401, while a light-emitting element 116c and a light-receiving element 117c below the reach position 50 are selected, and the plane containing the light-emitting element 116c and light-receiving element 117c is taken as the lower surface 402. That is, of the plural pairs of light-emitting elements 116 and light-receiving elements 117, the upper surface 401 may be set by the plane containing one pair and the lower surface 402 by the plane containing another pair.
As described above, this embodiment changes the position of the detection reference by selecting a detection reference, that is, by selecting one or more pairs of light-emitting elements 116 and light-receiving elements 117 based on the reach position from among the multiple detection references that can be set by the plural installed light-emitting elements 116 and light-receiving elements 117.
The display device 100 may also include an actuator and an encoder and move the light-emitting elements 116 and light-receiving elements 117 by minute distances in the Z direction. For example, when changing the detection reference 42 to a position at the distance d1 from the reach position 50, the light-emitting element 116 and light-receiving element 117 closest to that position are selected. Based on the difference between the position at which the selected light-emitting element 116 and light-receiving element 117 are arranged and the position at the distance d1 from the reach position 50, the actuator moves the light-emitting element 116 and light-receiving element 117 to fine-tune their positions. By this fine adjustment of the positions of the light-emitting element 116 and light-receiving element 117, the detection reference 42 can be changed to a position closer to the position at the distance d1 from the reach position 50.
The fourth embodiment can be modified as follows. The display device 100 of the fourth embodiment was described as having the operation detector 113 with light-emitting elements 116 and light-receiving elements 117 arranged two-dimensionally in multiple tiers in the Z direction. However, the two-dimensionally arranged light-emitting elements 116 and light-receiving elements 117 may form only a single tier. FIG. 33 shows a display device 100 equipped with such an operation detector 113'. The operation detector 113' has a frame-shaped housing 115'; of the four faces constituting the inner surface of the frame-shaped housing 115', two adjacent faces carry a row of plural light-emitting elements 116 parallel to the XY plane, and the remaining two adjacent faces carry a row of plural light-receiving elements 117 parallel to the XY plane. That is, the operation detector 113' consists of only one tier of the six-tier operation detector 113 described with FIG. 32. An actuator 119 is connected to the housing 115' and reciprocates it in the Z direction at a predetermined cycle (for example, 10 cycles per second). The position of the housing 115' is detected by a position sensor built into the actuator 119, for example an encoder (not illustrated). In this case, a predetermined position within the range over which the housing 115' can reciprocate is set as the detection reference 40.
The embodiments and variations above changed the positional relationship between the detection reference and the display position of the aerial image by controlling or changing the detection reference through calibration processing. A fifth embodiment, which changes the positional relationship between the detection reference and the display position of the aerial image by changing the display position of the aerial image through calibration processing, is described next.
FIGS. 35 and 36 show the display device of the fifth embodiment. Like the display device of the first embodiment, the display device 1 of the fifth embodiment includes, as shown in FIG. 35, a main body 10 housing a control unit 20, a display 11, an image forming optical system 12, and an operation detector 13, and, as shown in FIG. 36, an image generation unit 201, a display control unit 202, a calibration unit 203, a detection reference control unit 204, and a storage unit 205. In addition to the above configuration, the display device 1 of the fifth embodiment further includes a display position change unit 500 and a display position control unit 220.
The display 11, image forming optical system 12, and operation detector 13 can be identical in configuration to those of the first embodiment shown in FIG. 1, but in this embodiment the image forming optical system 12 or the display 11 is configured to be movable along the optical axis of the image forming optical system 12, as described above.
In the following description, an example is described in which a drive unit such as a motor or an actuator moves the image forming optical system 12 along its optical axis as indicated by the arrow, thereby moving and changing the display position of the aerial image 30 formed by the image forming optical system 12 in the Z-axis direction, that is, along the optical axis. The invention is not limited to this: the display position control unit 220 may instead control the display 11 to display an image for viewing with the right eye and an image for viewing with the left eye having parallax with respect to the right-eye image, thereby changing the depthwise display position of the aerial image 30.
When the finger's reach position 50 or designated position 50A is located below the detection reference 40 as shown in FIG. 37(b), the display position control unit 220 and display position change unit 500 calculate the interval ΔH between the reach position 50 or designated position 50A and the detection reference 40 and move the display position of the aerial image 300 upward, for example by the interval ΔH, to the display position 300 shown by the dotted line.
Furthermore, when the finger's reach position 50 or designated position 50A coincides with the detection reference 40 or lies in its vicinity, the display position control unit 220 and display position change unit 500 do not move the display position of the aerial image 300.
The display position control unit 220 and display position change unit 500 move the display position of the aerial image 300 downward when the reach position 50 or designated position 50A is above the detection reference 40 and upward when it is below the detection reference 40; the amount of movement need not equal the interval ΔH between the reach position 50 or designated position 50A and the detection reference 40 as described above and may be larger or smaller than ΔH, as explained in the first embodiment.
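The shift rule described above can be illustrated by the following sketch; the function name, the "near the reference" tolerance of 2 mm, and the adjustable gain are illustrative assumptions (the text only requires that the shift may equal, exceed, or fall short of ΔH):

```python
def aerial_image_shift(reach_z, reference_z, near_mm=2.0, gain=1.0):
    """Return the signed Z shift (mm) to apply to the aerial image.

    reach_z / reference_z: heights (mm) of the reach (or designated)
    position and of the detection reference.  A positive result moves the
    image up, a negative result moves it down, 0.0 leaves it in place.
    """
    delta_h = reference_z - reach_z          # interval ΔH
    if abs(delta_h) <= near_mm:              # at or near the reference
        return 0.0
    # gain = 1.0 reproduces a shift equal to ΔH; the specification allows
    # a larger or smaller amount, so the gain is left adjustable.
    return delta_h * gain
```

A reach position below the reference yields an upward (positive) shift, one above the reference a downward (negative) shift, and one near the reference no shift at all.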
The amount of aerial-image movement determined by such statistical processing may, for example, be a value common to all users, or may differ by user age group or by gender. This method of determining the movement amount of the aerial image by statistical processing can also be applied when determining the movement amount of the detection reference in the first embodiment described above, where the detection reference is changed based on the reach position or designated position.
Although the first calibration processing mode was used as the example in describing the calibration processing, the processing can also be applied to the second calibration processing mode.
Moving the aerial image in calibration processing is also useful in the following situation: if changing the detection reference in calibration processing would leave the changed detection reference outside the detection range 13A of the operation detector 13 shown in FIG. 3, or near the upper or lower limit of the detection range 13A, that situation can be avoided by moving the aerial image instead of changing the detection reference.
Variation 1 of the display device of the fifth embodiment is described next.
While the display device of the fifth embodiment changed the display position of the aerial image based on the reach position or designated position in calibration processing, in Variation 1 of the fifth embodiment the display position control unit 220 and display position change unit 500 change the display position of the aerial image and, at the same time, the detection reference control unit 204 changes the position of the detection reference, both based on the reach position or designated position in calibration processing. Changing both the aerial-image display position and the detection reference position yields a positional relationship between the two suited to the user's operating characteristics. When it is difficult for the display position change unit 500 to move the aerial image with high precision to the proper display position determined from the reach position or designated position, the positional relationship between the aerial-image display position and the detection reference can be set properly by having the display position change unit 500 coarsely adjust the aerial-image display position and the detection reference control unit 204 finely adjust the detection reference.
Variation 2 of the display device of the fifth embodiment is described below. In the display device of Variation 2, when the display position control unit 220 and display position change unit 500 move the display position of the aerial image, the aerial-image display fades out and then fades in between the start and the end of the movement; that is, the display luminance is gradually reduced as the movement starts and then gradually increased toward the end of the movement. Although the aerial image is moved as calibration processing, if an aerial image the user can see moves, the user may feel a sense of incongruity. By gradually reducing the display luminance as the movement starts, the movement becomes harder for the user to perceive, reducing the user's sense of incongruity.
During the movement of the aerial image, the display control unit 202 can also lower the display luminance or contrast of the aerial image, blink the display with lowered luminance or contrast, or even turn the display off. Making the movement of the aerial image by the display position change unit 500 inconspicuous, that is, hard to see, in this way reduces the user's sense of incongruity.
Conversely, a display mode that makes the aerial image itself stand out during its movement may be used, for example raising the display luminance or contrast of the aerial image or blinking its display during the movement. Such a display mode makes the movement of the aerial image inconspicuous, and the user pays attention to the aerial image itself rather than to its movement. The user therefore stops noticing the movement, and the user's sense of incongruity is reduced.
The change in the display mode of the aerial image during its movement, as described above, is performed during the processing of step S306 in the flowchart of FIG. 38.
Making the display of the aerial image inconspicuous or conspicuous during its movement may be applied not to the whole aerial image but to a part of it, for example the calibration icon. Whether to make the movement of the aerial image conspicuous may also be made selectable according to the user's preference.
Variation 3 of the display device 1 of the fifth embodiment is described below. In the display device 1 of Variation 3, the change of the aerial-image display position during calibration processing is started by a user operation. In this case, after the user's operation ends, the display position control unit 220 controls the display position change unit 500 to start changing the display position of the aerial image.
Although the first calibration processing mode was used as the example in describing the calibration processing, the processing can also be applied to the second calibration processing mode.
Variation 4 of the display device 1 of the fifth embodiment is described below. In the display device 1 of Variation 4, the user designates by voice the timing at which the change of the aerial-image display position starts during calibration processing. FIG. 40 is a block diagram showing, of the configuration of the display device 1 in Variation 4, the display 11 and the operation detector 13 controlled by the control unit 20. This display device 1 includes the sound collector 14 of Variation 6 of the first embodiment shown in FIG. 17, and the control unit 20 is provided with an audio detection unit 208.
Although the first calibration processing mode was used as the example in describing the calibration processing, the processing can also be applied to the second calibration processing mode.
Alternatively, the display device 1 may lack the sound collector 14, and audio data acquired by an external sound-collecting device may be input wirelessly or by wire, with the audio detection unit 208 performing audio detection using the audio data input from the external device.
If, as described above, the aerial image is not moved until the user says "yes" even though the detection reference control unit 204 has judged the finger's reach position or designated position, and is moved upon detecting that the user has said "yes", the user may repeat the operation on the aerial-image display position several times before saying "yes". In such a case, when the user says "yes", the aerial image is moved based on, for example, an average value such as the arithmetic or geometric mean of the multiple reach positions or designated positions, the median of the multiple reach positions 50, or the last of the multiple reach positions or designated positions.
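The aggregation strategies named above (arithmetic mean, geometric mean, median, last value) can be sketched as follows; the function name and the `method` keyword are illustrative assumptions:

```python
import statistics

def aggregate_positions(positions, method="median"):
    """Reduce repeated reach/designated positions (mm) to one value.

    The strategies mirror those named in the text: arithmetic mean,
    geometric mean, median, or simply the last operation.
    """
    if method == "mean":
        return statistics.mean(positions)
    if method == "geometric":
        return statistics.geometric_mean(positions)
    if method == "median":
        return statistics.median(positions)
    if method == "last":
        return positions[-1]
    raise ValueError(f"unknown method: {method}")
```

The median is a reasonable default, since one stray operation far from the others then has little effect on where the aerial image is moved.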
Variation 5 of the display device of the fifth embodiment is described below. The display device 1 of Variation 5 suspends movement of the aerial image while the user is viewing it and moves the aerial image when the user looks away from it. For this purpose, the display device 1 includes an image-capturing device such as a camera, as in Variation 8 of the first embodiment; the image-capturing device photographs the user during calibration processing, and the control unit 20 analyzes the captured image data and judges, from the orientation of the user's face or body obtained by the analysis, whether the user is viewing the aerial image. The display position control unit 220 and display position change unit 500 move the aerial image when the user is not viewing it. Although the aerial image is moved as calibration processing, if an aerial image the user can see moves, the user may feel a sense of incongruity. By moving the aerial image when the user has looked away from it, the movement is not seen by the user and the user's sense of incongruity is reduced.
Although the first calibration processing mode was used as the example in describing the calibration processing, the processing can also be applied to the second calibration processing mode.
The line-of-sight detector or image-capturing device described above need not be provided in the display device 1. The line-of-sight detector may be installed outside the display device 1 and transmit the line-of-sight detection result to the display device 1 by wireless communication or cable. Likewise, the image-capturing device may be installed outside the display device 1 and transmit the imaging data to the display device 1 by wireless communication or cable.
In the description above, the display position of the aerial image was changed when it was judged that the user was not viewing the aerial image; conversely, the display position control unit 220 and display position change unit 500 may perform control so as to change the display position of the aerial image while the user is viewing it. In that case, since the user is certainly viewing the aerial image, the user can perceive how far the aerial image has moved, and the user can be prompted to change the position at which he or she operates.
In the description above, control was performed to change the display position of the aerial image when the user was not viewing it, but control may instead be performed to change the display position of the aerial image based on the value of acquired biometric information about the user. For example, the user's pulse rate is acquired as biometric information: before using the apparatus, the user is asked to attach a device for acquiring the pulse rate. The display position control unit 220 and display position change unit 500 may then perform control to change the display position of the aerial image when the user's pulse rate rises. A rising pulse rate may mean the user is frustrated because the operation is not going well; changing the display position of the aerial image in such a case lets the user use the apparatus comfortably.
Variation 6 of the display device of the fifth embodiment is described below. The display device 1 of Variation 6 can change the movement speed of the aerial image during calibration processing: it can move the aerial image either extremely fast or slowly. The display position control unit 220 and display position change unit 500 move the aerial image either at an extremely high speed equal to or above a first predetermined value or at a low speed equal to or below a second predetermined value smaller than the first predetermined value. Moving the aerial image extremely fast or slowly in this way makes the movement of its display position hard for the user to see, and thus reduces the user's sense of incongruity. The user may also be allowed to choose, for example with a selection switch, between moving the aerial image extremely fast and moving it slowly. Furthermore, when the distance over which the display position of the aerial image is changed is large, the change may be conspicuous to the user, so the first and second predetermined values may be changed based on the distance of the change: for example, when moving the display position by a predetermined distance or more, the speed of the first predetermined value may be made faster, and the speed of the second predetermined value slower, than when moving it by less than the predetermined distance.
The display device of the sixth embodiment is described next with reference to FIG. 42. The display device described has the same configuration as the display device of the fifth embodiment shown in FIGS. 35 and 36. The difference from the fifth embodiment described above is that, as shown in FIG. 42, a first detection reference 40a and a second detection reference 40b are initially set on either side of the aerial image 30. In FIG. 42, the aerial image 30 is positioned midway between the first and second detection references 40a and 40b, that is, the distance between the aerial image 30 and the first detection reference 40a equals the distance between the aerial image 30 and the second detection reference 40b; however, the two distances need not be set equal. An icon 30A is displayed in the aerial image 30.
In the aerial image operation mode, when the finger F is moved down toward the icon 30A, the operation detector 13 detects the downward movement of the finger F. When the finger F reaches the first detection reference 40a, the detection reference control unit 204 judges, based on the detection output of the operation detector 13, that the finger F has reached the first detection reference 40a, and in response to this judgment the display control unit 202 changes the display mode of the icon 30A. The change may be a highlight such as increased brightness or a blinking display, or a change of display color. This change in the display mode of the icon 30A lets the user confirm that the finger is selecting the icon 30A.
Since the reach position 50 or designated position 50A is located above the first detection reference 40a by the distance ΔH, the display position control unit 220 moves the display position of the aerial image 30 downward by the distance ΔH, that is, to the position 30 shown by the dotted line.
The second calibration processing mode has been described above; the first calibration processing mode is similar.
The display device of the sixth embodiment described above changed the positional relationship between the first detection reference 40a and the aerial image 30 by moving the aerial image 30 and, with this movement of the aerial image 30, moved the second detection reference 40b so that the distance between the aerial image 30 and the second detection reference 40b became approximately equal to the distance between the aerial image 30 and the first detection reference 40a. Variation 1 of this embodiment is described next. In Variation 1, the positional relationship between the first detection reference 40a and the aerial image 30 is changed by moving the aerial image 30, as in the display device of the sixth embodiment, but the positional relationship between the aerial image 30 and the second detection reference 40b is changed by moving the second detection reference 40b based on the finger's reach position or designated position with respect to the second detection reference.
In FIG. 45(a), for example, when the user moves the finger F down toward the first icon 30A of the aerial image 30 and feels that a selection operation on the icon 30A has been performed, that is, when the user judges that the finger F has reached the first detection reference 40a, the descent of the user's finger F stops, for example. The detection reference control unit 204 judges the finger's reach position 50 or designated position 50A based on the detection output of the operation detector 13. Since the reach position 50 or designated position 50A is located above the first detection reference 40a by the distance ΔH, the display position control unit 220 moves the display position of the aerial image 30 downward, for example by approximately the distance ΔH, that is, to the position 30 shown by the dotted line. This downward movement of the aerial image 30 changes, that is, calibrates, the positional relationship between the aerial image 30 and the first detection reference 40a.
When the finger's reach position 50 or designated position 50A with respect to the second detection reference 40b is below the second detection reference 40b, the detection reference control unit 204 moves the second detection reference 40b downward based on the finger's reach position 50 or designated position 50A. This downward movement of the second detection reference 40b changes, that is, calibrates, the positional relationship between the aerial image 30 and the second detection reference 40b. By thus changing the positional relationship between the aerial image 30 and the first detection reference 40a through movement of the aerial image 30 and then changing the positional relationship between the aerial image 30 and the second detection reference 40b through movement of the second detection reference 40b, both positional relationships can be changed together to appropriate ones.
Variation 2 of the display device of the sixth embodiment is described next. It differs from the sixth embodiment and Variation 1 described above in how the aerial image 30 is moved: the display position of the aerial image 30 is moved in step with the downward movement of the user's finger.
In Variation 2, when, in the aerial image operation mode, the user's finger moves down toward the icon 30A and reaches the first detection reference 40a, whereupon the display mode of the icon 30A changes, and the finger then descends further and reaches the display position of the aerial image 30, the display position control unit 220 judges from the detection output of the operation detector 13 that the finger has reached the display position of the aerial image 30 and moves the display position of the aerial image 30 in step with the finger's descent. The display position control unit 220 controls the display position of the aerial image 30 so that the display position of the aerial image and the position of the descending finger stay within a predetermined range; controlled in this way, the display position of the aerial image 30 can descend as if following the descending finger. Furthermore, by setting the display position of the aerial image 30 so as always to lie below the descending finger and controlling the display position of the aerial image 30 to descend in step with the finger, the display position control unit 220 can prevent the user's finger from passing through the aerial image 30.
When, through the finger's descent and the accompanying descent of the aerial image 30, the finger and the aerial image 30 reach the second detection reference 40b, the detection reference control unit 204 judges that the finger has reached the second detection reference 40b, and the display control unit 202 displays the playback image.
In this way, once the finger reaches the aerial image 30, the aerial image 30 descends following the finger's downward movement, so the user feels as though the finger's descent is being guided by the aerial image 30 to the second detection reference 40b, and the finger can reliably reach the second detection reference 40b.
The display device of the seventh embodiment is described. The display device according to this embodiment has the same configuration as the display device 100 of the fourth embodiment shown in FIGS. 29, 31, and 32 or the display device 100 of Variation 1 of the fourth embodiment shown in FIGS. 33 and 34. Like the display devices 1 of the fifth embodiment and its Variations 1 to 4 and the sixth embodiment and its Variation 1, the display device 100 of the seventh embodiment is also configured so that the display position of the aerial image can be changed.
As shown in FIG. 46, the display device 100 according to this embodiment includes, in addition to the configuration of the display device 100 of the fourth embodiment shown in FIG. 29, a display position change unit 500 and a display position control unit 220. The detection reference control unit 204 judges the finger's reach position 50 based on the detection output of the operation detector 13, as in the fifth embodiment and its Variations 1 to 4 and the sixth embodiment and its Variation 1. Based on the finger's reach position 50, the display position control unit 220 causes the display position change unit 500 to move the position of the aerial image 300 along the optical axis of the image forming optical system 112. In this case, the display position change unit 500 moves the aerial image 30 in the Z direction by moving the display 111 in the X direction: moving the display 111 toward the X-direction + side moves the aerial image 30 toward the Z-direction + side, and moving the display 111 toward the X-direction - side moves the aerial image 30 toward the Z-direction - side. Of course, the display position change unit 500 may instead translate the image forming optical system 112 without moving the display 111, or may move both the image forming optical system 112 and the display 111.
The embodiments and variations above changed the positional relationship between the detection reference and the display position of the aerial image by controlling or changing the detection reference and/or the aerial image based on the fingertip's reach position or designated position in calibration processing. Next, an eighth embodiment is described in which the detection reference is changed when a predetermined non-contact operation is not detected at the detection reference during calibration processing.
The display device 1 of this embodiment has the same configuration as the display device 1 of the first embodiment shown in FIGS. 1 and 2. In the aerial image operation mode of the display device 1 of the eighth embodiment, the display control unit 202, display 11, and image forming optical system 12 display in the air the aerial image 30 for the aerial image operation mode shown in FIGS. 47(a) and 47(b). In FIG. 47(a), the aerial image 30 includes, for example, two rectangular icons 30D and 30E.
For the rectangular-parallelepiped detection reference 42, the upper face is called the upper reference surface 42a, the lower face the lower reference surface 42b, the side face defined by the lengths W2 and D1 the side reference surface 42c, and the side face defined by the lengths W1 and D1 the side reference surface 42d. The exterior of the detection reference 42 is called outside the detection reference 41.
Although the detection reference 42 is described in this embodiment as having a rectangular-parallelepiped shape, it is not limited to that shape and may be spherical, cylindrical, prismatic, or any other shape.
The detection reference 42 corresponding to the icon 30E is likewise assumed to be a rectangular parallelepiped whose cross section corresponds to the shape of the icon 30E and which has a predetermined height, like the detection reference 42 corresponding to the icon 30D.
The predetermined non-contact operation 600 is not limited to those represented by the various movement trajectories of the finger F described above and may describe any other trajectory, as long as that movement trajectory (of the finger F or the hand) can be detected by the operation detector 13.
FIG. 49 illustrates cases in which the detection reference control unit 204 judges that, of the predetermined non-contact operations 600 described above, the non-contact operation 600A was performed at the detection reference 42. Predetermined non-contact operation 600A1 shows the case where the finger F moves downward by the distance L1 from the upper reference surface 42a, then makes a U-turn and moves upward by the distance L1, so that the finger F reaches the upper reference surface 42a. Predetermined non-contact operation 600A2 shows the case where the finger F moves downward by the distance L1 starting midway between the upper reference surface 42a and the lower reference surface 42b, then makes a U-turn and moves upward by the distance L1. Predetermined non-contact operation 600A3 shows the case where the finger F moves downward by the distance L1, makes a U-turn at the lower reference surface 42b, and moves upward by the distance L1.
As described above and shown in FIG. 49, the detection reference control unit 204 judges that the predetermined non-contact operation 600A was performed at the detection reference 42 when the downward movement by the distance L1, the U-turn, and the upward movement by the distance L1 of the predetermined non-contact operation 600A all take place within the detection reference 42. That is, the detection reference control unit 204 detects the predetermined non-contact operation 600A at the detection reference 42.
In FIG. 49, outside the detection reference 41 refers to the external space outside the detection reference 42; in detail, it is the external space other than the space enclosed, in FIG. 47(c), by the upper reference surface 42a, lower reference surface 42b, side reference surface 42c, and side reference surface 42d of the detection reference 42.
In the following description of this embodiment and its variations, the predetermined non-contact operation 600A is used as the representative example, but the same techniques apply to the other non-contact operations 600B, 600C, and so on.
FIG. 50 shows examples in which the whole of the predetermined non-contact operation 600A is detected outside the detection reference 41. In FIG. 50(a), the whole of the predetermined non-contact operation 600A by the finger F is performed at a position above the upper reference surface 42a of the detection reference 42. In this case, the whole of the predetermined non-contact operation 600A is detected outside the detection reference 41 by the operation detector 13 and the detection reference control unit 204.
In FIG. 50(b), the whole of the predetermined non-contact operation 600Aa by the finger F is performed below the lower reference surface 42b of the detection reference 42, and the whole of the predetermined non-contact operation 600Ab by the finger F is performed outside the side reference surface 42c of the detection reference 42. In these cases too, the whole of the operation 600Aa and the whole of the operation 600Ab are each detected outside the detection reference 41 by the operation detector 13 and the detection reference control unit 204. The method of detecting the predetermined non-contact operation 600 outside the detection reference 41 with the operation detector 13 and the detection reference control unit 204 is as follows. First, the operation detector 13 sequentially detects the movement of the finger F. Then, based on the detection output of the operation detector 13, the detection reference control unit 204 judges whether the movement trajectory of the finger F corresponds to the predetermined non-contact operation 600 and where the trajectory lies (at the detection reference 42, outside the detection reference 41, or partly in both). Based on these judgment results, the predetermined non-contact operation 600 can be detected outside the detection reference 41.
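The three-way judgment described above (trajectory wholly at the detection reference, wholly outside it, or partly in both) can be illustrated by the following sketch, which models the detection reference 42 as a one-dimensional height band; the function name and the band representation are illustrative assumptions:

```python
def classify_trajectory(heights, ref_lo, ref_hi):
    """Classify where a finger trajectory lies relative to the detection
    reference 42, modelled here as the height band [ref_lo, ref_hi] (mm).

    Returns 'inside', 'outside', or 'both', mirroring the three cases the
    detection reference control unit distinguishes in the text.
    """
    inside = [ref_lo <= z <= ref_hi for z in heights]
    if all(inside):
        return "inside"
    if not any(inside):
        return "outside"
    return "both"
```

A down-and-up (U-turn) trajectory sampled as a list of heights then falls into exactly one of the three cases used by the calibration logic.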
In FIG. 51(a), when the user moves the finger F downward to operate the display position of the icon 30D of the aerial image 30 and the finger F reaches the upper limit 13a of the detection range 13A of the operation detector 13, the operation detector 13 sequentially detects the finger's downward movement and successively stores the detection outputs accompanying the finger's movement in the storage unit 205. Based on the detection outputs of the operation detector 13 stored in the storage unit 205, the detection reference control unit 204 judges whether the movement trajectory of the finger F corresponds to the predetermined non-contact operation 600A and whether the entire movement trajectory of the finger F lies within the detection reference 42.
FIG. 54 shows examples in which part of the predetermined non-contact operation 600A is detected outside the detection reference 41. In FIG. 54(a), part of the predetermined non-contact operation 600A by the finger F, namely the portion corresponding to the distance ΔH10, is performed above the upper reference surface 42a of the detection reference 42, and the remainder is performed within the detection reference 42. In other words, the part of the predetermined non-contact operation 600A detected at the detection reference 42 and the part detected outside the detection reference 41 together make up the predetermined non-contact operation 600A.
In this case, part of the predetermined non-contact operation 600A is detected outside the detection reference 41 by the operation detector 13 and the detection reference control unit 204.
In FIG. 54(b), part of the predetermined non-contact operation 600Aa by the finger F, namely the portion corresponding to the distance ΔH10, is performed below the lower reference surface 42b of the detection reference 42, and the remainder is performed within the detection reference 42. In other words, the part of the predetermined non-contact operation 600Aa detected at the detection reference 42 and the part detected outside the detection reference 41 together make up the predetermined non-contact operation 600Aa.
Likewise, part of the predetermined non-contact operation 600Ab by the finger F, namely the portion corresponding to the distance ΔH10, is performed outside the side reference surface 42c of the detection reference 42, and the remainder is performed within the detection reference 42. In other words, the part of the predetermined non-contact operation 600Ab detected at the detection reference 42 and the part detected outside the detection reference 41 together make up the predetermined non-contact operation 600Ab.
In these cases too, part of the predetermined non-contact operation 600Aa or part of the predetermined non-contact operation 600Ab is detected outside the detection reference 41 by the operation detector 13 and the detection reference control unit 204.
The calibration processing for the case where, as with the predetermined non-contact operation 600A shown in FIG. 54(a), part of the operation 600A is performed at the detection reference 42 and the remainder above the upper reference surface 42a is the same as in the case of FIG. 51: the whole of the detection reference 42 is moved upward in the figure based on the distance ΔH10.
The calibration processing for the case where, as with the predetermined non-contact operation 600Aa shown in FIG. 54(b), part of the operation 600Aa is performed at the detection reference 42 and the remainder below the lower reference surface 42b is the same as in the case of FIG. 52: the whole of the detection reference 42 is moved downward in the figure based on the distance ΔH10.
The calibration processing for the case where, as with the predetermined non-contact operation 600Ab shown in FIG. 54(b), part of the operation 600Ab is performed at the detection reference 42 and the remainder outside the side reference surface 42c is the same as in the case of FIG. 53: the whole of the detection reference 42 is moved sideways based on the distance ΔH10.
Although the first calibration processing mode was used as the example in describing the calibration processing, the flowchart in FIG. 55 can also be applied to the second calibration processing mode.
In the eighth embodiment, the detection reference 42 was changed vertically and/or laterally based on the positional relationship between the spatial position at which the predetermined non-contact operation 600 was detected and the detection reference 42; that is, the vertical and/or lateral center position of the detection reference 42 was changed. In the display device 1 of Variation 1, the size of the width D1 of the detection reference 42 may be changed when changing the spatial positional relationship between the detection reference 42 and the predetermined non-contact operation 600. For example, when the predetermined non-contact operation 600A is detected outside the detection reference 41 above the detection reference 42 as shown in FIG. 50(a), only the upper reference surface 42a may be shifted upward by the change amount ΔH10 without changing the position of the lower reference surface 42b; that is, the vertical center position of the detection reference 42 may be changed by changing its width D1. Alternatively, the upper reference surface 42a may be shifted upward by the change amount ΔH10 and the lower reference surface 42b downward by the change amount ΔH10; that is, by changing the width D1 by the same change amount ΔH10 in both vertical directions, the detection reference 42 may be changed without changing its vertical center position. When the predetermined non-contact operation 600Aa is detected below the detection reference 42 as shown in FIG. 50(b), the position of the lower reference surface 42b may be shifted downward by the change amount ΔH10, or the lower reference surface 42b may be shifted downward by ΔH10 and the upper reference surface 42a upward by ΔH10. When the predetermined non-contact operation 600Ab is detected to the right of the detection reference 42, the position of the side reference surface 42c may similarly be changed laterally; that is, the detection reference 42 may be changed by shifting its lateral center position, or its width may be changed without shifting the center position.
The display device 1 of Variation 2 is described. When the predetermined non-contact operation 600 performed during calibration processing is detected outside the detection reference 41, the display device 1 of Variation 2 changes the detection reference 42 only if the distance between the predetermined non-contact operation 600 and the detection reference 42 is no greater than a predetermined value. Take as an example the case shown in FIG. 50(a), in which the predetermined non-contact operation 600A is detected outside the detection reference 41 above the detection reference 42. If the interval ΔH10 is judged to be no greater than the predetermined value, that is, if the predetermined non-contact operation 600A is judged to have been performed near the detection reference 42, the display device 1 regards the user as having intended to operate the display position of the aerial image, and changes the detection reference 42. If the interval ΔH10 is judged to be greater than the predetermined value, that is, if the predetermined non-contact operation 600A is judged to have been performed far from the detection reference 42, the display device 1 regards the user as not having intended the operation on the display position of the aerial image, as having operated by mistake, or as having aborted the operation midway, and does not change the detection reference 42.
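The Variation 2 decision (shift the detection reference only when the operation landed near it, and leave it alone for distant or unintended operations) can be illustrated as follows for the FIG. 50(a) case of an operation wholly above the band; the function name, the band representation, and the 20 mm cutoff are illustrative assumptions:

```python
def maybe_shift_reference(band, op_bottom_mm, max_gap_mm=20.0):
    """band: (lo, hi) heights of the detection reference 42.  op_bottom_mm:
    lowest point (mm) of an operation performed wholly above the band.

    Shift the band up by the gap ΔH10 only when the operation was near the
    band (gap <= max_gap_mm); a distant operation is treated as unintended
    or aborted and leaves the band unchanged.
    """
    lo, hi = band
    gap = op_bottom_mm - hi                 # ΔH10
    if 0 < gap <= max_gap_mm:
        return (lo + gap, hi + gap)
    return band
```

An operation bottoming out just above the band pulls the band up to meet it; one far above the band is ignored.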
Variation 3 of the eighth embodiment may, like Variation 1 of the first embodiment, calculate the speed or acceleration of the user's fingertip based on the detection output of the operation detector 13 and change the position of the detection reference 42 based on the calculated speed or acceleration. That is, the detection reference 42 is changed based on the speed of at least part of the predetermined non-contact operation 600, in particular when the speed of part of the predetermined non-contact operation 600 is lower than a predetermined value. FIG. 57 is a block diagram showing, of the display device 1 of Variation 3, the control unit 20 and the display 11 and operation detector 13 controlled by the control unit 20.
Here, the speed of at least part of the predetermined non-contact operation 600 means the speed of at least part of the operations making up the predetermined non-contact operation 600. At least part of the operations means, for example, when the predetermined non-contact operation 600 moves from a position outside the detection reference 41 toward the detection reference 42 and then turns back (predetermined non-contact operation 600A), at least some section of the movement from outside the detection reference 41 toward the detection reference 42; or, when the predetermined non-contact operation 600 moves from a position within the detection reference 42 toward one end of the detection reference 42 and then turns back (predetermined non-contact operation 600A), at least some section of the movement toward that end.
Alternatively, the speed (acceleration) over the entire predetermined non-contact operation 600 (for example, from the start of the descent to the end of the subsequent ascent in the predetermined non-contact operation 600A) may be monitored, the average of that speed (acceleration) calculated, the strength of the operation judged from the average, and the detection reference 42 changed for subsequent operation detection. For example, if the operation speed is high on average, the operation may punch through the detection reference 42, so control may widen the width of the detection reference 42 from the next detection onward.
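The average-speed rule just described can be illustrated by the following sketch; the function name, the 300 mm/s threshold, and the 10 mm widening amount are illustrative assumptions (the text only says that a fast average speed should widen the detection reference for subsequent detections):

```python
def adjust_width_for_speed(band, speeds_mm_s, fast_mm_s=300.0, widen_mm=10.0):
    """Monitor the speed over a whole operation and, if the average is
    high, widen the detection reference band for subsequent detections,
    since a fast operation may punch through the band.

    band: (lo, hi) heights of the detection reference 42 (mm).
    speeds_mm_s: sampled speeds over the whole operation.
    """
    lo, hi = band
    if sum(speeds_mm_s) / len(speeds_mm_s) >= fast_mm_s:
        half = widen_mm / 2.0
        return (lo - half, hi + half)       # widen symmetrically
    return band
```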
When the movement speed and/or movement acceleration calculated by the speed/acceleration detection unit 206 is equal to or greater than a predetermined value, the operation prediction unit 211 may predict the movement trajectory of the finger F and change the detection reference 42. That is, if the movement trajectory of the finger F predicted when the movement speed and/or movement acceleration of the finger F is at or above the predetermined value does not lie within the detection reference 42, it is judged that the predetermined non-contact operation 600 will not be detected at the detection reference 42. In that case, the detection reference 42 is changed by the calculated change amount ΔH10, as in the eighth embodiment.
Although the first calibration processing mode was used as the example in describing the calibration processing, the processing can also be applied to the second calibration processing mode.
In the description above, the speed/acceleration detection unit 206 reads the capacitance value detected by the operation detector 13 at predetermined intervals, calculates the finger's movement speed from the change in capacitance per interval, and calculates the finger's movement acceleration from the calculated speed; but the method is not limited to this, and an image-capturing device may be used as the speed/acceleration detection unit 206. Also, although the movement speed or acceleration of the user's finger was calculated above, the target may instead be the user's foot or elbow, or a stylus pen held by the user.
The display devices 1 according to the eighth embodiment and its Variations 1 to 3 changed the position of the detection reference 42 in a single calibration process based on the positional relationship between the spatial position of the predetermined non-contact operation 600A and the detection reference 42; that is, one user operation triggered one calibration process. The display device 1 according to Variation 4 performs one calibration process based on multiple user operations: it changes the detection reference 42 based on the number of times the predetermined non-contact operation 600A was detected outside the detection reference 41 or the number of times the predetermined non-contact operation 600A was detected at the detection reference 42.
When the detection reference 42 is changed based on the results of multiple user operations, it may be changed when the arithmetic or geometric mean of the change amounts ΔH10 calculated for the individual user operations exceeds a predetermined threshold, or when the change amounts ΔH10 calculated for the individual user operations show an increasing trend.
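The multi-operation trigger just described can be illustrated by the following sketch; the function name, the 5 mm threshold, and the use of a strictly increasing sequence as the "increasing trend" test are illustrative assumptions:

```python
def needs_change(deltas_mm, threshold_mm=5.0):
    """Given the change amounts ΔH10 from several user operations, decide
    whether to trigger one calibration: when the arithmetic mean of the
    amounts exceeds a threshold, or when the amounts keep increasing
    (the two conditions named in the text)."""
    mean = sum(deltas_mm) / len(deltas_mm)
    increasing = all(a < b for a, b in zip(deltas_mm, deltas_mm[1:]))
    return mean > threshold_mm or increasing
```

Consistently large or steadily growing offsets trigger a change of the detection reference 42, while small, non-trending offsets are left alone.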
In the eighth embodiment described above, the predetermined non-contact operation 600 was an operation in which the user pushes the finger F in toward the display device 1, for example an operation in which the finger F makes a U-turn as shown in FIG. 48(a), but it is not limited to this. The predetermined non-contact operation 600 may be holding out three fingers at the display position, or a movement of the finger F toward the display device 1 in front of the body. The predetermined non-contact operation 600 may also be a motion in which the movement of the finger F stops for a predetermined time, for example 20 seconds.
In the embodiment described above, the detection reference control unit 204 judged whether the predetermined non-contact operation 600 had been performed based on the detection output of the operation detector 13. Some users, however, may not perform the predetermined non-contact operation 600 accurately, or may be unable to perform it well. For example, when the predetermined non-contact operation 600 is a 10 cm descent of the finger followed by a 10 cm ascent, some users may instead perform, as the non-contact operation, a 5 cm descent of the finger followed by a 5 cm ascent. When the predetermined non-contact operation 600 is holding out three fingers at the display position, some users' third finger may not open properly, so that only two fingers are held out. When the predetermined non-contact operation 600 is a movement of the finger F toward the display position in front of the body, some users may perform the movement of the finger F toward the display position beside the body. When the predetermined non-contact operation 600 is the movement of the finger F stopping for a predetermined time, for example 20 seconds, some users may move the finger before 20 seconds have elapsed, for example after about 15 seconds.
In such cases, even if, for example, the center position and detection width of the detection reference 42 are changed so that the whole of the user's operation can be detected at the detection reference 42, the user's operation cannot be recognized unless the operation the user is actually performing (the detected values representing the user's operation) matches the "predetermined non-contact operation 600" (the reference values representing the predetermined non-contact operation 600), no matter how the detection reference 42 (its position and width as described above) is set or changed. In such cases, by changing the reference values representing the predetermined non-contact operation 600 as the change of the detection reference 42, the user's operation can be recognized as the predetermined non-contact operation 600.
The detection reference 42 (the reference values representing the predetermined non-contact operation 600) may also be changed based on multiple operations by the user; that is, when a non-contact operation that differs from, but resembles, the predetermined non-contact operation 600 is performed multiple times, the reference values representing the predetermined non-contact operation 600 may be changed.
Changing the reference values representing the predetermined non-contact operation 600 is thus included in changing the detection reference 42.
The display devices 1 according to the eighth embodiment and its Variations 1 to 5 changed the detection reference 42 when, for example, the predetermined non-contact operation 600 or part of the predetermined non-contact operation 600 was detected outside the detection reference 41.
The detection reference 42 may also be changed when an operation instructing a change of the detection reference 42 is detected at the detection reference 42. The change of the detection reference 42 in this case includes changes to the position or width of the detection reference 42 and changes to the reference values representing the predetermined non-contact operation 600. For example, in Variation 6, a gesture instructing calibration is stored in the display device 1, and the detection reference 42 may be changed when the user performs the calibration-instructing gesture at the detection reference 42. The detection reference 42 may likewise be changed, in the same way as above, when the calibration-instructing gesture is detected outside the detection reference 41.
(Variation 7 of the eighth embodiment)
The detection reference control unit 204 may change the detection reference 42 based on sound. The change of the detection reference 42 in this case includes changes to the position or width of the detection reference 42 and changes to the reference values representing the predetermined non-contact operation 600. For example, the display device 1 has a sound collector 14 similar to that of Variation 6 of the first embodiment, and the control unit 20 includes an audio detection unit 208 that detects audio data input from the sound collector 14. In this case, the audio detection unit 208 has a well-known speech recognition function capable of recognizing speech other than "yes". When the user remarks or converses to the effect that the operation "cannot be performed", or says that he or she wants calibration, the display device 1 of Variation 7 detects the conversation or remark with the speech recognition function and changes the detection reference 42. Specifically, the detection reference 42 may be moved, or its width changed, so as to include the position of the user's finger when the sound (remark) was detected. Alternatively, upon detecting the sound (remark), the detection reference 42 may be moved toward the user by a predetermined amount, for example 1 cm, or its width may be changed. The reference values representing the predetermined non-contact operation 600 may also be changed to match the detected values of the user's operation when the sound (remark) was detected, or changed by a predetermined amount upon detecting the sound (remark). For example, if a value representing a "10 cm descent" operation is stored as the reference value for the predetermined non-contact operation 600, then upon detecting the sound (remark), a value representing a "9 cm descent" operation may be stored (updated) as the reference value for the predetermined non-contact operation 600.
Alternatively, the display device 1 may lack the sound collector 14, and audio data acquired by an external sound-collecting device may be input wirelessly or by wire, with the audio detection unit 208 performing audio detection using the audio data input from the external device.
The detection reference control unit 204 may change the detection reference 42 based on time. The change of the detection reference 42 in this case includes changes to the position or width of the detection reference 42 and changes to the reference values representing the predetermined non-contact operation 600. For example, when the predetermined non-contact operation 600 is not detected at the detection reference 42 within a predetermined time, the display device 1 of Variation 8 changes the detection reference 42 by a predetermined amount. To this end, the control unit 20 includes a timer; when no operation on an icon or the like occurs for the predetermined time after the power switch of the display device 1 is turned on, the detection reference control unit 204 changes the detection reference 42 by the predetermined amount based on the output of the timer that has measured the predetermined time. Likewise, when no operation on the next icon or the like occurs within the predetermined time after an operation on some icon or the like has been performed, the detection reference control unit 204 changes the detection reference 42 by the predetermined amount based on the output of the timer that has measured the predetermined time.
When the detection reference 42 is changed based on the measurement of the predetermined time in Variation 8, it is desirable to change the detection reference 42 so as to move it toward the user by a predetermined amount. For example, if no user operation is detected within the predetermined time, the center position (overall position) of the detection reference 42 may be moved toward the user by a predetermined amount, for example 1 cm, or the width of the detection reference 42 may be changed. The center position of the detection reference 42 may also be moved, or its width changed, so as to include the position of the user's finger at the elapse of the predetermined time. The reference values representing the predetermined non-contact operation 600 may also be changed to match the detected values of the user's operation at the elapse of the predetermined time, or changed by a predetermined amount once the predetermined time has elapsed. For example, if a value representing a "10 cm descent" operation is stored as the reference value for the predetermined non-contact operation 600, then once the predetermined time has elapsed, a value representing a "9 cm descent" operation may be stored (updated) as the reference value for the predetermined non-contact operation 600.
The detection reference control unit 204 may change the detection reference 42 based on the user's face. The change of the detection reference 42 in this case includes changes to the position or width of the detection reference 42 and changes to the reference values representing the predetermined non-contact operation 600. For example, a camera provided on the display device 1 of Variation 9 captures an image of the user's face, the control unit 20 analyzes the captured image, and the detection reference 42 is changed when a predetermined expression on the user's face is detected (recognition of the predetermined expression by a so-called face recognition function). The predetermined expression is, for example, the troubled look of a user who cannot operate the device well; when the user's troubled face is detected, the detection reference 42 is changed.
For example, when the face recognition function of the display device 1 detects that the user is having trouble, the detection reference 42 may be moved toward the user by a predetermined amount (for example 1 cm), or the width of the detection reference 42 may be changed. Alternatively, the detected values representing the operation the user was performing immediately before the troubled face was recognized may be stored, and the reference values representing the predetermined non-contact operation 600 may be changed based on those stored detected values.
The detection reference control unit 204 may change the detection reference 42 (the position or width of the detection reference 42, or the reference values representing the predetermined non-contact operation) when a gesture operation by the user is not detected at the detection reference 42. For example, when the user's gesture operation as the predetermined non-contact operation 600 is one of, say, rock, scissors, or paper hand motions, or a downward movement of the finger F followed by a lateral movement, the display device 1 of Variation 10 stores the feature information (reference values indicating the features) of each of those motions in advance in the storage unit 205. The display device 1 then detects the user's gesture operation and compares the detected gesture operation with one piece of feature information selected from the plural pieces of feature information stored in the storage unit 205, judging whether the gesture operation corresponds to that one predetermined non-contact operation. When the user's gesture operation is not detected at the detection reference 42, the display device 1 changes the detection reference 42; the change of the detection reference 42 in this case is a change in the selection of the reference values representing the predetermined non-contact operation 600. For example, suppose the display device 1 initially selects the feature information indicating "rock" as the reference values used for detection at the detection reference 42; when the user's operation cannot be detected with that detection reference 42, the display device 1 changes the selection from the feature information indicating "rock" to the feature information indicating a different operation among the plural gesture operations described above, for example "scissors".
When the predetermined non-contact operation 600 is a motion in which the position of the finger F coincides with a predetermined position, that predetermined position may lie within the detection reference 42, outside the detection reference 41, at the display position of an icon, or at the position of the detection reference 42 itself. When the predetermined position is within the detection reference 42, it is judged that the predetermined non-contact operation 600 was performed when the finger is within the detection reference 42. When the predetermined position is outside the detection reference 41, it is judged that the predetermined non-contact operation 600 was performed when the finger is outside the detection reference 41. When the predetermined position coincides with an icon's display position, it is judged that the predetermined non-contact operation 600 was performed when the finger F coincides with the display position of the icon in the aerial image, or when the display position of the icon is operated. When the predetermined position is the position of the detection reference 42, it is judged that the predetermined non-contact operation 600 was performed when the finger F passes through the boundary between the detection reference 42 and outside the detection reference 41, or when the finger passes through that boundary and then passes through it again.
In the eighth embodiment and its Variations 1 to 11, the detection reference 42 was described as having the width D1 in the vertical direction, but it may instead be formed of a surface like the detection reference 40 of the first embodiment. As shown in FIGS. 60(a) and 60(b), when a predetermined non-contact operation 600A involving a U-turn at a position at the distance L1 or beyond below the detection reference 40 is performed, it is detected that the predetermined non-contact operation 600A was performed at the detection reference 40. When the predetermined non-contact operation 600A is performed above the detection reference 40 (within the capacitance detection range 13A) as shown in FIG. 61(a), the predetermined non-contact operation 600A is detected outside the detection reference 41 using the operation detector 13; when the predetermined non-contact operation 600A is performed passing through the detection reference 40 as shown in FIG. 61(b), part of the predetermined non-contact operation 600A is detected outside the detection reference 41 using the operation detector 13. In the cases of FIGS. 61(a) and 61(b), the displacement amount ΔH10 may be calculated based on the distance from the detection reference 40, and the position of the detection reference 40 (the Z-direction position in FIG. 61) changed by this displacement amount ΔH10.
In the eighth embodiment and its Variations 1 to 12, the case of performing the predetermined non-contact operation 600 on the display position of an aerial image was described, but the invention is not limited to this example. For instance, when the predetermined non-contact operation 600 is performed in space on an image displayed on the display 11 of the display devices of the eighth embodiment and its Variations 1 to 12, the position of the detection reference 42 may also be changed based on the positional relationship between the spatial position of the predetermined non-contact operation 600 and the detection reference 42.
11, 111 display
12, 112 image forming optical system
13, 113 operation detector
14 sound collector
18, 118 image-capturing device
19 environment detection unit
20 control unit
116 light-emitting element
117 light-receiving element
119 actuator
201 image generation unit
202 display control unit
203 calibration unit
204 detection reference control unit
205 storage unit
206 speed/acceleration detection unit
207 stop position prediction unit
208 audio detection unit
209 image analysis unit
210 user information analysis unit
220 display position control unit
500 display position change unit
Claims (19)
- A control device comprising a control unit that controls a display in the air so as to change a positional relationship between the display and a detection device that detects a user's operation on the display,
wherein the control unit allows the positional relationship to be changed by the user. - The control device according to claim 1, wherein
the positional relationship is changed based on the user's operation detected at the detection reference or at a detection reference different from that detection reference. - The control device according to claim 1 or claim 2, wherein
the control unit controls the timing at which the positional relationship is changed by the user. - The control device according to claim 3, wherein
the control unit changes the positional relationship after the user's operation ends. - The control device according to claim 4, wherein
the user's operation is an operation by the user's finger, and
the control unit judges that the user's operation has ended when a detected change in the position of the finger is within a predetermined range. - The control device according to claim 3, wherein
the control unit changes the positional relationship when a voice uttered by the user is detected. - The control device according to claim 3, wherein
the control unit changes the positional relationship when the user is not viewing the display. - The control device according to claim 7, wherein
the control unit judges whether the user is viewing the display based on at least one of detected information on the user's line of sight, information on the orientation of the user's face, and information on the orientation of the user's body. - The control device according to any one of claims 1 to 8, wherein
the control unit changes the positional relationship by moving the display at a speed equal to or lower than a first predetermined value, or at a speed equal to or higher than a second predetermined value that is faster than the first predetermined value. - The control device according to any one of claims 1 to 9, wherein
when changing the positional relationship, the control unit changes the display mode of the display from the display mode before the change to a different display mode. - The control device according to claim 10, wherein
the control unit changes at least one of the luminance of the display and the contrast of the display. - The control device according to claim 10 or claim 11, wherein
the control unit changes the display mode of a part of the display. - The control device according to any one of claims 1 to 12, wherein
the detection reference includes a first detection reference and a second detection reference, and
the control unit controls the display based on the user's operation detected at the first detection reference to change a first positional relationship between the first detection reference and the display, and controls the second detection reference based on the user's operation detected at the second detection reference to change a second positional relationship between the second detection reference and the display. - The control device according to any one of claims 1 to 13,
comprising a display unit that displays the display in the air. - The control device according to claim 14,
comprising a moving unit that moves the display unit,
wherein the control unit controls the moving unit to move the display unit and thereby change the positional relationship. - The control device according to claim 14, wherein
the display unit displays in the air a stereoscopic image based on parallax, and
the control unit changes the positional relationship by controlling the parallax of the stereoscopic image displayed by the display unit. - An electronic apparatus comprising the control device according to any one of claims 1 to 16.
- A control method of controlling a display in the air and changing, according to the user, a positional relationship between the display and a detection reference for detecting the user's operation.
- A program that causes a computer to execute processing of controlling a display in the air and changing, according to the user, a positional relationship between the display and a detection reference for detecting the user's operation.
Priority Applications (8)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US15/123,893 US10748509B2 (en) | 2014-12-26 | 2014-12-26 | Control device, electronic apparatus, control method and program |
JP2016509809A JP6520918B2 (ja) | 2014-12-26 | 2014-12-26 | 制御装置、電子機器、制御方法およびプログラム |
KR1020167004326A KR20170105404A (ko) | 2014-12-26 | 2014-12-26 | 제어 장치, 전자 기기, 제어 방법 및 프로그램 |
CN201480046601.4A CN105934732B (zh) | 2014-12-26 | 2014-12-26 | 控制装置、电子设备、控制方法及程序 |
PCT/JP2014/084716 WO2016103522A1 (ja) | 2014-12-26 | 2014-12-26 | 制御装置、電子機器、制御方法およびプログラム |
EP14909123.3A EP3239818A4 (en) | 2014-12-26 | 2014-12-26 | Control device, electronic instrument, control method, and program |
CN202110375065.4A CN112965636A (zh) | 2014-12-26 | 2014-12-26 | 控制装置 |
TW104143546A TW201636796A (zh) | 2014-12-26 | 2015-12-24 | 控制裝置、電子機器、控制方法及程式 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2016103522A1 true WO2016103522A1 (ja) | 2016-06-30 |
Family
ID=56149604
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/084716 WO2016103522A1 (ja) | 2014-12-26 | 2014-12-26 | 制御装置、電子機器、制御方法およびプログラム |
Country Status (7)
Country | Link |
---|---|
US (1) | US10748509B2 (ja) |
EP (1) | EP3239818A4 (ja) |
JP (1) | JP6520918B2 (ja) |
KR (1) | KR20170105404A (ja) |
CN (2) | CN112965636A (ja) |
TW (1) | TW201636796A (ja) |
WO (1) | WO2016103522A1 (ja) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019197190A (ja) * | 2018-05-11 | 2019-11-14 | 富士ゼロックス株式会社 | 情報処理装置、情報処理システム及びプログラム |
JP6916568B1 (ja) * | 2021-01-29 | 2021-08-11 | 株式会社ツガワ | 端末処理装置 |
WO2022137940A1 (ja) * | 2020-12-24 | 2022-06-30 | マクセル株式会社 | 空間浮遊映像表示装置 |
JP7349200B1 (ja) | 2022-12-19 | 2023-09-22 | 株式会社ソリトンウェーブ | 非接触入力装置 |
Families Citing this family (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10503392B2 (en) * | 2015-06-25 | 2019-12-10 | Oath Inc. | User interface adjustment methods and systems |
US10722800B2 (en) | 2016-05-16 | 2020-07-28 | Google Llc | Co-presence handling in virtual reality |
US10592048B2 (en) * | 2016-05-17 | 2020-03-17 | Google Llc | Auto-aligner for virtual reality display |
US10459522B2 (en) * | 2016-06-15 | 2019-10-29 | Konkuk University Glocal Industry-Academic Collaboration Foundation | System and method for inducing somatic sense using air plasma and interface device using them |
CN110249297B (zh) * | 2017-02-09 | 2023-07-21 | 索尼公司 | 信息处理设备和信息处理方法 |
JP2019185169A (ja) * | 2018-04-03 | 2019-10-24 | 富士通コンポーネント株式会社 | 入力装置 |
KR102101565B1 (ko) * | 2018-08-21 | 2020-04-16 | 박주현 | 미디어 안내장치 |
EP3651003B1 (en) * | 2018-11-07 | 2022-04-06 | Vestel Elektronik Sanayi ve Ticaret A.S. | Touch-sensitive input device, screen and method |
US11481069B2 (en) * | 2020-09-15 | 2022-10-25 | International Business Machines Corporation | Physical cursor control in microfluidic display devices |
Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011108152A (ja) * | 2009-11-20 | 2011-06-02 | Fujitsu Toshiba Mobile Communications Ltd | 3次元入力表示装置 |
JP2011175617A (ja) * | 2010-01-29 | 2011-09-08 | Shimane Prefecture | 画像認識装置および操作判定方法並びにプログラム |
JP2012203736A (ja) * | 2011-03-25 | 2012-10-22 | Kyocera Corp | 電子機器、制御方法および制御プログラム |
JP2012216095A (ja) * | 2011-03-31 | 2012-11-08 | Sharp Corp | 検出領域拡大装置、表示装置、検出領域拡大方法、プログラムおよび、コンピュータ読取可能な記録媒体 |
Family Cites Families (27)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20060176294A1 (en) * | 2002-10-07 | 2006-08-10 | Johannes Vaananen | Cursor for electronic devices |
JP4274997B2 (ja) | 2004-05-06 | 2009-06-10 | アルパイン株式会社 | 操作入力装置および操作入力方法 |
US7893920B2 (en) | 2004-05-06 | 2011-02-22 | Alpine Electronics, Inc. | Operation input device and method of operation input |
CN101042621A (zh) * | 2006-03-20 | 2007-09-26 | 南京Lg同创彩色显示系统有限责任公司 | 写字板显示器的误差校正装置及其方法 |
JP4318056B1 (ja) | 2008-06-03 | 2009-08-19 | 島根県 | 画像認識装置および操作判定方法 |
JP4793422B2 (ja) | 2008-10-10 | 2011-10-12 | ソニー株式会社 | 情報処理装置、情報処理方法、情報処理システムおよび情報処理用プログラム |
TW201020901A (en) * | 2008-11-20 | 2010-06-01 | Ibm | Visual feedback for drag-and-drop operation with gravitational force model |
JP4701424B2 (ja) | 2009-08-12 | 2011-06-15 | 島根県 | 画像認識装置および操作判定方法並びにプログラム |
US8514255B2 (en) * | 2009-08-31 | 2013-08-20 | Namco Bandai Games Inc. | Information storage medium, image control device, and image control method |
US8933910B2 (en) * | 2010-06-16 | 2015-01-13 | Panasonic Intellectual Property Corporation Of America | Information input apparatus, information input method, and program |
CN101866243A (zh) | 2010-07-09 | 2010-10-20 | 苏州瀚瑞微电子有限公司 | 三维空间触控操作的方法及其手势 |
US20120056989A1 (en) * | 2010-09-06 | 2012-03-08 | Shimane Prefectural Government | Image recognition apparatus, operation determining method and program |
WO2012039140A1 (ja) | 2010-09-22 | 2012-03-29 | 島根県 | 操作入力装置および方法ならびにプログラム |
JP5494423B2 (ja) | 2010-11-02 | 2014-05-14 | ソニー株式会社 | 表示装置、位置補正方法およびプログラム |
JP5785753B2 (ja) | 2011-03-25 | 2015-09-30 | 京セラ株式会社 | 電子機器、制御方法および制御プログラム |
TW201248452A (en) * | 2011-05-30 | 2012-12-01 | Era Optoelectronics Inc | Floating virtual image touch sensing apparatus |
CN102566822A (zh) * | 2012-01-17 | 2012-07-11 | Suzhou Hanrui Microelectronics Co., Ltd. | Automatic calibration method for a touch screen
CN104471511B (zh) * | 2012-03-13 | 2018-04-20 | Eyesight Mobile Technologies Ltd. | Apparatus, user interface, and method for recognizing pointing gestures
KR20130109389A (ko) * | 2012-03-27 | 2013-10-08 | Park Seung-bae | Method for providing a personalized virtual keyboard
CN103376891A (zh) * | 2012-04-23 | 2013-10-30 | O2Micro (Wuhan) Co., Ltd. | Multimedia system, display device control method, and controller
JP6040564B2 (ja) * | 2012-05-08 | 2016-12-07 | Sony Corporation | Image processing device, projection control method, and program
US9524098B2 (en) * | 2012-05-08 | 2016-12-20 | Sonos, Inc. | Methods and systems for subwoofer calibration |
JP2014067071A (ja) | 2012-09-10 | 2014-04-17 | Askanet Co., Ltd. | Aerial touch panel
US20140191927A1 (en) * | 2013-01-09 | 2014-07-10 | Lg Electronics Inc. | Head mount display device providing eye gaze calibration and control method thereof |
JP6206180B2 (ja) * | 2013-12-27 | 2017-10-04 | Funai Electric Co., Ltd. | Image display device
CN103731526A (zh) * | 2014-01-21 | 2014-04-16 | Tang Jinghua | Smartphone capable of virtually imaging a three-dimensional picture suspended in the air
US10353207B2 (en) * | 2014-12-19 | 2019-07-16 | Sony Interactive Entertainment Inc. | Head-mounted display device and video display system |
2014
- 2014-12-26 CN CN202110375065.4A patent/CN112965636A/zh active Pending
- 2014-12-26 KR KR1020167004326A patent/KR20170105404A/ko not_active Application Discontinuation
- 2014-12-26 CN CN201480046601.4A patent/CN105934732B/zh active Active
- 2014-12-26 US US15/123,893 patent/US10748509B2/en active Active
- 2014-12-26 EP EP14909123.3A patent/EP3239818A4/en not_active Withdrawn
- 2014-12-26 JP JP2016509809A patent/JP6520918B2/ja active Active
- 2014-12-26 WO PCT/JP2014/084716 patent/WO2016103522A1/ja active Application Filing

2015
- 2015-12-24 TW TW104143546A patent/TW201636796A/zh unknown
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2011108152A (ja) * | 2009-11-20 | 2011-06-02 | Fujitsu Toshiba Mobile Communications Ltd | Three-dimensional input display device
JP2011175617A (ja) * | 2010-01-29 | 2011-09-08 | Shimane Prefecture | Image recognition device, operation determination method, and program
JP2012203736A (ja) * | 2011-03-25 | 2012-10-22 | Kyocera Corp | Electronic device, control method, and control program
JP2012216095A (ja) * | 2011-03-31 | 2012-11-08 | Sharp Corp | Detection area enlargement device, display device, detection area enlargement method, program, and computer-readable recording medium
Non-Patent Citations (1)
Title |
---|
See also references of EP3239818A4 * |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2019197190A (ja) * | 2018-05-11 | 2019-11-14 | Fuji Xerox Co., Ltd. | Information processing apparatus, information processing system, and program
WO2022137940A1 (ja) * | 2020-12-24 | 2022-06-30 | Maxell, Ltd. | Floating-in-space image display device
JP6916568B1 (ja) * | 2021-01-29 | 2021-08-11 | Tsugawa Co., Ltd. | Terminal processing device
JP2022117247A (ja) * | 2021-01-29 | 2022-08-10 | Tsugawa Co., Ltd. | Terminal processing device
JP7349200B1 (ja) | 2022-12-19 | 2023-09-22 | Solitonwave Co., Ltd. | Non-contact input device
Also Published As
Publication number | Publication date |
---|---|
EP3239818A4 (en) | 2018-07-11 |
US10748509B2 (en) | 2020-08-18 |
EP3239818A1 (en) | 2017-11-01 |
JPWO2016103522A1 (ja) | 2017-09-14 |
JP6520918B2 (ja) | 2019-05-29 |
CN112965636A (zh) | 2021-06-15 |
US20170025097A1 (en) | 2017-01-26 |
TW201636796A (zh) | 2016-10-16 |
CN105934732B (zh) | 2021-04-27 |
KR20170105404A (ko) | 2017-09-19 |
CN105934732A (zh) | 2016-09-07 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6365660B2 (ja) | Detection device, electronic apparatus, detection method, and program | |
WO2016103522A1 (ja) | Control device, electronic apparatus, control method, and program | |
JP6724987B2 (ja) | Control device and detection method | |
JP6733731B2 (ja) | Control device, program, and control method | |
JP6822472B2 (ja) | Display device, program, display method, and control device | |
US20220011900A1 | Detection device and program | |
JP6658809B2 (ja) | Detection device, electronic apparatus, detection method, and program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| ENP | Entry into the national phase | Ref document number: 20167004326; Country of ref document: KR; Kind code of ref document: A |
| ENP | Entry into the national phase | Ref document number: 2016509809; Country of ref document: JP; Kind code of ref document: A |
| 121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 14909123; Country of ref document: EP; Kind code of ref document: A1 |
| REEP | Request for entry into the european phase | Ref document number: 2014909123; Country of ref document: EP |
| WWE | WIPO information: entry into national phase | Ref document number: 2014909123; Country of ref document: EP |
| WWE | WIPO information: entry into national phase | Ref document number: 15123893; Country of ref document: US |
| NENP | Non-entry into the national phase | Ref country code: DE |