WO2022190406A1 - Terminal device (端末装置) - Google Patents
Terminal device
- Publication number
- WO2022190406A1 (PCT/JP2021/028304)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- screen
- terminal
- unit
- finger
- user
- Prior art date
Classifications
- G02B27/0172—Head-up displays, head mounted, characterised by optical features
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
- G06F3/0236—Character input methods using selection techniques to select from displayed items
- G06F3/03547—Touch pads, in which fingers can move on a surface
- G06F3/0488—Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
- G06F3/04886—Touch-screen interaction by partitioning the display area into independently controllable areas, e.g. virtual keyboards or menus
- H04M1/72436—Mobile telephone user interfaces with interactive means for internal management of messages, for text messaging, e.g. short messaging services [SMS] or e-mails
- G02B2027/0138—Head-up displays comprising image capture systems, e.g. camera
- G02B2027/0174—Head mounted, holographic
- G02B2027/0178—Head mounted, eyeglass type
- G06F2203/04803—Split screen, i.e. subdividing the display area or the window area into separate subareas
- G06F3/167—Audio in a user interface, e.g. using voice commands for navigating, audio feedback
- H04M1/724097—Interfacing with a device worn on the user's head to provide access to telephonic functionalities
- H04M2201/40—Telephone systems using speech recognition
- H04M2250/22—Telephonic subscriber devices including a touch pad, a touch sensor or a touch detector
Definitions
- The present invention relates to a terminal device having an attachment worn on the user's head, such as eyeglasses or a face shield.
- The glasses-type terminals currently being developed and sold have a complicated configuration that requires many components and a multi-step manufacturing process. Conventional glasses-type terminals are therefore expensive, which is one of the reasons they are not in widespread use.
- The present invention was made based on the above circumstances, and an object of the present invention is to provide a terminal device that has a simple configuration and can be manufactured with a small number of parts.
- In the glasses-type terminals currently being developed and sold, when the user operates the displayed screen, he or she issues instructions by voice or taps a touch pad provided at the base of the temple of the glasses.
- However, when a user inputs characters on a screen that appears to float in his or her field of view, voice operation is unreliable: many words have homonyms and pronunciation varies between individuals, so not everyone can input characters accurately.
- Moreover, the glasses-type terminal may fail to recognize the spoken content accurately because of external noise or the like.
- On the other hand, the touch pad provided on the temple of the glasses is not large enough for text input. For this reason, conventional glasses-type terminals have the problem that it is difficult to use the screen displayed in the field of view to input characters, for example for e-mail.
- The present invention was devised based on the above circumstances, and an object of the present invention is to provide a terminal device that has a simple structure, can be manufactured with a small number of parts, and enables operations such as character input to be performed easily and accurately.
- The display device may include a projection device having a display element and a hologram sheet or half mirror; the hologram sheet or half mirror is arranged in front of the user's face and within the user's field of view, and the original screen is displayed on it by projecting the image from the projection device onto the hologram sheet or half mirror.
- Alternatively, the display device may include a projection device having a display element, an optical system, and a projection section that projects the original screen displayed on the display element via the optical system. The projection section may be a transmissive screen, a hologram sheet, a hologram film, or a hologram optical element.
- The optical system may be composed of some or all of a reflecting mirror, a prism, a light guide plate, or a waveguide, or an optical system other than these may be used.
- Alternatively, the display device may be a transmissive or transparent display arranged in front of the user's face and within the user's field of view.
- In the terminal device of the present invention, since the attachment worn on the user's head and the terminal on which the display unit is mounted are configured separately, an existing mobile terminal such as a smartphone or tablet can be used as the terminal.
- By using an existing portable terminal in this manner, the number of parts of the wearable attachment can be reduced and its structure simplified.
- Moreover, by using a commercially available smartphone or the like as the terminal, the user can operate the device with a familiar smartphone, which improves operability.
- The terminal device of the present invention may further include a communication unit that allows the various devices provided on the wearable attachment to perform wireless communication with the outside, and those devices may perform wireless communication with the terminal via the communication unit.
- It is desirable that, when performing wireless communication, the terminal and the display device each carry out data communication after authentication based on identification information sent from the other party.
- In the terminal device of the present invention, when the terminal displays the screen shown on the display unit as the original screen on the display device, it may, according to the user's screen display settings for the display device, display on the display device a simplified version of the current screen, a part of the screen displayed on the display unit, or a screen in which the characters and/or figures on the display unit's screen are enlarged.
- Meanwhile, the screen on the display unit may be maintained as it is, or its display may be turned off.
- When the screen displayed on the display unit is shown as the original screen on the display device, if the user designates a screen to be displayed on the display device, the designated screen may be displayed on the display device separately from the screen currently shown on the display unit.
- It is desirable that the terminal is a mobile terminal that has a function of acquiring location information about its own position, and a function of creating, based on the map information stored in the storage unit and the acquired location information, a screen that guides the user from the current location to a destination set by the user, and displaying that screen on the display unit.
- It is also desirable that the terminal has a function of searching for stores around the current location based on the stored map information and the acquired location information, and displaying the information about the stores thus obtained on the display unit.
- the display device may be detachably attached to the wearable object.
- The wearable attachment is provided with an imaging device that captures an image of the finger or input pointing tool with which the user operates the visible screen, and outputs the image data obtained by the imaging to the terminal.
- The terminal includes a storage unit that stores various data, including data related to the original screen.
- An operation determination unit determines, based on the image data obtained by the imaging, what kind of operation among various operations is being performed with the finger or input pointing tool.
- A position data generation unit generates, based on the image data obtained by the imaging, position data of the finger or input pointing tool within the imaging range that the imaging device can capture.
- A reference data generation unit generates data relating to the visible screen that specifies its position and size, when the operation determination unit determines from the image data that the user has performed a predetermined operation at each of several predetermined positions, and stores that data in the storage unit as reference data.
- When the user performs an operation on the visible screen with a finger or input pointing tool, an input control unit specifies the range of the visible screen within the imaging range and determines at which position within that range the operation was performed, based on the data on the operation content determined by the operation determination unit, the position data generated by the position data generation unit, the reference data relating to the visible screen stored in the storage unit, and the data relating to the original screen corresponding to the visible screen stored in the storage unit. The input control unit thereby recognizes the content of the input instruction corresponding to the operation and controls the screen displayed on the display unit and the original screen displayed on the display device according to the recognized instruction.
- It is desirable that, based on the data on the operation content determined by the operation determination unit, the position data generated by the position data generation unit, and the reference data relating to the visible screen stored in the storage unit, the input control unit recognizes the content of the input instruction corresponding to the operation with the finger or input pointing tool and controls the original screen displayed on the display device according to the recognized instruction.
- With this configuration, the user can operate the visible screen, which appears to float in the air, in the same way as a screen displayed on an ordinary touch panel. When the terminal device of the present invention is used, the user can therefore perform various screen operations, such as character input and enlargement or reduction, easily and accurately by operating the visible screen just as on an ordinary smartphone or tablet.
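The position mapping described above can be sketched in outline. Assuming calibration has produced reference data giving the visible screen's origin and size within the imaging range (all names and coordinate conventions here are hypothetical, not taken from the specification), a fingertip position reported by the imaging device can be converted to visible-screen coordinates:

```python
from dataclasses import dataclass

@dataclass
class ReferenceData:
    # Position and size of the visible screen within the imaging range,
    # captured once during the reference-data (calibration) step.
    screen_x: float
    screen_y: float
    screen_w: float
    screen_h: float

def to_screen_coords(finger_x, finger_y, ref):
    """Normalise a fingertip position, given in imaging-range pixels,
    to [0, 1] visible-screen coordinates; None if outside the screen."""
    u = (finger_x - ref.screen_x) / ref.screen_w
    v = (finger_y - ref.screen_y) / ref.screen_h
    if 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0:
        return (u, v)
    return None

# Example: the visible screen occupies the centre of a 640x480 imaging range.
ref = ReferenceData(screen_x=160, screen_y=120, screen_w=320, screen_h=240)
print(to_screen_coords(320, 240, ref))  # centre of the screen -> (0.5, 0.5)
print(to_screen_coords(0, 0, ref))      # outside the screen -> None
```

The normalised coordinates can then be matched against the layout of the corresponding original screen to decide which element was operated.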
- It is desirable that the terminal has functions for controlling the imaging device to adjust its imaging range and to adjust the depth of field, i.e., the range in the depth direction within which the subject is in focus. By using these functions to limit the imaging target to only the finger or input pointing tool operated on the visible screen, the privacy of others can be respected.
- The wearable attachment may instead be provided with an imaging device that, by imaging the user's eyes, acquires an image of the original screen reflected in the user's eyes together with the finger or predetermined input pointing tool when the user operates the visible screen, and outputs the acquired image data to the terminal wirelessly or by wire.
- In this case, the terminal includes a storage unit that stores various data, including data related to the original screen.
- When the imaging device captures the original screen and the finger or input pointing tool reflected in the user's eyes, an operation determination unit determines, based on the series of image data obtained by the imaging, what type of operation among various operations is being performed with the finger or input pointing tool.
- An operation position specifying unit specifies which position within the original screen is being operated by the finger or input pointing tool. When the user performs an operation on the visible screen, an input control unit recognizes the content of the input instruction corresponding to the operation, based on the data on the operation content obtained by the operation determination unit, the data obtained by the operation position specifying unit representing the operated position within the original screen, and the data relating to the original screen stored in the storage unit, and controls the screen displayed on the display unit and the original screen displayed on the display device according to the recognized instruction.
- It is desirable that the input control unit of the terminal recognizes the content of the input instruction based on the data on the operation with the finger or input pointing tool obtained by the operation determination unit, and controls the original screen displayed on the display device according to the recognized instruction.
- As a result, in this configuration as well, the user can operate the visible screen that appears to float in the air just like a screen displayed on an ordinary touch panel, and can perform operations such as character input and enlargement or reduction easily and accurately.
- The operation position specifying unit may find, based on the series of image data obtained by the imaging, the range of the original screen within the imaging range that the imaging device can capture and the position of the finger or input pointing tool within that imaging range, and may then specify the operated position within the original screen based on the obtained range of the original screen and the position of the finger or input pointing tool within the imaging range.
- It is preferable that the terminal further includes an image data extraction unit that extracts the image data in which a finger or input pointing tool is present from the series of image data obtained by imaging with the imaging device.
- In that case, the operation determination unit determines what type of operation is being performed with the finger or input pointing tool based on the series of image data extracted by the image data extraction unit, and the operation position specifying unit identifies the position operated by the finger or input pointing tool within the original screen based on the same extracted image data.
- Since the series of image data extracted by the image data extraction unit contains only frames in which the finger or input pointing tool is present, the operation determination unit and the operation position specifying unit can perform their processing efficiently.
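The extraction step described above amounts to filtering the frame sequence down to the frames in which a finger was detected. A minimal sketch, with `detect_finger` standing in for a real detector (a hypothetical name, not from the specification):

```python
# Keep only the frames in which a fingertip was detected, so that the
# downstream operation-determination and position-specifying steps run
# on a shorter sequence.
def extract_finger_frames(frames, detect_finger):
    """Return (index, frame) pairs for frames containing a finger."""
    return [(i, f) for i, f in enumerate(frames) if detect_finger(f)]

# Toy usage: frames are labelled True/False for "finger present".
frames = [False, True, True, False, True]
kept = extract_finger_frames(frames, detect_finger=lambda f: f)
print([i for i, _ in kept])  # -> [1, 2, 4]
```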
- When the terminal captures an image of the user's eye with the imaging device before displaying the original screen on the display device, an iris/pupil image data generation unit generates image data of the iris and pupil based on the image data obtained by the imaging and stores it in the storage unit.
- An image difference extraction unit then takes the image data obtained when the imaging device captures the original screen and the finger or input pointing tool reflected in the user's eyes, and generates image data from which the iris and pupil images have been removed by performing a process that extracts the difference against the iris and pupil image data stored in the storage unit. The image data extraction unit may perform its extraction processing using the series of image data generated by the image difference extraction unit. Because the iris and pupil images have been removed from that image data, the image data extraction unit can easily extract the frames in which a finger or input pointing tool is present.
- When the user wears contact lenses, the image difference extraction unit preferably generates image data from which the contact lens image is removed in addition to the iris and pupil images. That is, when the imaging device captures an image of the eye of a user wearing contact lenses before the terminal displays the original screen on the display device, the iris/pupil image data generation unit generates image data of the contact lens, iris, and pupil based on the image data obtained by the imaging and stores it in the storage unit. The image difference extraction unit then generates image data from which the images of the contact lens, iris, and pupil have been removed, based on the image data obtained by the imaging and the stored image data, and the image data extraction unit performs its extraction processing using the series of image data generated by the image difference extraction unit.
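The difference-extraction process can be illustrated as a simple per-pixel comparison against the stored eye-only reference image. This toy sketch uses nested lists of grey levels and an invented tolerance value; a real implementation would operate on aligned camera frames:

```python
# Zero out pixels that match the stored iris/pupil reference image,
# leaving only the reflected screen and the finger in the frame.
def remove_eye_image(frame, eye_reference, threshold=10):
    """Return a copy of frame with reference-matching pixels set to 0."""
    return [
        [0 if abs(p - r) <= threshold else p
         for p, r in zip(frow, rrow)]
        for frow, rrow in zip(frame, eye_reference)
    ]

eye_ref = [[50, 50], [50, 50]]    # stored iris/pupil image
frame   = [[52, 200], [50, 180]]  # reflection of screen/finger on top
print(remove_eye_image(frame, eye_ref))  # -> [[0, 200], [0, 180]]
```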
- The terminal may include an eye presence/absence determination unit that determines whether an image of the user's eyes exists in the image data obtained by imaging with the imaging device and detects when image data without an image of the user's eyes has been acquired by the imaging device continuously for a certain period of time, and a notification control unit that controls a notification device to generate sound or vibration when the eye presence/absence determination unit makes such a detection.
- For example, when the user is driving a car, if the eye presence/absence determination unit detects that image data without an image of the driver's eyes has been acquired by the imaging device continuously for a certain period of time, it can be determined that the driver is asleep, and drowsy driving can be prevented by generating an alarm sound or vibration from the notification device.
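The eye presence check behind this alert can be sketched as a small state machine that raises the alarm once no eye image has been seen for a continuous period. The 3.0-second threshold and all names here are illustrative assumptions, not values from the specification:

```python
# Track the last time eyes were seen; signal an alarm after a
# continuous eyes-absent interval exceeds the timeout.
class EyePresenceMonitor:
    def __init__(self, timeout_s=3.0):
        self.timeout_s = timeout_s
        self.last_seen = None  # timestamp when eyes were last visible

    def update(self, timestamp, eyes_visible):
        """Feed one frame's result; return True when an alarm is due."""
        if eyes_visible or self.last_seen is None:
            self.last_seen = timestamp
            return False
        return (timestamp - self.last_seen) >= self.timeout_s

mon = EyePresenceMonitor()
print(mon.update(0.0, True))   # eyes visible -> False
print(mon.update(2.0, False))  # 2.0 s without eyes -> False
print(mon.update(3.5, False))  # 3.5 s without eyes -> True (alarm)
```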
- It is desirable that the terminal includes a position detection unit that detects the contact position when a touch operation is performed on the screen of the display unit and outputs contact position information indicating the detected position to the input control unit. Data relating to an image of a touch pad to be displayed on the display unit is stored in the storage unit, and the input control unit displays the touch pad image on the display unit when the original screen is displayed on the display device, so that the screen of the display unit can be used as a touch pad.
- the terminal includes a movement information output unit that detects the movement direction of the terminal, measures the movement amount of the terminal, and outputs movement information indicating the detected movement direction and the measured movement amount to the input control unit, and data representing the correspondence between the movement information of the terminal and operations on the cursor displayed on the display device is stored in the storage unit.
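The stored correspondence between terminal movement and cursor operations can be pictured as a small lookup table plus a scale factor. The sketch below is an assumption-laden illustration: the direction names, the gain value, and the millimeter unit are invented for the example.

```python
# Hedged sketch of the correspondence data held in the storage unit:
# movement direction maps to a cursor direction, and the measured movement
# amount is scaled into a cursor displacement. Gain and names are assumed.

DIRECTION_TO_DELTA = {
    "left": (-1, 0), "right": (1, 0), "up": (0, -1), "down": (0, 1),
}

def cursor_delta(direction, amount_mm, gain=5):
    """Convert movement info into a (dx, dy) cursor displacement in pixels."""
    dx, dy = DIRECTION_TO_DELTA[direction]
    return dx * amount_mm * gain, dy * amount_mm * gain

print(cursor_delta("right", 4))  # -> (20, 0)
```

The input control unit would apply the resulting displacement to the cursor shown on the display device.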
- the original screen displayed on the display device includes a screen corresponding to the operation unit of a remote controller for a remotely controllable device; when the screen corresponding to the operation unit of the remote controller is displayed on the display device as the original screen and an operation is performed on the corresponding visible screen, the terminal generates a command signal indicating the content of the operation, and a remote control unit that wirelessly transmits the generated command signal to the remotely controllable device may also be provided.
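Translating an operation on the visible screen into a command signal could look like the sketch below. The command codes, the header byte, and the checksum scheme are all invented for illustration; the specification does not define a concrete signal format.

```python
# Sketch of the remote-control idea: an operation on the visible screen is
# turned into a byte sequence for wireless transmission to the remotely
# controllable device. Codes, header, and checksum are hypothetical.

COMMANDS = {"power": 0x01, "temp_up": 0x02, "temp_down": 0x03}

def build_command_signal(operation):
    """Return the byte sequence to transmit for the given operation."""
    header = 0xA5  # hypothetical protocol header
    code = COMMANDS[operation]
    checksum = (header + code) & 0xFF
    return bytes([header, code, checksum])

print(build_command_signal("power").hex())  # -> 'a501a6'
```

In the air-conditioner example of FIG. 77, tapping a button on the visible remote-control screen would produce one such signal.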
- the display device and the imaging device may be detachably attached to the wearable object.
- the terminal device of the present invention may further include a touch pad section as an input device for the terminal, and the touch pad section may be detachably attached to the wearable item.
- the terminal device of the present invention may further comprise a sound output device, provided on the wearable item, that converts an electrical signal output from the terminal into sound and transmits the sound to the user through the ear or by bone conduction.
- the terminal device of the present invention may further comprise a sound input device that converts the user's voice into an electrical signal and outputs the electrical signal to the terminal.
- in the terminal device of the present invention, since the wearable item worn on the user's head and the terminal on which the display unit is mounted are configured separately, an existing mobile terminal such as a smartphone or tablet terminal can be used as the terminal.
- by using an existing portable terminal in this manner, the number of parts of the wearable item can be reduced and its structure simplified.
- moreover, by using a commercially available smartphone or the like as the terminal, the user can operate with a familiar device, so operability is improved.
- FIG. 1 is a schematic perspective view of a terminal device which is the first embodiment of the present invention.
- FIG. 2(a) is a schematic plan view of the terminal device of the first embodiment
- FIG. 2(b) is a schematic right side view of the terminal device.
- FIG. 3 is a schematic diagram for explaining how the original screen is projected onto the hologram sheet of the display device (display device) in the terminal device of the first embodiment.
- FIG. 4 is a schematic block diagram of the terminal device of the first embodiment.
- FIG. 5 is a flowchart for explaining the procedure of processing for displaying a screen on the display device according to the display device control program in the terminal device of the first embodiment.
- FIG. 6 is a diagram showing an example of a visible screen when a part of the screen of the display unit is displayed on the display device.
- FIG. 7 is a diagram showing an example of a visible screen when a screen obtained by enlarging characters displayed on the screen of the display unit is displayed on the display device.
- FIG. 8 is a diagram showing an example of a visible screen when the screen of the application program for route guidance is displayed on the display device.
- FIG. 9 is a diagram showing an example of a hologram sheet attached to the lens portion of eyeglasses in the terminal device of the present invention.
- FIG. 10 is a schematic perspective view of a terminal device according to a second embodiment of the invention.
- FIG. 11 is a schematic block diagram of the terminal device of the second embodiment.
- FIG. 12 is a flowchart for explaining the procedure of processing for displaying a screen on the display device according to the display device control program in the terminal device of the second embodiment.
- FIG. 13 is a schematic perspective view of a terminal device according to a third embodiment of the invention.
- FIG. 14 is a schematic block diagram of the terminal device of the third embodiment.
- FIG. 15 is a diagram showing an example of a character input screen.
- FIG. 16 is a diagram showing an example of a search screen displayed on the character input screen.
- FIGS. 17A and 17B are diagrams for explaining aspects of a touch operation performed on the visible screen.
- FIG. 18 is a diagram for explaining a touch operation performed on the visible screen.
- FIG. 19 is a flow chart for explaining the procedure of the reference data setting process in the terminal device of the third embodiment.
- FIG. 20 is a diagram showing an example of the original screen displayed during the process of setting the reference data.
- FIG. 21 is a flow chart for explaining a character input processing procedure using a visible screen in the terminal device of the third embodiment.
- FIG. 22 is a flowchart for explaining the procedure of screen display processing using a visible screen in the terminal device of the third embodiment.
- FIG. 23 is a schematic block diagram of a terminal device according to the fourth embodiment of the invention.
- FIG. 24 is a schematic block diagram of a terminal device according to the fifth embodiment of the invention.
- FIG. 25 is a schematic block diagram of a terminal device according to the sixth embodiment of the present invention.
- FIG. 26 is a schematic block diagram of a terminal device according to the seventh embodiment of the invention.
- FIG. 27 is a schematic block diagram of a terminal device according to the eighth embodiment of the invention.
- FIG. 28 is a schematic block diagram of a terminal device according to the ninth embodiment of the present invention.
- FIG. 29 is a diagram for explaining the process of converting the X coordinate of the position data to the X coordinate of the position data on the reference screen by the shift correction unit in the ninth embodiment.
- FIG. 30 is a diagram for explaining the process of converting the Y coordinate of the position data to the Y coordinate of the position data on the reference screen by the deviation correction unit in the ninth embodiment.
- FIG. 31 is a schematic block diagram of a terminal device according to the tenth embodiment of the present invention.
- FIG. 32 is a schematic block diagram of a terminal device according to the eleventh embodiment of the present invention.
- FIG. 33 is a schematic block diagram of a terminal device according to the twelfth embodiment of the present invention.
- FIG. 34 is a schematic block diagram of a terminal device according to the thirteenth embodiment of the present invention.
- FIG. 35 is a diagram showing an example of the original screen for setting the reference data displayed during the process of setting the reference data in the thirteenth embodiment.
- FIG. 36 is a flow chart for explaining the character input processing procedure using the visible screen in the terminal device of the thirteenth embodiment.
- FIG. 37 is a flowchart for explaining the procedure of screen display processing using the visible screen in the terminal device of the thirteenth embodiment.
- FIG. 38 is a schematic block diagram of a terminal device according to the fourteenth embodiment of the present invention.
- FIG. 39 is a schematic block diagram of a terminal device according to the fifteenth embodiment of the present invention.
- FIG. 40 is a schematic block diagram of a terminal device according to the sixteenth embodiment of the present invention.
- FIG. 41(a) is a schematic plan view of the terminal device according to the seventeenth embodiment of the present invention, and FIG. 41(b) is a schematic right side view of the terminal device.
- FIG. 42 is a schematic perspective view of the terminal device of the seventeenth embodiment.
- FIG. 43 is a schematic block diagram of the terminal device of the seventeenth embodiment.
- FIG. 44(a) is a diagram showing an example of the original screen reflected in the eyes
- FIG. 44(b) is a diagram showing an example of the imaging range of the imaging device in the terminal device of the seventeenth embodiment.
- FIG. 44(a) is a diagram showing an example of the original screen reflected in the eyes
- FIG. 44(b) is a diagram showing an example of the imaging range of the imaging device in the terminal device of the seventeenth
- FIG. 45 is a diagram for explaining the mounting location of the touch pad section.
- FIG. 46 is a diagram for explaining the configuration of the touch pad section.
- FIG. 47 is a flow chart for explaining a character input processing procedure using a visible screen in the terminal device of the seventeenth embodiment.
- FIG. 48 is a flow chart for explaining the procedure of screen display processing using the visible screen in the terminal device of the seventeenth embodiment.
- FIG. 49 is a diagram showing an example of the original screen desirable for the operation position specifying unit to specify the range of the original screen.
- FIG. 50 is a schematic block diagram of a terminal device according to the eighteenth embodiment of the present invention.
- FIG. 51 is a flow chart for explaining the character input processing procedure using the visible screen in the terminal device of the eighteenth embodiment.
- FIG. 52 is a flowchart for explaining the procedure of screen display processing using the visible screen in the terminal device of the eighteenth embodiment.
- FIG. 53 is a schematic block diagram of a terminal device according to the nineteenth embodiment of the present invention.
- FIG. 54 is a flow chart for explaining the character input processing procedure using the visible screen in the terminal device of the nineteenth embodiment.
- FIG. 55 is a flow chart for explaining the procedure of screen display processing using the visible screen in the terminal device of the nineteenth embodiment.
- FIG. 56 is a schematic block diagram of a terminal device according to the twentieth embodiment of the present invention.
- FIG. 57 is a flow chart for explaining a character input processing procedure using a visible screen in the terminal device of the twentieth embodiment.
- FIG. 59(a) is a schematic plan view of a terminal device according to the twenty-first embodiment of the present invention
- FIG. 59(b) is a schematic right side view of the terminal device.
- FIG. 60 is a schematic perspective view of the terminal device shown in FIG. 59.
- FIG. 61(a) is a schematic plan view of the terminal device of the seventeenth embodiment in which a display device for projecting an image onto a half mirror and an imaging lens are attached near the half mirror, and FIG. 61(b) is a schematic right side view of the terminal device.
- FIG. 62 is a schematic perspective view of the terminal device shown in FIG. 61.
- FIG. 63 is a schematic perspective view of a terminal device which is the twenty-second embodiment of the present invention.
- FIG. 64 is a diagram showing an example of a large-sized hologram sheet attached to the terminal device of the twenty-second embodiment.
- FIG. 65 is a diagram showing the terminal device of FIG. 64 configured such that a portion of the shield of the face shield is translucent or opaque.
- FIG. 66 is a schematic perspective view of the terminal device of the second embodiment using a face shield as an attachment.
- FIG. 67 is a schematic perspective view of a terminal device according to a twenty-first embodiment in which a face shield is used as an attachment and a display device and an imaging device are wirelessly connected to the terminal.
- FIG. 68 is a schematic perspective view of a terminal device according to a twenty-first embodiment in which a face shield is used as an attachment and a display device and an imaging device are connected to the terminal by wire.
- FIG. 69(a) is a schematic perspective view of a terminal device having two display devices of a type having a hologram sheet
- FIG. 69(b) is a schematic perspective view of a terminal device having two display devices of a type having a half mirror.
- FIG. 70 is a diagram showing an example of a hologram sheet attached to the lenses of eyeglasses in a terminal device having two display devices.
- FIG. 71 is a diagram for explaining a method of attaching a solar cell to a terminal device using spectacles as the attachment.
- FIG. 72 is a diagram for explaining a method of attaching a solar cell to a terminal device using spectacles as the attachment.
- FIG. 73 is a diagram for explaining a method of attaching a solar cell to a terminal device using spectacles as the attachment.
- FIG. 74 is a diagram for explaining a method of attaching a solar cell to a terminal device using a face shield as the attachment.
- FIG. 75 is a diagram for explaining a method of attaching a solar cell to a terminal device using a face shield as the attachment.
- FIG. 76 is a diagram for explaining the mounting location of the touch pad section in a terminal device using a face shield as an attachment.
- FIG. 77 is a diagram showing a state in which a remote control screen of an air conditioner is used as an original screen and a user operates a visible screen corresponding to the remote control screen.
- FIG. 78 is a diagram showing an example of the original screen for the operation screen when making a call with a mobile phone.
- FIG. 79 is a diagram showing how the screen of the terminal is used as a touch pad when a smartphone is used as the terminal.
- FIG. 80 is a diagram showing how, when a smart phone is used as a terminal, the terminal is used as a mouse to instruct the movement of the cursor.
- FIG. 81 is a diagram showing an example of a hologram sheet detachably attached to the lens portion of spectacles.
- FIG. 82 is a diagram showing an example of a hologram sheet detachably attached to the lens portion of spectacles.
- FIG. 1 is a schematic perspective view of the terminal device according to the first embodiment of the present invention, FIG. 2(a) is a schematic plan view of the terminal device of the first embodiment, and FIG. 2(b) is a schematic right side view of the terminal device.
- FIG. 3 is a schematic diagram for explaining how the original screen is projected onto the hologram sheet of the display device in the terminal device of the first embodiment, and FIG. 4 is a schematic block diagram of the terminal device of the first embodiment.
- the terminal device of the present invention includes a wearable item worn on the head of a user, a display device provided on the wearable item, and a terminal that is configured separately from the wearable item and on which a display unit is mounted. In the first embodiment, the wearable item is spectacles. As shown in FIGS. 1 and 2, the terminal device 1A of the first embodiment comprises the spectacles 10, the display device 20 provided on the spectacles 10, a terminal 30 on which a display unit 31 is mounted, and a communication unit 40 provided in the spectacles 10.
- the spectacles 10 are of a general type having two lens portions 11, 11, as shown in FIGS. 1 and 2.
- the lens attached to the lens portion 11 may be a convex or concave lens for correcting vision, plain glass or plastic without a vision correction function, or a sunglasses lens for protecting the eyes from sunlight.
- the display device 20 includes, for example, a small projector (projection device) 21 having a liquid crystal panel (display device), an optical system 22, and a hologram sheet (or hologram film) 23 that reflects a portion of the light (image).
- as the small projector 21, for example, a transmissive liquid crystal (LCD) projector or a reflective liquid crystal (LCOS) projector can be used.
- the optical system 22 can be configured from some or all of, for example, lenses, reflecting mirrors, and prisms.
- the hologram sheet (or hologram film) 23 serves as a projection section onto which the original screen displayed on the liquid crystal panel of the small projector 21 is projected via the optical system 22.
- the small projector 21 and the optical system 22 are arranged in one housing 100, and the housing 100 is attached to a temple of the glasses 10.
- the housing 100 is attachable to and detachable from the spectacles 10 . That is, the small projector 21 and the optical system 22 are detachably attached to the spectacles 10 .
- although a hologram sheet and a hologram film are generally distinguished by thickness, in the present specification the term hologram sheet is used as a concept encompassing both.
- the hologram sheet 23 is placed in front of the user's face and within the user's field of view. Specifically, the hologram sheet 23 is attached to the right-eye lens portion 11 of the spectacles 10, as shown in FIGS.
- a rectangular hologram sheet (for example, 1 cm long and 1.5 cm wide) is used as the hologram sheet 23 .
- Images and videos displayed on the liquid crystal panel of the small projector 21 are projected onto the hologram sheet 23 via the optical system 22, as shown in FIG.
- the image or video displayed on the liquid crystal panel of the compact projector 21 is projected onto the entire hologram sheet 23 . That is, the hologram sheet 23 itself serves as a projection range of an image or the like by the small projector 21 .
- a very small screen is displayed on this hologram sheet 23 .
- the user can see the image or video reflected by the hologram sheet 23 .
- the user can see the translucent screen, which is the very small image displayed on the hologram sheet 23, as if it were floating in the air.
- to the user, this floating translucent screen appears equivalent to a 25-inch screen viewed from 8 feet away.
- in the first embodiment the floating screen is translucent, but in general the screen need not be translucent.
- the first embodiment considers a case where this floating screen is displayed at the upper right position of the user's field of view, as shown in FIG.
- the display device 20 displays an original screen corresponding to a visible screen that appears to the user to be floating in the air.
- the housing 100 is provided with a communication unit 40, a power supply unit (not shown) such as a battery, and a power switch (not shown).
- the communication unit 40 is for wireless communication between various devices (the display device 20 in the first embodiment) provided on the wearable item (the glasses 10) and the outside.
- the terminal 30 has a function of wirelessly communicating with the outside, and the display device 20 can wirelessly communicate with the terminal 30 via the communication section 40 . Control of the display device 20 is performed by the terminal 30 through wireless communication.
- the power supply unit is for supplying power to various devices (the display device 20 and the communication unit 40 in the first embodiment) provided on the wearing object (glasses 10).
- the power switch turns on/off power supply from the power supply unit to the display device 20 and the communication unit 40 . This power switch is attached to a predetermined location on the surface of the housing 100 .
- the terminal 30 is configured separately from the spectacles 10 instead of being provided in the spectacles 10 . Therefore, as the terminal 30, an existing mobile terminal such as a smart phone or a tablet terminal is used.
- This terminal 30 has a display unit 31, a communication unit 32, a storage unit 33, and a control unit 34, as shown in FIG.
- the display unit 31 is a liquid crystal display device provided on the surface of the terminal 30.
- a touch panel is provided on the screen of the display unit 31. This touch panel includes a position detector that detects the contact position when a contact operation (touch operation) is performed on the screen of the display unit 31 and outputs contact position information indicating the detected position to the control unit 34.
- Various screens such as a home screen, a menu screen, an application screen, and a character input screen are displayed on the screen of the display unit 31 .
- the user can give various instructions to the terminal 30 by performing touch operations on these screens.
- the touch operation refers to various operations such as a tap operation, a double tap operation, a long press operation (long tap), a drag operation, a flick operation, a pinch-in operation, and a pinch-out operation.
- the terminal 30 has a function of performing wireless communication with the outside, and this function is realized by the communication unit 32.
- the terminal 30 can perform wireless communication with the display device 20 via the communication section 32 and the communication section 40 provided in the housing 100 .
- terminal 30 is wirelessly connected to display device 20 .
- Bluetooth can be used as a method of wireless communication between the terminal 30 and the display device 20 . From the viewpoint of ensuring security, each of the terminal 30 and the display device 20 performs data communication after performing authentication based on identification information sent from the other party when performing wireless communication.
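The mutual identification step described above, in which each side checks the identification information sent by the other before exchanging data, can be pictured as follows. The peer names, ID strings, and the simple equality check are all assumptions for illustration; a real pairing scheme would use Bluetooth's own authentication.

```python
# Minimal sketch of the mutual-identification step: each side accepts the
# peer only if the presented identification information matches what it
# expects. IDs and the equality check are hypothetical placeholders.

KNOWN_PEERS = {"display-20": "secret-d20", "terminal-30": "secret-t30"}

def authenticate(peer_name, presented_id):
    """Accept the peer only if its identification information matches."""
    return KNOWN_PEERS.get(peer_name) == presented_id

# Screen data communication proceeds only after both directions succeed.
session_ok = authenticate("display-20", "secret-d20") and \
             authenticate("terminal-30", "secret-t30")
print(session_ok)  # -> True
```

Only after `session_ok` holds would the terminal 30 begin transmitting screen data to the display device 20.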
- the storage unit 33 stores various programs, data, and the like.
- the storage unit 33 stores a special display device control program for the terminal 30 to control the display device 20 .
- this display device control program is an application program for causing the control unit 34 to realize the function of controlling the display device 20 so that the screen displayed on the display unit 31 is displayed on the display device 20 as the original screen M.
- when the display device control program is executed by the control unit 34, the screen displayed on the display unit 31 is displayed not only on the display unit 31 but also on the display device 20.
- the buttons for setting the screen display of the display device 20 include: a button B1 for instructing that a simplified version of the screen displayed on the display unit 31 be displayed on the display device 20; a button B2 for instructing that a part of the screen displayed on the display unit 31 be displayed on the display device 20; and a button B3 for instructing that a screen in which the characters and charts (drawings, photographs, tables, etc.) of the screen displayed on the display unit 31 are enlarged be displayed on the display device 20.
- also provided are a button B4 for setting whether the display unit 31 should be turned off when the screen displayed on the display unit 31 is displayed on the display device 20 as the original screen M, and a button B5 for instructing that the display device control program be terminated.
- with the button B4, when displaying the contents of the screen of the display unit 31 on the display device 20, the user can set whether to keep the display unit 31 lit or to turn it off.
- when the display unit 31 of the terminal 30 is lit, the user can turn off the display unit 31 by pressing the power button of the terminal 30.
- when the display unit 31 of the terminal 30 is off, the user can turn it back on by pressing the power button of the terminal 30.
- in general, when the display device control program is executed, the screen displayed on the display unit 31 is displayed on the display device 20, so the same screen is displayed on the display unit 31 and the display device 20.
- on the setting screen of the display device control program, the user can also specify that the display device 20 should display a screen whose contents differ from the screen displayed on the display unit 31.
- for this purpose, the setting screen of the display device control program has a field for designating the screen to be displayed on the display device 20.
- in this case, the control unit 34 causes the screen designated by the user to be displayed on the display device 20 separately from the screen currently displayed on the display unit 31; that is, different screens are displayed on the display unit 31 and the display device 20.
- the control unit 34 is equipped with a central processing unit (CPU) and the like, controls the terminal 30 in general, and also controls the display device 20 . For example, when the user performs a touch operation on the display unit 31, the control unit 34 recognizes the content of the instruction given by the operation, and executes processing according to the recognized content. Further, the control unit 34 controls the display device 20 to display the screen displayed on the display unit 31 as the original screen M on the display device 20 by executing the display device control program.
- the control unit 34 has a display control unit 341, as shown in FIG. 4.
- the display control unit 341 controls display on the display unit 31 and the display device 20 .
- the display control unit 341 executes the display device control program stored in the storage unit 33 so that the screen displayed on the display unit 31 is displayed on the display device 20 as the original screen M.
- the display control section 341 controls the display section 31 and the display device 20 according to the settings.
- FIG. 5 is a flow chart for explaining a procedure for displaying a screen on the display device 20 according to the display device control program in the terminal device 1A of the first embodiment.
- the user turns on the power switch provided on the housing 100 .
- the display device 20 and the communication unit 40 are powered on.
- the user performs the operation while wearing the spectacles 10 .
- the user operates the terminal 30 to display a menu screen on the display section 31 .
- the user taps the display device control program icon on the menu screen to select the display device control program.
- the control unit 34 of the terminal 30 activates the display device control program (S11).
- the control unit 34 performs processing according to the display device control program.
- the control unit 34 first performs a process of confirming the connection state between the terminal 30 and the display device 20 (S12). When the connection is confirmed, the control unit 34 requests transmission of identification information from the display device 20, and performs authentication processing based on the identification information sent from the display device 20 (S13). When the display device 20 is thus authenticated, the control unit 34 displays a setting screen for the display device control program on the display unit 31 . Then, data relating to the screen currently displayed on the display unit 31 is wirelessly transmitted to the display device 20, and the screen displayed on the display unit 31 is displayed on the display device 20 as the original screen M (S14). Thereby, the user can see the visible screen S corresponding to the original screen M through the glasses 10 as if floating in the air.
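The ordering of steps S11 to S14 above can be sketched as a tiny state machine. The function bodies below are stand-ins; only the sequencing, connection check, and authentication gate follow the flowchart of FIG. 5.

```python
# Schematic walk-through of steps S11-S14 (activation, connection check,
# authentication, screen transmission). All step labels mirror the text;
# the boolean inputs stand in for the real connection/authentication state.

def run_display_program(connected, auth_ok):
    steps = ["S11: activate display device control program"]
    steps.append("S12: confirm connection between terminal and display device")
    if not connected:
        return steps + ["abort: no connection"]
    steps.append("S13: authenticate display device by identification info")
    if not auth_ok:
        return steps + ["abort: authentication failed"]
    steps.append("S14: transmit screen data as original screen M")
    return steps

for line in run_display_program(connected=True, auth_ok=True):
    print(line)
```

A failed connection or authentication short-circuits the flow before any screen data is transmitted, matching the order in which the control unit 34 performs the checks.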
- the control unit 34 executes an application program, displays the screen of the application program on the display unit 31, and wirelessly transmits data about that screen to the display device 20, so that the same screen as the one displayed on the display unit 31 is displayed on the display device 20. In this way, the user can see the visible screen S for the screen of the application program through the glasses 10 as if it were floating in the air.
- the control unit 34 can also transmit data related to only a part of the screen displayed on the display unit 31 to the display device 20, so that that part of the screen is displayed on the display device 20.
- FIG. 6 is a diagram showing an example of a visible screen S when part of the screen of the display unit 31 is displayed on the display device 20.
- in this example, approximately half of the screen displayed on the display unit 31 is displayed on the display device 20, and when the user views that half of the screen on the display device 20, its content can be recognized.
- since the half screen displayed on the display device 20 appears larger than it would if the entire screen were displayed on the display device 20, the content of the visible screen S becomes easier to recognize.
- FIG. 7 is a diagram showing an example of the visible screen S when a screen in which the characters displayed on the screen of the display unit 31 are enlarged is displayed on the display device 20. In this case, as shown in FIG. 7, the user sees the enlarged characters on the visible screen S, so the characters on the screen can be accurately recognized.
- although FIG. 7 shows an example of a screen containing only characters, when a screen containing charts is displayed on the display unit 31, the user can view a screen in which not only the characters but also the charts are enlarged as the visible screen S, so the contents of the characters and charts can be accurately recognized.
- when ending the screen display on the display device 20, the user displays the setting screen of the display device control program on the display unit 31 of the terminal 30 and taps the button B5, provided on the setting screen, that instructs termination of the display device control program.
- the control unit 34 receives the signal to terminate the display device control program (S15), it terminates the display device control program (S16).
- alternatively, the control unit 34 may terminate the display device control program upon recognizing that the button B5 has been touched.
- the control unit 34 stops transmitting screen-related data to the display device 20 and nothing is displayed on the screen of the display device 20 .
- the user turns off the power switch provided on the housing 100 .
- alternatively, a predetermined icon may be displayed at a predetermined position (for example, a lower corner) of the screen so that the user can operate it via the visible screen S.
- when a tap operation is performed on the icon, the control unit 34 recognizes this and may control the power supply unit so as to turn off the power supplied from the power supply unit to the display device 20 and the communication unit 40.
- the terminal 30 is a mobile terminal and has a GPS (Global Positioning System) function for acquiring position information regarding its own position.
- The terminal 30 has a route guidance application program installed that causes the control unit 34 to realize the function of creating a screen for guiding the user from the current position to a destination set by the user and displaying it on the display unit 31.
- The control unit 34 starts the route guidance application program, and while that program is being executed, a screen for guiding the user to the set destination is displayed on the display unit 31 as the screen of the application program and is also displayed as the original screen M on the display device 20.
- FIG. 8 is a diagram showing an example of the visible screen S when the screen of the route guidance application program is displayed as the original screen M on the display device 20. In FIG. 8, an image of an arrow indicating the direction in which the user should go is displayed as the screen for guiding the user to the destination. The user can reach the destination by looking at the visible screen S and following the direction of the arrow displayed there.
- the display unit 31 and the display device 20 display a screen (for example, an arrow image screen) for guiding the user to the destination.
- Alternatively, when the route guidance application program is executed, the control unit 34 may cause the display unit 31 to display map information indicating the user's current location, or map information indicating both the current location and the destination, while displaying a screen (for example, the arrow image screen) for guiding the user to the destination on the display device 20.
- The terminal 30 also has a store search application program installed that causes the control unit 34 to realize the function of searching for shops around the current position and displaying the information about the shops obtained by the search on the display unit 31.
- The control unit 34 starts the store search application program, and while that program is being executed, a screen containing information about shops around the current position is displayed on the display unit 31 as the screen of the application program and is also displayed as the original screen M on the display device 20. By looking at the visible screen S corresponding to the original screen M, the user can obtain information about shops around the current position.
- the map information does not necessarily have to be stored in the storage unit 33 in advance.
- the control unit 34 may access a predetermined site on the Internet and use the map information on that site.
- Since the terminal and the glasses serving as the wearable are configured separately, an existing mobile terminal such as a smartphone or tablet terminal can be used as the terminal.
- By using an existing mobile terminal as the terminal in this manner, the number of parts of the spectacles serving as the wearable can be reduced and the structure of the spectacles can be simplified.
- Moreover, by using a commercially available smartphone or the like as the terminal, the user can operate the device with a familiar smartphone, so operability is improved.
- In the examples above, the hologram sheet 23 of the display device 20 is attached to the lens portion 11 of the spectacles 10, but the hologram sheet 23 may instead be embedded in or integrated with the lens portion 11.
- FIG. 9 is a diagram showing examples of the hologram sheet 23 attached to the lens portion 11 of the spectacles 10 in the terminal device of the present invention. In one example shown in FIG. 9, a small rectangular hologram sheet 23 is used; it is oriented vertically and attached to the upper part of the lens portion 11, slightly to the right.
- In another example, a larger rectangular hologram sheet (for example, 1.5 cm long and 2.5 cm wide) is used as the hologram sheet 23; it is oriented horizontally and attached to the upper part of the lens portion 11, slightly to the right.
- the hologram sheet 23 is attached to the entire surface of the lens portion 11 .
- the image or video displayed on the liquid crystal panel of the small projector 21 is projected onto the entire hologram sheet 23 .
- an image or video may be projected onto a part of the hologram sheet 23 shown in FIGS. 9(a) to 9(c).
- Like the small projector 21 and the optical system 22, the hologram sheet 23 may be detachably attached to the spectacles 10, for example by using an adhesive material.
- FIG. 10 is a schematic perspective view of the terminal device according to the second embodiment of the present invention
- FIG. 11 is a schematic block diagram of the terminal device according to the second embodiment.
- In the second embodiment, parts having the same functions as those in the first embodiment are given the same reference numerals, and detailed description thereof is omitted.
- The terminal device 1B includes spectacles 10 as an attachment worn on the user's head, a display device 20 provided on the spectacles 10, a terminal 30 that is separate from the spectacles 10 and has a display unit 31, and a cable 50 connecting the display device 20 and the terminal 30.
- the main difference between the terminal device 1B of the second embodiment and the terminal device 1A of the first embodiment is that the display device 20 and the terminal 30 are connected by wire using a cable 50.
- Other configurations of the terminal device 1B of the second embodiment are the same as those of the terminal device 1A of the first embodiment.
- the display device 20 includes a small projector 21, an optical system 22, and a hologram sheet 23, as shown in FIG.
- the compact projector 21 and the optical system 22 are arranged in a housing 100 , and the housing 100 is attached to the handle portion of the spectacles 10 .
- the housing 100 can be attached to and detached from the glasses.
- a connection terminal (not shown) for connecting the cable 50 to the display device 20 is provided at a predetermined location on the surface of the housing 100 .
- the display device 20 is controlled by the terminal 30 through wired communication using the cable 50 .
- power to the display device 20 is supplied from the terminal 30 via the cable 50 . Therefore, in the second embodiment, the power supply unit and the power switch in the first embodiment are not provided in the housing 100 . Note that, even in this case, the power supply unit may be provided in the housing 100 .
- the terminal 30 has a display unit 31, a communication unit 32, a storage unit 33, a control unit 34, and a connection terminal (not shown) as an interface.
- a touch panel is provided on the screen of the display unit 31 .
- a cable 50 is connected to the connection terminal of the terminal 30 .
- the display device 20 and the terminal 30 are connected by a cable 50 , and the terminal 30 can communicate with the display device 20 via this cable 50 .
- HDMI (registered trademark) terminals can be used as the connection terminals of the terminal 30 and the connection terminals provided in the housing 100
- an HDMI (registered trademark) cable can be used as the cable 50 .
- a USB terminal can be used as the connection terminal of the terminal 30 and the connection terminal provided in the housing 100, and a USB cable can be used as the cable 50.
- the storage unit 33 stores various programs and data.
- a special display device control program for the terminal 30 to control the display device 20 is stored in the storage unit 33 .
- When the display device control program is executed by the control unit 34, the screen displayed on the display unit 31 is displayed not only on the display unit 31 but also on the display device 20. The setting screen of the display device control program is the same as in the first embodiment, so a detailed description is omitted here.
- the control unit 34 controls the terminal 30 in general and also controls the display device 20 .
- the control section 34 has a display control section 341 as shown in FIG.
- the display control unit 341 controls display on the display unit 31 and the display device 20 .
- Specifically, when the user instructs activation of the display device control program, the display control unit 341 executes the display device control program stored in the storage unit 33 so that the content of the screen displayed on the display unit 31 is displayed on the display device 20 as the content of the original screen M.
- the user wearing the spectacles 10 can see the visible screen S corresponding to the original screen M as if it were floating in the air.
- FIG. 12 is a flowchart for explaining the procedure of processing for displaying a screen on the display device 20 according to the display device control program in the terminal device 1B of the second embodiment.
- the user performs the following operations while wearing the glasses 10.
- the user performs settings for starting power supply to the display device 20 from the home screen of the terminal 30 .
- power is supplied from the terminal 30 to the display device 20, and the power of the display device 20 is turned on.
- When the housing 100 is provided with a power supply unit, that power supply unit covers all or part of the power supplied to the display device 20.
- a power switch may be provided in the housing 100, and when the power switch is pressed, the power of the display device 20 may be turned on.
- the user operates the terminal 30 to display a menu screen on the display section 31 .
- the user taps the display device control program icon on the menu screen to select the display device control program.
- Upon receiving the signal indicating that the display device control program has been selected, the control unit 34 of the terminal 30 activates the display device control program (S21). When the display device control program is activated, the control unit 34 performs processing according to the display device control program. Specifically, the control unit 34 first performs a process of confirming the connection state between the terminal 30 and the display device 20 (S22). When the connection is confirmed, the control unit 34 displays the setting screen of the display device control program on the display unit 31. Data about the screen currently displayed on the display unit 31 is then transmitted to the display device 20 via the cable 50, and the content of the screen displayed on the display unit 31 is displayed on the display device 20 as the content of the original screen M (S23).
- The control unit 34 executes the application program, displays the screen of the application program on the display unit 31, and transmits data about that screen to the display device 20 via the cable 50, so that the same screen as the one displayed on the display unit 31 is displayed on the display device 20. In this way, the user can see the visible screen S for the application program's screen through the glasses 10 as if it were floating in the air.
- When ending the screen display on the display device 20, the user displays the setting screen of the display device control program on the display unit 31 of the terminal 30 and taps the button B5 on that setting screen that instructs the display device control program to terminate.
- When the control unit 34 receives the signal to terminate the display device control program (S24), it terminates the display device control program (S25).
- Alternatively, the control unit 34 may terminate the display device control program when it recognizes that the button B5 has been touched. As a result, the control unit 34 stops transmitting screen-related data to the display device 20, and nothing is displayed on the screen of the display device 20.
- the user makes settings for stopping power supply to the display device 20 from the home screen of the terminal 30 .
- the power of the display device 20 is turned off.
- Alternatively, a predetermined icon may be displayed at a predetermined position on the screen (for example, a lower corner) so that the user can tap it on the visible screen S with a finger.
- In this case, the control unit 34 recognizes that the icon has been tapped, and can turn off the power supply from the terminal 30 to the display device 20 via the cable 50.
- When the housing 100 is provided with a power supply unit, the user taps the icon on the visible screen S with a finger; when the control unit 34 recognizes that the tap operation has been performed on the icon, it turns off the power supply from the terminal 30 to the display device 20 via the cable 50, and may also control the power supply unit via the cable 50 to turn off the power supply from the power supply unit of the housing 100 to the display device 20.
- When the power supply unit of the housing 100 supplies all of the power to the display device 20, the user taps the icon on the visible screen S with a finger; when the control unit 34 recognizes the tap operation on the icon, it may control the power supply unit via the cable 50 to turn off the power supply from the power supply unit of the housing 100 to the display device 20.
- In the second embodiment as well, the terminal and the glasses serving as the wearable are configured separately, so an existing mobile terminal such as a smartphone or tablet terminal can be used as the terminal.
- By using an existing mobile terminal in this manner, the number of parts of the spectacles serving as the wearable can be reduced and the structure of the spectacles can be simplified.
- Moreover, by using a commercially available smartphone or the like as the terminal, the user can operate the device with a familiar smartphone, so operability is improved.
- FIG. 13 is a schematic perspective view of the terminal device according to the third embodiment of the invention
- FIG. 14 is a schematic block diagram of the terminal device according to the third embodiment.
- parts having the same functions as those in the first embodiment described above are denoted by the same reference numerals, and detailed description thereof will be omitted.
- The terminal device 2A includes spectacles 10 as an attachment worn on the user's head, a display device 20 provided on the spectacles 10, a terminal 30A that is separate from the spectacles 10 and has a display unit 31, a communication unit 40, and an imaging device 60 for capturing an image in front of the user.
- The main differences between the terminal device 2A of the third embodiment and the terminal device 1A of the first embodiment are that the terminal device 2A has the imaging device 60, that the terminal 30A controls the imaging device 60, and that the user can input an instruction corresponding to an operation by performing a touch operation on the visible screen S.
- Other configurations of the terminal device 2A of the third embodiment are the same as those of the terminal device 1A of the first embodiment.
- the display device 20 includes a small projector 21 having a liquid crystal panel (display device), an optical system 22, and a hologram sheet 23 that partially reflects light (image).
- the compact projector 21 and the optical system 22 are arranged in one housing 100 , and the housing 100 is attached to the handle of the glasses 10 .
- the hologram sheet 23 is attached to the right-eye lens portion 11 of the spectacles 10, as shown in FIG.
- the housing 100 is provided with a communication unit 40, a power supply unit (not shown) such as a battery, and a power switch (not shown).
- the communication unit 40 is for wireless communication between various devices (the display device 20 and the imaging device 60 in the third embodiment) provided in the spectacles 10 and the terminal 30A.
- The display device 20 and the imaging device 60 perform wireless communication with the terminal 30A via the communication unit 40, and are controlled by the terminal 30A through this wireless communication.
- The power supply unit supplies power to the various devices provided on the spectacles 10 (in the third embodiment, the display device 20, the communication unit 40, and the imaging device 60). The power switch turns on and off the power supply from the power supply unit to the display device 20, the communication unit 40, and the imaging device 60, and is attached to a predetermined location on the surface of the housing 100.
- the terminal 30A has a display section 31, a communication section 32, a storage section 33, and a control section 34A.
- the display unit 31 is a liquid crystal display device provided on the surface of the terminal 30A, and the screen of the display unit 31 is provided with a touch panel.
- Various screens such as a home screen, a menu screen, an application screen, and a character input screen are displayed on the screen of the display unit 31 .
- FIG. 15 is a diagram showing an example of a character input screen.
- the character input screen 200 has a keyboard image 210 and a display area 220 for displaying input characters.
- the keyboard image 210 is provided with a plurality of character key images associated with each character (including symbols) and a plurality of function key images provided with specific functions.
- the keyboard image 210 employs the QWERTY layout as the layout of the character key images.
- the keyboard image 210 may be a keyboard image of hiragana syllabary, a keyboard image of each country's language, a ten-key image, or a key image similar to the key arrangement of a mobile phone.
- a search screen for example, is displayed in the display area 220 .
- FIG. 16 is a diagram showing an example of a search screen displayed on the character input screen 200. As shown in FIG.
- This search screen 221 is for searching Internet sites, and has a keyword input section 2211 and a search result display section 2212 for displaying search results. While viewing the character input screen 200 displayed on the display unit 31 , the user can use the key image of the keyboard image 210 to input a keyword into the keyword input unit 2211 .
- The user can give various instructions to the control unit 34A of the terminal 30A by performing touch operations with a finger on the screen of the display unit 31, and can also give instructions to the control unit 34A by performing operations on the visible screen S. Below, how the control unit 34A recognizes the content of an instruction given by a touch operation performed on the visible screen S is described in detail.
- When the user performs an operation on the visible screen S, the imaging device 60 captures an image of the finger performing the operation and outputs the image data obtained by the imaging to the terminal 30A.
- the imaging device 60 is provided on the handle portion of the glasses 10 adjacent to the display device 20, as shown in FIG. Further, the imaging device 60 includes a camera section 61, an image processing section 62, and a camera control section 63, as shown in FIG.
- the camera section 61 has a lens and an imaging device.
- The image processing unit 62 corrects the color and gradation of the captured image based on the image data captured by the camera unit 61, and performs image processing such as compression of the image data.
- the camera control section 63 controls the image processing section 62 and controls exchange of image data with the control section 34A of the terminal 30A.
- In the third embodiment, the image processing unit 62 is provided in the imaging device 60, but it may instead be provided in the control unit 34A of the terminal 30A.
- the imaging device 60 can capture a part (or substantially the entire field of view) of the user's field of view as an imaging range that can be captured by the imaging device 60 .
- The imaging device 60 is configured so that a subject located at the position of the visible screen S recognized by the user, specifically, for example, at the position of the user's finger when the user reaches out to touch the visible screen S with a hand, that is, at a substantially constant distance from the imaging device 60 along the depth direction, is brought into focus.
- the focused range (depth of field) is limited to a narrow range.
- the in-focus position is set at a position about 40 cm away from the imaging device 60, and the depth of field is in the range of about 5 cm.
- In the third embodiment, the focus range of the imaging device 60 is limited to a narrow range only when performing the operations for setting reference data, inputting characters, and displaying a screen, which are described later; otherwise the focus range is not limited to a narrow range.
- As the imaging device 60, a device capable of switching the in-focus position may be used, for example by manually changing the setting with a distance ring (focusing ring) in the same way as on an ordinary camera.
- In the third embodiment, the in-focus position of the imaging device 60 is set to the position of the visible screen S recognized by the user. Therefore, when the user performs an operation on the visible screen S with a finger, the imaging device 60 captures an in-focus image of the finger performing the operation. The image data obtained by the imaging device 60 is sent to the control unit 34A of the terminal 30A by wireless communication and stored in the storage unit 33 by the control unit 34A.
- The imaging device 60 has a still image shooting function and a moving image shooting function, so that either still images or moving images can be captured as needed.
- the control unit 34A includes a central processing unit (CPU) and the like, controls the terminal 30A in general, and also controls the display device 20 and the imaging device 60. For example, when the user performs a touch operation on the display unit 31, the control unit 34A recognizes the content of the instruction given by the operation, and executes processing according to the recognized content. Further, the control unit 34A controls the display device 20 so that the content of the screen displayed on the display unit 31 is displayed on the display device 20 as the content of the original screen M by executing the program for controlling the display device. . Specifically, as shown in FIG. 14, the control unit 34A includes a display control unit 341, an image data extraction unit 342, an operation determination unit 343, a position data generation unit 344, a reference data generation unit 345, and an input control unit 346 .
- the display control unit 341 controls display on the display unit 31 and the display device 20 . Specifically, when the user instructs the activation of the display device control program, the display control unit 341 executes the display device control program stored in the storage unit 33 to display the screen displayed on the display unit 31. is displayed on the display device 20 as the contents of the original screen M. Thereby, the user wearing the spectacles 10 can see the visible screen S corresponding to the original screen M as if it were floating in the air.
- When the user operates the visible screen S with a finger, the image data extraction unit 342 determines, based on the image data obtained by the imaging device 60 capturing a subject in focus, whether or not the subject is a finger, and extracts the image data in which a finger is present. A general image recognition method is used to determine whether the subject is a finger.
- Since the depth of field of the imaging device 60 is limited to a narrow range, a finger that appears in focus can be considered to be located at a substantially constant distance from the imaging device 60 along the depth direction. In this manner, the image data extraction unit 342 extracts image data in which the finger is at a position separated from the imaging device 60 by a substantially constant distance along the depth direction. The operation determination unit 343, the position data generation unit 344, and the reference data generation unit 345 then perform processing based on the image data extracted by the image data extraction unit 342.
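The extraction step described above can be pictured as a simple filter over captured frames. This is a minimal sketch, not the patent's implementation; `detect_finger` is a hypothetical stand-in for the general image recognition method the text refers to.

```python
def extract_finger_frames(frames, detect_finger):
    """Keep only the frames in which the in-focus subject is a finger.

    frames: iterable of image-data objects from the imaging device.
    detect_finger: placeholder for the image-recognition test; returns
    True when a finger is present and in focus. Because the depth of
    field is narrow, any in-focus finger is assumed to lie at a roughly
    constant distance from the camera along the depth direction.
    """
    return [frame for frame in frames if detect_finger(frame)]
```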
- The operation determination unit 343 determines, based on the image data that the imaging device 60 obtained by capturing the user's finger operating on the visible screen S and that was extracted by the image data extraction unit 342, what kind of operation the finger performed among the various kinds of operations. For this determination, for example, a general image recognition method is used. The operation determination unit 343 can thereby recognize which operation was performed with the finger, such as a tap, a double tap, or a long press. Data about the content of the recognized finger operation is stored in the storage unit 33.
- The position data generation unit 344 generates position data of the finger (fingertip) within the imaging range of the imaging device 60 based on the image data that was obtained by the imaging device 60 capturing the finger operating on the visible screen S and extracted by the image data extraction unit 342.
- an XY coordinate system is set within the imaging range of the imaging device 60, with the horizontal direction being the X axis direction and the vertical direction being the Y axis direction.
- the origin of this XY coordinate system is, for example, the lower left point in the imaging range.
- the position data generator 344 acquires finger position data in this XY coordinate system.
- the Z-axis direction is set in the depth direction in this XY coordinate system, thereby constructing an XYZ coordinate system.
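As a sketch of the coordinate convention above, a detected fingertip pixel can be converted into the XY system whose origin is the lower-left point of the imaging range. This helper is hypothetical and assumes the usual image convention where pixels are indexed from the top-left with y growing downward.

```python
def pixel_to_xy(px, py, img_height):
    """Convert a fingertip pixel position (origin at top-left, y grows
    downward) into the XY coordinate system described in the text
    (origin at the lower-left of the imaging range, y grows upward)."""
    return (px, img_height - 1 - py)
```

A fingertip detected on the bottom row of a 480-pixel-high frame therefore maps to y = 0 in the XY system.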
- When the user performs an operation with a finger at one or more predetermined positions on the visible screen S, the reference data generation unit 345 generates data about the visible screen S based on the position data of the finger that the position data generation unit 344 generated from the image data for which the operation determination unit 343 determined that the operation at each predetermined position was the predetermined operation. The generated data about the visible screen S is stored in the storage unit 33 as reference data.
- As the reference data, data that can specify the position and size of the visible screen S is used. For example, when the user performs an operation with a finger at the four corners of the outer frame of the visible screen S, the position data of the finger at each of the four corner positions can be used as the reference data.
- The finger position data can be considered to represent finger position information on a plane that is parallel to the XY plane (substantially parallel to the user's body) and located at a substantially constant distance from the imaging device 60 along the Z-axis direction.
- Alternatively, when the user performs an operation with a finger at one predetermined point on the visible screen S, the position data of the finger at that one point and data about the size of the visible screen S (for example, a vertical width and horizontal width calculated or measured in advance) obtained from the data about the original screen M corresponding to the visible screen S can be used as the reference data.
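For the four-corner case, one way to picture the reference data is to derive the position and size of the visible screen S from the four fingertip positions. This is a hedged sketch under the assumption that the screen is axis-aligned in the imaging plane; the corner order does not matter here.

```python
def screen_rect_from_corners(corners):
    """corners: four (x, y) fingertip positions, one per corner of the
    outer frame of the visible screen S, in imaging-range coordinates.
    Returns (left, bottom, width, height) of the screen."""
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    left, bottom = min(xs), min(ys)
    return (left, bottom, max(xs) - left, max(ys) - bottom)
```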
- When the user performs an operation on the visible screen S with a finger, the input control unit 346 specifies the range of the visible screen S within the imaging range based on the data about the content of the finger operation determined by the operation determination unit 343, the position data of the finger generated by the position data generation unit 344, the reference data about the visible screen S stored in the storage unit 33, and the data about the original screen M corresponding to the visible screen S stored in the storage unit 33. By checking at which position within the specified range of the visible screen S the finger operation was performed, the input control unit 346 recognizes the content of the input instruction corresponding to the finger operation, and controls the screen displayed on the display unit 31 and the original screen M displayed on the display device 20 according to the recognized content.
- The input control unit 346 can also recognize the range of the keyboard image 210 on the character input screen 200, the area of each character key image, and so on. Therefore, for example, when the user performs a touch operation with a finger on a character key image in the keyboard image 210 as the visible screen S, the input control unit 346 can identify the operated character key by checking which character key image area in the keyboard image 210 the finger position obtained from the finger position data corresponds to.
- When recognizing the content of the input instruction corresponding to a finger operation performed by the user on the visible screen S, the input control unit 346 may first generate, based on the reference data about the visible screen S stored in the storage unit 33, a reference screen corresponding to the visible screen S on a virtual plane corresponding to the imaging range of the imaging device 60, and then specify the position on the visible screen S operated by the finger by checking to which position on the reference screen the position data of the finger generated by the position data generation unit 344 corresponds.
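The mapping just described can be pictured as a hit test: normalize the fingertip position against the screen rectangle obtained from the reference data, then look it up in the key layout of the original screen M. The rectangle format and the normalized key layout below are illustrative assumptions, not the patent's actual data structures.

```python
def hit_test_key(finger, screen_rect, key_rects):
    """finger: (x, y) fingertip position in imaging-range coordinates.
    screen_rect: (left, bottom, width, height) of the visible screen S,
    taken from the reference data.
    key_rects: {label: (u0, v0, u1, v1)} key-image areas, normalized to
    [0, 1] relative to the original screen M.
    Returns the label of the key containing the finger, or None."""
    left, bottom, width, height = screen_rect
    u = (finger[0] - left) / width     # normalized horizontal position
    v = (finger[1] - bottom) / height  # normalized vertical position
    for label, (u0, v0, u1, v1) in key_rects.items():
        if u0 <= u < u1 and v0 <= v < v1:
            return label
    return None
```

A fingertip outside every key area (for example in the display area 220) simply yields no key, which matches the idea that only operations inside a recognized region produce an input instruction.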
- The storage unit 33 stores, for example, a reference data setting processing program for performing the reference data setting process and a program for character input processing used when the visible screen S is the character input screen 200.
- The data stored in the storage unit 33 includes, for example, image data of various original screens M, data related to each original screen M (specifically, the size, shape, content, configuration, and so on of the original screen M), and various image data used when creating the original screen for setting reference data, which is described later. The storage unit 33 is also used as a working memory.
- When the user performs an operation on the visible screen S with a finger, the input control unit 346 recognizes the content of the input instruction corresponding to the finger operation based on the data about the content of the finger operation determined by the operation determination unit 343, the position data of the finger generated by the position data generation unit 344, the reference data about the visible screen S stored in the storage unit 33, and the data about the original screen M corresponding to the visible screen S stored in the storage unit 33, and controls the screen displayed on the display unit 31 and the original screen M displayed on the display device 20 according to the recognized content.
- Therefore, the user can input an instruction corresponding to an operation by performing, on the visible screen S, the same operation as would be performed on a screen displayed on an ordinary touch panel.
- When the user performs a touch operation on the visible screen S, the input control unit 346 can recognize the instruction corresponding to the touch operation in the same way as if the visible screen S were displayed on a touch panel.
- For example, the input control unit 346 recognizes an instruction to enlarge or reduce the original screen M corresponding to the visible screen S in response to the corresponding finger operation on the visible screen S; when the user performs a long-press operation with a finger on the visible screen S, it recognizes an instruction to display an option menu screen as the original screen M; and when the user performs a drag or flick operation with a finger on the visible screen S, it recognizes an instruction to scroll the original screen M. Further, when the user performs a touch operation with a finger on a character key image on the character input screen 200, the input control unit 346 recognizes, in the same way as if the character input screen 200 were displayed on a touch panel, an instruction to input the corresponding character, that is, an input instruction for that character key, and performs a process of displaying the instructed character on the original screen M.
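The operation-to-instruction dispatch in this paragraph can be pictured as a small lookup table. The gesture names and instruction strings are hypothetical placeholders for the operation determination unit's output and the input control unit's internal commands; only the gestures the text names explicitly are included.

```python
# Hypothetical mapping from a recognized finger operation to the
# instruction the input control unit acts on.
GESTURE_TO_INSTRUCTION = {
    "long_press": "display option menu as original screen M",
    "drag": "scroll original screen M",
    "flick": "scroll original screen M",
}

def instruction_for(gesture):
    """Return the instruction for a recognized gesture, or None when
    the gesture carries no screen-control instruction."""
    return GESTURE_TO_INSTRUCTION.get(gesture)
```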
- Since the user performs touch operations with a finger on the visible screen S that appears to float in the air, touch operations can also be performed in ways that are not possible on an ordinary touch panel.
- FIGS. 17 and 18 are diagrams for explaining aspects of touch operations performed on the visible screen S. Normally, the user performs a touch operation with one finger from the front side of the visible screen S, as shown in FIG. 17, but a touch operation can also be performed with one finger in other manners. Further, as shown in FIG. 18, the user can also perform a touch operation with a plurality of fingers on the visible screen S.
- When the control unit 34A of the terminal 30A executes the display device control program, the content of the screen displayed on the display unit 31 is displayed on the display device 20 as the content of the original screen M.
- The process of displaying a screen on the display device 20 according to the display device control program in the terminal device 2A of the third embodiment is the same as the process in the terminal device 1A of the first embodiment, so a detailed description is omitted here.
- FIG. 19 is a flowchart for explaining the procedure of the reference data setting process in the terminal device 2A of the third embodiment.
- It is assumed that the display device control program is being executed in the terminal 30A; that is, the display device 20, the communication unit 40, and the imaging device 60 are powered on, and communication among the terminal 30A, the display device 20, and the imaging device 60 is enabled.
- the user operates the terminal 30A to display the menu screen on the display unit 31. Then, the user taps the icon of the reference data setting processing program on the menu screen to select the reference data setting processing program.
- When the program is selected, the control unit 34A of the terminal 30A reads the reference data setting processing program from the storage unit 33 and performs the reference data setting process according to the processing flow shown in FIG. 19.
- the user operates the terminal 30A to select a screen (for example, the character input screen 200) on which the reference data is to be set, and causes the display device 20 to display that screen as the original screen M.
- Next, the display control unit 341 creates a new original screen M (original screen for setting reference data) by adding, for example, a circle image at one or more predetermined positions on the original screen M, and displays it on the display device 20 (S31).
- The circle image is a mark indicating that the user should operate the position of the circle with a finger.
- FIG. 20 is a diagram showing an example of the original screen M displayed during the process of setting the reference data. This example shows the case where the original screen M is the character input screen 200 .
- When the original character input screen 200 shown in FIG. 20(a) is selected, it is changed to the character input screen 201 (original screen for setting reference data) shown in FIG. 20(b) and displayed on the display device 20. On the character input screen 201, images showing circles and numbers are added at the four corner positions.
- In this case, the user sees, as the visible screen S (visible screen for setting reference data), a screen having the same content as the character input screen 201 shown in FIG. 20(b). In FIG. 20(b), the circle images are displayed at the four corners of the character input screen 201, but the circle images may be displayed at other positions, as shown in FIG.
- the control unit 34A starts the imaging operation of the imaging device 60 (S32).
- Next, while viewing the character input screen 201 shown in FIG. 20(b) as the visible screen S for setting the reference data, the user performs a predetermined operation, for example a tap operation, with a finger on each numbered circle in the order indicated by the numbers. Here, the user performs the predetermined operation in order to notify the control unit 34A of the position being operated.
- Such user operations are imaged by the imaging device 60 .
- the imaging device 60 images an in-focus subject.
- The image processing unit 62 performs predetermined image processing on the image data obtained by the imaging, and the image data subjected to the image processing is transmitted from the imaging device 60 to the control unit 34A of the terminal 30A by wireless communication (S33).
- Next, the image data extraction unit 342 determines whether the subject is a finger based on the image data obtained by imaging with the imaging device 60, and extracts image data in which a finger exists (S34).
- As described above, the imaging device 60 sends the image data obtained by imaging the in-focus subject to the image data extraction unit 342. Therefore, the image data extraction unit 342 extracts image data in which the finger is at a position separated from the imaging device 60 by a substantially constant distance along the Z-axis direction.
- the operation determination unit 343 determines whether the operation with the finger is a predetermined operation (here, a tap operation) based on the image data extracted by the image data extraction unit 342 .
- The operation determination unit 343 performs such determination processing and determines whether the finger tap operations on all four circles have been properly recognized (S35). For example, when it determines within a predetermined period of time that the finger operation was a tap operation only once, twice, or three times, or when image data in which a finger exists is not sent from the image data extraction unit 342 within a predetermined time, the operation determination unit 343 determines that the finger tap operations on all four circles have not been properly recognized. When the operation determination unit 343 determines that the finger tap operations on all four circles have been properly recognized, it stores data on the content of the finger operations in the storage unit 33 and sends a signal to that effect to the display control unit 341.
- Upon receiving the signal, the display control unit 341 adds to the original screen M an image showing a green lamp indicating that the finger tap operations have been properly recognized, and displays it on the display device 20 (S36). At this time, the display control unit 341 may add to the original screen M, together with or instead of the image showing the green lamp, an image showing characters or figures indicating that the finger tap operations have been properly recognized.
- Next, the position data generation unit 344 generates position data (XY coordinates) of the finger (fingertip) within the imaging range of the imaging device 60 for each circle, based on the image data in which the operation determination unit 343 determined that the operation on that circle was the predetermined operation (S37). Then, the reference data generation unit 345 stores the four position data thus generated in the storage unit 33 as reference data regarding the currently displayed visible screen S (S38). Since the reference data specifies the position and size of the visible screen S, by using the reference data the control unit 34A can recognize the range in which the visible screen S viewed by the user exists within the imaging range of the imaging device 60. When the processing of step S38 is performed, the reference data setting process ends.
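The derivation of the screen's position and size from the four fingertip positions can be sketched as follows. This is a minimal illustration assuming an axis-aligned rectangle in camera coordinates; the function name and representation are assumptions, not the patent's exact method.

```python
# Hypothetical sketch of step S38: deriving reference data (position and
# size of the visible screen S within the camera's imaging range) from the
# four fingertip positions recorded at the numbered corner circles.

def make_reference_data(corner_points):
    """corner_points: four (x, y) fingertip positions, one per circle.
    Returns the screen's origin and size as ((x, y), (width, height))."""
    xs = [p[0] for p in corner_points]
    ys = [p[1] for p in corner_points]
    origin = (min(xs), min(ys))                       # top-left corner
    size = (max(xs) - min(xs), max(ys) - min(ys))     # width and height
    return origin, size
```

With this reference data, any later fingertip coordinate can be tested against the rectangle to decide whether it falls within the visible screen S.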
- On the other hand, if the operation determination unit 343 determines in step S35 that the finger tap operations on all four circles have not been properly recognized, it sends a signal to that effect to the display control unit 341. Then, the display control unit 341 adds to the original screen M an image showing a red lamp indicating that the finger tap operations have not been properly recognized, and displays it on the display device 20 (S39). When the user sees the image showing the red lamp, the user must again tap each circle with a finger on the visible screen S for setting the reference data. At this time, the display control unit 341 may add to the original screen M, together with or instead of the image showing the red lamp, an image showing characters or figures indicating that the finger tap operations have not been properly recognized.
- After step S39, the control unit 34A determines whether the current processing of step S35 was the first processing (S40). If it was the first processing, the flow proceeds to step S32. On the other hand, if it was not the first processing, the control unit 34A determines whether the current processing of step S35 was the second processing (S41). If it was the second processing, the flow proceeds to step S32; if it was not the second processing, the reference data setting process ends. That is, when the red lamp is displayed on the visible screen S, the user is given two more opportunities to perform the finger operations. If the finger operations are still not properly recognized, the reference data setting process may be executed again.
- In the above description, after the user performs finger operations on all four circles, the operation determination unit 343 determines whether each operation was a tap operation and whether the tap operations on all four circles were properly recognized. Alternatively, each time a finger operation is performed on a circle, the operation determination unit 343 may determine whether that operation was a tap operation and whether it was properly recognized. In this case, each time the operation determination unit 343 determines that the tap operation on a circle has been properly recognized, it is desirable that the display control unit 341 display on the original screen M an image indicating that the tap operation on that circle has been properly recognized, or, when the tap operation has not been properly recognized, an image indicating that it has not.
- Examples of the image indicating that the tap operation on each circle has been properly recognized include an image in which the circle is displayed in reverse video and an image in which the circle is displayed in green; an example of the image indicating that the tap operation on each circle has not been properly recognized is an image in which the circle is displayed in red.
- Position data may also be acquired by performing the predetermined operation with a finger at one, two, or three locations. In that case, data on the size of the visible screen S must be calculated in advance from the data of the original screen M corresponding to the visible screen S and stored in the storage unit 33. The acquired position data and the data on the size of the visible screen S then constitute the reference data.
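For the one-location variant, the reference data combines one measured point with the precomputed screen size. A minimal sketch under the illustrative assumption that the measured point is the screen's top-left corner:

```python
# Hypothetical sketch of the one-circle variant: one fingertip position is
# acquired, and the visible screen's size is precomputed from the original
# screen M and stored in advance. Treating the acquired point as the
# top-left corner is an assumption for illustration.

def reference_from_one_point(point, stored_size):
    """Combine one measured corner with the precomputed screen size."""
    return point, stored_size  # together these constitute the reference data

def screen_rect(reference):
    """Expand reference data into the screen's bounding rectangle."""
    (x, y), (w, h) = reference
    return (x, y, x + w, y + h)  # left, top, right, bottom
```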
- FIG. 21 is a flow chart for explaining the character input processing procedure using the visible screen S in the terminal device 2A of the third embodiment.
- It is assumed that the display device control program is being executed in the terminal 30A; that is, the display device 20, the communication unit 40, and the imaging device 60 are powered on, and communication among the terminal 30A, the display device 20, and the imaging device 60 is enabled.
- the user operates the terminal 30A to display the menu screen on the display unit 31. Then, the user taps the icon of the character input processing program on the menu screen to select the character input processing program.
- When the program is selected, the control unit 34A of the terminal 30A reads the character input processing program from the storage unit 33 and performs character input processing using the visible screen S according to the processing flow shown in FIG. 21. This character input processing may be executed automatically when the character input screen 200 as the original screen M is displayed on the display device 20.
- First, the control unit 34A displays the character input screen 200 as the original screen M on the display device 20 and determines whether reference data regarding the visible screen S corresponding to the original screen M is stored in the storage unit 33 (S51). If the reference data related to the visible screen S is not stored in the storage unit 33, the control unit 34A reads the reference data setting processing program from the storage unit 33 and performs the reference data setting process according to the processing flow shown in FIG. 19 (S52). After that, the flow proceeds to step S51.
- Here, the reference data setting process is executed when the reference data related to the visible screen S is not stored in the storage unit 33; however, even when the reference data related to the visible screen S is stored in the storage unit 33, the reference data setting process may be executed to regenerate the reference data when instructed by the user.
- the control section 34A starts the imaging operation of the imaging device 60 (S53).
- the user performs a predetermined operation, for example, a tap operation, with a finger on the keyboard image 210 of the character input screen 200 that is the visible screen S.
- Here, the user performs the predetermined operation in order to notify the control unit 34A of the position being operated.
- Such an operation by the user is imaged by the imaging device 60 and the obtained image data is sent to the image processing section 62 .
- the image processing unit 62 performs predetermined image processing on the image data, and the image data subjected to the image processing is sent from the imaging device 60 to the control unit 34A by wireless communication (S54).
- Next, the image data extraction unit 342 determines whether the subject is a finger based on the image data obtained by imaging with the imaging device 60, and extracts image data in which a finger exists (S55). That is, the image data extraction unit 342 extracts image data in which the finger is at a position separated from the imaging device 60 by a substantially constant distance along the Z-axis direction.
- the operation determination unit 343 determines whether the operation with the finger is a predetermined operation (here, a tap operation). This determination is made within a predetermined time. Then, if the operation with the finger is the tap operation, the operation determination unit 343 determines that the operation for character input has been recognized normally.
- When the operation determination unit 343 determines that the character input operation has been properly recognized (S56), it stores data on the content of the finger operation in the storage unit 33 and sends a signal to that effect to the display control unit 341.
- Upon receiving the signal, the display control unit 341 adds to the original screen M an image showing a green lamp indicating that the character input operation has been properly recognized, and displays it on the display device 20 (S58). At this time, the display control unit 341 may add to the original screen M, together with or instead of the image showing the green lamp, an image showing characters or figures indicating that the character input operation has been properly recognized.
- On the other hand, if the operation determination unit 343 determines in step S56 that the operation for character input has not been properly recognized within the predetermined time, it sends a signal to that effect to the display control unit 341. For example, the operation determination unit 343 also determines that the tap operation has not been properly recognized when image data in which a finger exists is not sent from the image data extraction unit 342 within the predetermined time.
- Upon receiving the signal, the display control unit 341 adds to the original screen M an image showing a red lamp indicating that the character input operation has not been properly recognized, and displays it on the display device 20 (S57). After that, the flow proceeds to step S62. At this time, the display control unit 341 may add to the original screen M, together with or instead of the image showing the red lamp, an image showing characters or figures indicating that the character input operation has not been properly recognized.
- After the processing of step S58, the position data generation unit 344 generates position data of the finger (fingertip) within the imaging range of the imaging device 60 based on the image data in which the operation determination unit 343 determined that the finger operation was a tap operation (S59). The finger position data thus generated is stored in the storage unit 33.
- Next, the input control unit 346 recognizes the content of the input instruction corresponding to the finger operation based on the data on the content of the finger operation obtained by the operation determination unit 343, the finger position data generated by the position data generation unit 344, the reference data related to the visible screen S stored in the storage unit 33, and the data related to the original screen M corresponding to the visible screen S stored in the storage unit 33 (S60). For example, when the user taps a character key image in the keyboard image 210 with a finger, the input control unit 346 determines which character key in the keyboard image 210 corresponds to the finger position obtained from the finger position data, and recognizes an input instruction for that character key.
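The key lookup in step S60 can be sketched as normalizing the fingertip position against the reference data and indexing into a keyboard grid. The grid layout, names, and parameters below are illustrative assumptions, not the patent's actual data structures.

```python
# Hypothetical sketch of step S60 for character input: using the reference
# data (screen origin and size in camera coordinates) to decide which key
# of a keyboard grid the fingertip tapped.

def key_at(finger_pos, screen_origin, screen_size, rows, cols, keys):
    """Return the key under finger_pos, or None if outside the keyboard."""
    fx = (finger_pos[0] - screen_origin[0]) / screen_size[0]
    fy = (finger_pos[1] - screen_origin[1]) / screen_size[1]
    if not (0.0 <= fx < 1.0 and 0.0 <= fy < 1.0):
        return None                      # tap fell outside the visible screen
    col = int(fx * cols)                 # column index within the keyboard
    row = int(fy * rows)                 # row index within the keyboard
    return keys[row][col]
```

Because the lookup works entirely in normalized coordinates, the same keyboard data serves any position or size of the visible screen recorded in the reference data.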
- After that, the input control unit 346 sends a signal regarding the content of the recognized input instruction to the display control unit 341, and the display control unit 341 displays the original screen M corresponding to the content of the input instruction on the display device 20 (S61).
- After step S61 or step S57, the control unit 34A determines whether it has received an instruction from the user to end character input using the visible screen S (S62). If the instruction to end character input has been received, the character input process using the visible screen S ends. On the other hand, if the instruction has not been received, the flow proceeds to step S53, and the character input process using the visible screen S continues. Note that the user operates the terminal 30A to give the instruction to end character input.
- FIG. 22 is a flowchart for explaining the procedure of screen display processing using the visible screen S in the terminal device 2A of the third embodiment.
- It is assumed that the display device control program is being executed in the terminal 30A; that is, the display device 20, the communication unit 40, and the imaging device 60 are powered on, and communication among the terminal 30A, the display device 20, and the imaging device 60 is enabled.
- the user operates the terminal 30A to display the menu screen on the display unit 31. Then, the user taps the icon of the screen display processing program on the menu screen to select the screen display processing program.
- When the program is selected, the control unit 34A of the terminal 30A reads the screen display processing program from the storage unit 33 and performs screen display processing using the visible screen S according to the processing flow shown in FIG. 22. This screen display processing may be executed automatically when the original screen M is displayed on the display device 20.
- the user causes the display device 20 to display a desired screen by operating the terminal 30A.
- Next, the control unit 34A determines whether reference data regarding the visible screen S corresponding to the displayed screen (original screen M) is stored in the storage unit 33 (S71). If the reference data related to the visible screen S is not stored in the storage unit 33, the control unit 34A reads the reference data setting processing program from the storage unit 33 and performs the reference data setting process according to the processing flow shown in FIG. 19 (S72). After that, the flow proceeds to step S71.
- Here, the reference data setting process is executed when the reference data related to the visible screen S is not stored in the storage unit 33; however, even when the reference data related to the visible screen S is stored in the storage unit 33, the reference data setting process may be executed to regenerate the reference data when instructed by the user.
- the control section 34A starts the imaging operation of the imaging device 60 (S73).
- a user performs a desired operation on the visible screen S with a finger.
- Such an operation by the user is imaged by the imaging device 60 and the obtained image data is sent to the image processing section 62 .
- the image processing unit 62 performs predetermined image processing on the image data, and the image data subjected to the image processing is sent from the imaging device 60 to the control unit 34A by wireless communication (S74).
- Next, the image data extraction unit 342 determines whether the subject is a finger based on the image data obtained by imaging with the imaging device 60, and extracts image data in which a finger exists (S75). That is, the image data extraction unit 342 extracts image data in which the finger is at a position separated from the imaging device 60 by a substantially constant distance along the Z-axis direction.
- Next, the operation determination unit 343 determines the content of the operation performed with the finger based on the extracted image data, and determines whether the finger operation has been properly recognized (S76). When the operation determination unit 343 determines that the finger operation has been properly recognized, it stores data on the content of the finger operation in the storage unit 33 and sends a signal indicating that the finger operation has been properly recognized to the display control unit 341.
- Upon receiving the signal, the display control unit 341 adds to the original screen M an image showing a green lamp indicating that the finger operation has been properly recognized, and displays it on the display device 20 (S78). Note that the display control unit 341 may add to the original screen M, together with or instead of the image showing the green lamp, an image showing characters or figures indicating that the finger operation has been properly recognized.
- On the other hand, if the operation determination unit 343 determines in step S76 that the finger operation has not been properly recognized, it sends a signal to that effect to the display control unit 341. For example, the operation determination unit 343 also determines that the operation has not been properly recognized when image data in which a finger exists is not sent from the image data extraction unit 342 within a predetermined time.
- Upon receiving the signal, the display control unit 341 adds to the original screen M an image showing a red lamp indicating that the finger operation has not been properly recognized, and displays it on the display device 20 (S77). After that, the flow proceeds to step S82. At this time, the display control unit 341 may add to the original screen M, together with or instead of the image showing the red lamp, an image showing characters or figures indicating that the finger operation has not been properly recognized.
- After the processing of step S78, the position data generation unit 344 generates position data of the finger (fingertip) within the imaging range of the imaging device 60 based on the image data for which the operation determination unit 343 determined the content of the finger operation (S79). The finger position data thus generated is stored in the storage unit 33.
- Next, the input control unit 346 recognizes the content of the instruction corresponding to the finger operation based on the data on the content of the finger operation obtained by the operation determination unit 343, the finger position data generated by the position data generation unit 344, the reference data related to the visible screen S stored in the storage unit 33, and the data related to the original screen M corresponding to the visible screen S stored in the storage unit 33 (S80). For example, when the user performs a double-tap operation with a finger on the visible screen S, the input control unit 346 identifies that the current operation is a double-tap operation and recognizes that an instruction to enlarge (or reduce) the original screen M has been given. After that, the input control unit 346 sends a signal regarding the content of the recognized instruction to the display control unit 341, and the display control unit 341 displays the original screen M corresponding to the content of the instruction on the display device 20 (S81).
- After step S81 or step S77, the control unit 34A determines whether an instruction to end the screen display operation using the visible screen S has been received from the user (S82). If the instruction has been received, the screen display process using the visible screen S ends. On the other hand, if the instruction has not been received, the flow proceeds to step S73, and the screen display process using the visible screen S continues. Note that the user operates the terminal 30A to give the instruction to end the screen display operation using the visible screen S.
- The terminal device of the third embodiment has the same functions and effects as those of the first embodiment. That is, in the terminal device of the third embodiment, since the terminal and the glasses as the wearable object are configured separately, an existing mobile terminal such as a smartphone or tablet terminal can be used as the terminal. Using an existing mobile terminal in this manner reduces the number of parts of the glasses as the wearable object and simplifies the structure of the glasses. Further, by using a commercially available smartphone or the like as the terminal, the user can operate a familiar device, which improves operability.
- In the terminal device of the third embodiment, when the user performs an operation with a finger on the visible screen, the input control unit of the terminal recognizes the content of the input instruction corresponding to the finger operation based on the data on the content of the operation determined by the operation determination unit, the finger position data generated by the position data generation unit, the reference data related to the visible screen stored in the storage unit, and the data related to the original screen corresponding to the visible screen stored in the storage unit, and controls the original screen displayed on the display device according to the content of the recognized input instruction. Therefore, the user can input an instruction corresponding to an operation by performing, on the visible screen that appears to float in the air, the same operation as on a screen displayed on an ordinary touch panel. Accordingly, by using the terminal device of the third embodiment, the user can easily and accurately perform various screen operations, such as character input and enlargement/reduction, by operating on the visible screen in the same way as with an ordinary smartphone or tablet terminal.
- Also in the third embodiment, it is desirable that the terminal have a function of controlling the imaging device to adjust its imaging range and a function of controlling the imaging device to adjust the depth of field, that is, the depth-direction range in which the subject is in focus. By using these functions, the target imaged by the imaging device can be limited to only the finger operating on the visible screen, thereby protecting the privacy of others.
- FIG. 23 is a schematic block diagram of a terminal device according to the fourth embodiment of the invention.
- parts having the same functions as those in the above-described third embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- A terminal device 2B includes spectacles 10 as an attachment worn on the user's head, a display device 20 provided on the spectacles 10, a terminal 30A that is separate from the spectacles 10 and has a display unit 31, a cable 50, and an imaging device 60 for imaging the front of the user.
- the cable 50 connects between the display device 20 and the terminal 30A and between the imaging device 60 and the terminal 30A.
- The main difference between the terminal device 2B of the fourth embodiment and the terminal device 2A of the third embodiment is that the display device 20 and the imaging device 60 are connected to the terminal 30A by the cable 50 instead of wirelessly. The other configurations of the terminal device 2B of the fourth embodiment are the same as those of the terminal device 2A of the third embodiment.
- the terminal 30A has a connection terminal (not shown) as an interface, and the cable 50 is connected to this connection terminal. Power to the display device 20 and the imaging device 60 is supplied from the terminal 30A via this cable 50.
- In the fourth embodiment as well, when the control unit 34A of the terminal 30A executes the display device control program, the content of the screen displayed on the display unit 31 is displayed on the display device 20 as the content of the original screen M.
- The process of displaying a screen on the display device 20 according to the display device control program in the terminal device 2B of the fourth embodiment is the same as the process in the terminal device 1B of the second embodiment, so a detailed description is omitted here.
- the user can perform the process of setting the reference data, the process of inputting characters using the visible screen S, or the process of displaying the screen using the visible screen S while the display device control program is running.
- These processing procedures in the terminal device 2B of the fourth embodiment are the same as the processing flows shown in FIGS. 19, 21, and 22 in the third embodiment. Therefore, detailed description thereof is omitted here.
- the terminal device of the fourth embodiment has the same functions and effects as the terminal device of the third embodiment.
- FIG. 24 is a schematic block diagram of a terminal device according to the fifth embodiment of the invention.
- The schematic perspective view of the terminal device of the fifth embodiment is substantially the same as that of the terminal device of the third embodiment shown in FIG. 13, except that some components are not shown. Therefore, FIG. 13 is also used here as a schematic perspective view of the terminal device of the fifth embodiment.
- parts having the same functions as those in the third embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- The terminal device 3A includes spectacles 10 as an attachment worn on the user's head, a display device 20 provided on the spectacles 10, a terminal 30A configured separately from the spectacles 10 and having a display unit 31, a communication unit 40, an imaging device 60 for imaging the front of the user, a microphone unit (sound input device) 70, and a speaker unit (sound output device) 80.
- The main difference between the terminal device 3A of the fifth embodiment and the terminal device 2A of the third embodiment is that the terminal device 3A includes the microphone unit 70 and the speaker unit 80, and that the terminal 30A controls the microphone unit 70 and the speaker unit 80.
- Other configurations of the terminal device 3A are the same as those of the third embodiment.
- The microphone unit 70 and the speaker unit 80 are provided on the temple portions of the glasses 10.
- In FIG. 13, the microphone unit 70 and the speaker unit 80 are not shown.
- the microphone unit 70 converts the user's voice into an electrical signal and outputs the electrical signal to the terminal 30A.
- An electrical signal representing voice input from the microphone section 70 is sent to the control section 34A of the terminal 30A via the communication section 40, and the control section 34A analyzes the content of the electrical signal.
- the speaker unit 80 is of a bone conduction type that converts an electric signal output from the terminal 30A into sound and transmits the sound to the user by vibration of bones.
- The speaker unit 80 is not limited to one that transmits sound to the user by bone vibration; a normal speaker, earphone, or headphone that transmits sound to the user through the user's ear may also be used.
- When voice is input from the microphone unit 70, the control unit 34A recognizes the content of the electrical signal representing the input voice and executes processing according to the recognized content. For example, when the user gives a voice instruction from the microphone unit 70 to display a desired screen, the display control unit 341 displays the instructed screen on the display unit 31. Also, when the user instructs by voice through the microphone unit 70 to execute a desired application program (for example, the display device control program, the reference data setting processing program, the character input processing program, or the screen display processing program), the control unit 34A reads out the designated application program from the storage unit 33 and executes it.
- Further, when the user instructs by voice from the microphone unit 70 to terminate the application program currently being executed (for example, the display device control program, the reference data setting processing program, the character input processing program, or the screen display processing program), the control unit 34A terminates execution of the instructed application program.
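The voice-control behavior described above (launching and terminating application programs according to recognized voice content) can be pictured as a simple command dispatch. The sketch below is illustrative only; the command table, matching rules, and return strings are assumptions for this sketch, not anything specified in the patent.

```python
# Hypothetical sketch of the control unit 34A's voice handling: recognized
# text is matched against a small table of application-program names, and
# the named program is started or terminated accordingly.

RUNNING = set()  # names of application programs currently executing

PROGRAMS = {
    "display device control",
    "reference data setting",
    "character input",
    "screen display",
}

def handle_voice_command(text: str) -> str:
    """Dispatch one recognized voice command; return a short status string."""
    text = text.lower()
    for name in PROGRAMS:
        if name in text:
            if "terminate" in text or "end" in text:
                RUNNING.discard(name)   # stop the named program
                return f"terminated {name}"
            RUNNING.add(name)           # start the named program
            return f"started {name}"
    return "unrecognized command"
```

In a real implementation the recognized text would come from a speech recognizer fed by the microphone unit 70; here the recognizer is simply assumed to produce a text string.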
- The control unit 34A also controls the sound emitted by the speaker unit 80. For example, when notifying the user of certain information, the control unit 34A can display the information on the display unit 31 and emit a sound corresponding to the information from the speaker unit 80.
- the terminal device of the fifth embodiment has the same functions and effects as those of the third embodiment.
- Moreover, since the microphone unit and the speaker unit are provided on the glasses, the user can give instructions to the terminal through the microphone unit without operating the terminal, and can obtain information from the terminal as sound from the speaker unit.
- FIG. 25 is a schematic block diagram of a terminal device according to the sixth embodiment of the present invention.
- parts having the same functions as those in the fifth embodiment described above are denoted by the same reference numerals, and detailed description thereof will be omitted.
- The terminal device 3B of the sixth embodiment includes spectacles 10 as an attachment worn on the user's head, a display device 20 provided on the spectacles 10, a terminal 30A configured separately from the spectacles 10 and having a display unit 31, a cable 50, an imaging device 60 for imaging the front of the user, a microphone unit 70, and a speaker unit 80.
- the cable 50 connects the display device 20, the imaging device 60, the microphone section 70, the speaker section 80, and the terminal 30A.
- The main difference between the terminal device 3B of the sixth embodiment and the terminal device 3A of the fifth embodiment is that the display device 20, the imaging device 60, the microphone unit 70, and the speaker unit 80 are connected to the terminal 30A not wirelessly but by wire using the cable 50.
- Other configurations of the terminal device 3B of the sixth embodiment are the same as those of the terminal device 3A of the fifth embodiment.
- the terminal 30A has a connection terminal (not shown) as an interface, and the cable 50 is connected to this connection terminal. Power to the display device 20, the imaging device 60, the microphone section 70, and the speaker section 80 is supplied from the terminal 30A through the cable 50.
- the terminal device of the sixth embodiment has the same functions and effects as those of the fifth embodiment.
- FIG. 26 is a schematic block diagram of a terminal device according to the seventh embodiment of the invention.
- The schematic perspective view of the terminal device of the seventh embodiment is substantially the same as the schematic perspective view of the terminal device of the third embodiment shown in FIG. 13. Therefore, FIG. 13 is also used here as a schematic perspective view of the terminal device of the seventh embodiment.
- parts having the same functions as those in the third embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- The terminal device 4A of the seventh embodiment includes spectacles 10, a display device 20 provided on the spectacles 10, a terminal 30B configured separately from the spectacles 10 and having a display unit 31, a communication unit 40, and an imaging device 60A for imaging the front of the user. The terminal 30B includes the display unit 31, a communication unit 32, a storage unit 33, and a control unit 34B; the control unit 34B includes a display control unit 341, an image data extraction unit 342B, an operation determination unit 343, a position data generation unit 344, a reference data generation unit 345, and an input control unit 346. Further, the imaging device 60A includes a camera unit 61, an image processing unit 62, and a camera control unit 63A having an autofocus control unit 631.
- The terminal device 4A of the seventh embodiment differs from the terminal device 2A of the third embodiment in that the camera control unit 63A of the imaging device 60A includes the autofocus control unit 631, and in that the image data extraction unit 342B extracts, from among the image data sent from the imaging device 60A to the terminal 30B by wireless communication, image data in which the subject is a finger and the finger is located at a substantially constant distance from the imaging device 60A along the Z-axis direction.
- the autofocus control section 631 controls the camera section 61 so as to automatically focus on a subject at a predetermined position within the imaging range.
- The imaging device 60A has a large number of focus points so that it can automatically focus on any position within the imaging range. Therefore, when the user performs an operation on the visible screen S with a finger, the imaging device 60A automatically focuses on the finger performing the operation and can image the finger in focus.
- When the autofocus control unit 631 captures an image of an automatically focused subject, it calculates distance data to the imaged subject. This calculated distance data is associated with the image data. The image data obtained by imaging with the imaging device 60A and the distance data associated with it are sent to the control unit 34B of the terminal 30B.
- As the autofocus method, any method may be used: an active method in which the subject is irradiated with infrared rays, ultrasonic waves, or the like and the distance is detected from the time until the reflected wave returns or from the irradiation angle, or a passive method such as a phase difference detection method or a contrast detection method that uses an image captured through the lens of the camera unit 61.
- When the user operates the visible screen S with a finger, the image data extraction unit 342B first extracts, from the image data obtained by the imaging device 60A imaging the in-focus subject, image data in which the subject is a finger, and then determines, based on the distance data associated with that image data, whether the subject is separated from the imaging device 60A by a substantially constant distance along the Z-axis direction. In this way, the image data extraction unit 342B extracts image data in which the subject is a finger and the finger is separated from the imaging device 60A by the substantially constant distance along the Z-axis direction. To determine whether the subject is a finger, a general image recognition method is used, as in the third embodiment.
- Here, the substantially constant distance refers to the distance in the Z-axis direction from the imaging device 60A to the position of the visible screen S recognized by the user.
- For example, the substantially constant distance is set to a distance within a range of about 40 cm ± 5 cm from the imaging device 60A.
- Accordingly, the image data extraction unit 342B excludes image data of a finger operating at a position much nearer to or farther from the imaging device than the position where the visible screen S is recognized, and can extract only image data of a finger performing an appropriate operation.
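The two-stage extraction just described, first checking that the subject is a finger and then checking that it lies within the substantially constant distance band, can be sketched as a filter over captured frames. The frame representation, the `is_finger` placeholder, and the 40 cm ± 5 cm band (taken from the example above) are assumptions for illustration:

```python
# Sketch of the extraction performed by the image data extraction unit 342B:
# keep a frame only if (1) the subject is recognized as a finger and
# (2) its associated autofocus distance lies within the substantially
# constant band (about 40 cm +/- 5 cm in the description's example).

def is_finger(frame) -> bool:
    """Placeholder for a general image recognition method."""
    return frame.get("label") == "finger"

def extract_finger_frames(frames, target_cm=40.0, tol_cm=5.0):
    """Return only the frames showing a finger within the distance band."""
    lo, hi = target_cm - tol_cm, target_cm + tol_cm
    return [
        f for f in frames
        if is_finger(f) and lo <= f["distance_cm"] <= hi
    ]
```

In the patent's arrangement the distance value would be the distance data that the autofocus control unit 631 associates with each image; here it is simply a field on the frame.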
- the operation determination unit 343, the position data generation unit 344, and the reference data generation unit 345 perform processing based on the image data extracted by the image data extraction unit 342B.
- When the user performs an operation with a finger at one or more predetermined positions on the visible screen S, the reference data generation unit 345 generates data relating to the visible screen S as reference data, using the finger position data generated by the position data generation unit 344 based on the image data for which the operation determination unit 343 has determined that the operation at each predetermined position is the predetermined operation. For example, when the user performs an operation with a finger at each of the four corners of the outer frame of the visible screen S, the finger position data at each of the four corners can be used as the reference data.
- Here, the image data extracted by the image data extraction unit 342B is obtained by imaging the finger at a position separated from the imaging device 60A by the substantially constant distance along the Z-axis direction. Therefore, the finger position data at each of the four corners can be regarded as representing positional information of the finger on a plane that is parallel to the XY plane (substantially parallel to the user's body) and separated from the imaging device 60A by the substantially constant distance along the Z-axis direction.
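As a concrete illustration of the four-corner case, the reference data could be reduced to the rectangle spanned by the four finger positions on that plane. The dict-of-edges representation below is an assumption for this sketch, not the patent's data format:

```python
# Hypothetical sketch: derive a rectangle describing the visible screen S
# from the finger position data (x, y) captured at its four corners, all
# lying on the plane parallel to the XY plane described above.

def reference_data_from_corners(corners):
    """corners: four (x, y) finger positions; returns an illustrative
    rectangle on the reference plane."""
    xs = [x for x, _ in corners]
    ys = [y for _, y in corners]
    return {
        "left": min(xs), "right": max(xs),
        "bottom": min(ys), "top": max(ys),
        "width": max(xs) - min(xs),
        "height": max(ys) - min(ys),
    }
```

Such a rectangle, together with the Z-axis distance at which it was captured, is the kind of information the reference data needs to carry so that later finger positions can be related to positions on the original screen M.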
- The flowchart for explaining the procedure of the reference data setting process in the terminal device 4A of the seventh embodiment is substantially the same as that of the third embodiment shown in FIG. 19.
- The reference data setting process in the seventh embodiment differs from that in the third embodiment in the processing in the imaging device 60A (steps S32 and S33) and in the image data extraction processing by the image data extraction unit 342B (step S34). Therefore, in the following, only the points of the reference data setting process in the seventh embodiment that differ from the third embodiment will be described using the flowchart shown in FIG. 19.
- It is assumed here that the display device control program is being executed in the terminal 30B, that is, that the display device 20, the communication unit 40, and the imaging device 60A are powered on and that communication among the terminal 30B, the display device 20, and the imaging device 60A is enabled.
- the control unit 34B starts the imaging operation of the imaging device 60A (S32).
- Next, the user, recognizing the character input screen 201 shown in FIG. 20B as the visible screen S for setting the reference data, performs a predetermined operation, for example a tap operation, with a finger on each numbered circle in numerical order.
- the user's operation is imaged by the imaging device 60A.
- the autofocus control unit 631 controls the camera unit 61 so as to automatically focus on the subject within the imaging range, and the imaging device 60A images the focused subject.
- When the autofocus control unit 631 captures an image of an automatically focused subject, it calculates distance data to the imaged subject and associates the calculated distance data with the image data.
- the image data obtained by this imaging is sent to the image processing section 62, and the image processing section 62 performs predetermined image processing on the image data. Then, the image data subjected to the image processing and the distance data associated therewith are sent from the imaging device 60A to the control section 34B of the terminal 30B by wireless communication (S33).
- In step S34, the image data extraction unit 342B first determines, based on the image data obtained by imaging with the imaging device 60A, whether the subject is a finger, and extracts the image data in which a finger is present. After that, based on the distance data associated with the extracted image data in which the finger is present, the image data extraction unit 342B determines whether the subject is separated from the imaging device 60A by the substantially constant distance along the Z-axis direction, and thereby extracts the image data in which the subject is a finger and the finger is separated from the imaging device 60A by the substantially constant distance along the Z-axis direction. In the reference data setting process of the seventh embodiment, the contents of each process after step S35 are the same as those of the third embodiment.
- The flowchart for explaining the character input processing procedure using the visible screen S in the terminal device 4A of the seventh embodiment is substantially the same as that of the third embodiment shown in FIG. 21.
- The character input processing using the visible screen S in the seventh embodiment differs from that in the third embodiment in the processing in the imaging device 60A (steps S53 and S54) and in the image data extraction processing by the image data extraction unit 342B (step S55). Therefore, in the following, only the points of the character input processing using the visible screen S in the seventh embodiment that differ from the third embodiment will be described using the flowchart shown in FIG. 21.
- It is assumed here that the display device control program is being executed in the terminal 30B, that is, that the display device 20, the communication unit 40, and the imaging device 60A are powered on and that communication among the terminal 30B, the display device 20, and the imaging device 60A is enabled.
- the control section 34B starts the imaging operation of the imaging device 60A (S53).
- the user performs a predetermined operation, such as a tap operation, with a finger on the keyboard image 210 of the character input screen 200 which is the visible screen S.
- the user's operation is imaged by the imaging device 60A.
- the autofocus control unit 631 controls the camera unit 61 so as to automatically focus on the subject within the imaging range, and the imaging device 60A images the focused subject. Also, when the autofocus control unit 631 captures an image of an automatically focused subject, it calculates distance data to the captured subject, and associates the calculated distance data with the image data.
- the image data obtained by this imaging is sent to the image processing section 62, and the image processing section 62 performs predetermined image processing on the image data. Then, the image data subjected to the image processing and the distance data associated therewith are sent from the imaging device 60A to the control section 34B of the terminal 30B by wireless communication (S54).
- In step S55, the image data extraction unit 342B first determines, based on the image data obtained by imaging with the imaging device 60A, whether the subject is a finger, and extracts the image data in which a finger is present. After that, based on the distance data associated with the extracted image data in which the finger is present, the image data extraction unit 342B determines whether the subject is separated from the imaging device 60A by the substantially constant distance along the Z-axis direction, and thereby extracts the image data in which the subject is a finger and the finger is separated from the imaging device 60A by the substantially constant distance along the Z-axis direction. In the character input processing of the seventh embodiment, the contents of each process after step S56 are the same as those of the third embodiment.
- The flowchart for explaining the procedure of the screen display processing using the visible screen S in the terminal device 4A of the seventh embodiment is substantially the same as that of the third embodiment shown in FIG. 22.
- The screen display processing using the visible screen S in the seventh embodiment differs from that in the third embodiment in the processing in the imaging device 60A (steps S73 and S74) and in the image data extraction processing by the image data extraction unit 342B (step S75). Therefore, in the following, only the points of the screen display processing using the visible screen S in the seventh embodiment that differ from the third embodiment will be described using the flowchart shown in FIG. 22.
- It is assumed here that the display device control program is being executed in the terminal 30B, that is, that the display device 20, the communication unit 40, and the imaging device 60A are powered on and that communication among the terminal 30B, the display device 20, and the imaging device 60A is enabled.
- the control unit 34B starts the imaging operation of the imaging device 60A (S73).
- a user performs a desired operation on the visible screen S with a finger.
- the user's operation is imaged by the imaging device 60A.
- the autofocus control unit 631 controls the camera unit 61 so as to automatically focus on the subject within the imaging range, and the imaging device 60A images the focused subject. Also, when the autofocus control unit 631 captures an image of an automatically focused subject, it calculates distance data to the captured subject, and associates the calculated distance data with the image data.
- the image data obtained by this imaging is sent to the image processing section 62, and the image processing section 62 performs predetermined image processing on the image data. Then, the image data subjected to the image processing and the distance data associated therewith are sent from the imaging device 60A to the control section 34B of the terminal 30B by wireless communication (S74).
- In step S75, the image data extraction unit 342B first determines, based on the image data obtained by imaging with the imaging device 60A, whether the subject is a finger, and extracts the image data in which a finger is present. After that, based on the distance data associated with the extracted image data in which the finger is present, the image data extraction unit 342B determines whether the subject is separated from the imaging device 60A by the substantially constant distance along the Z-axis direction, and thereby extracts the image data in which the subject is a finger and the finger is separated from the imaging device 60A by the substantially constant distance along the Z-axis direction. In the screen display processing of the seventh embodiment, the contents of each process after step S76 are the same as those of the third embodiment.
- the terminal device of the seventh embodiment has the same functions and effects as the terminal device of the third embodiment.
- Further, in the terminal device of the seventh embodiment, the imaging device has an autofocus control unit capable of automatically focusing on a subject; when an image of the automatically focused subject is captured, the autofocus control unit calculates distance data to the imaged subject and outputs the calculated distance data together with the image data obtained by the imaging. The imaging device can therefore image the finger (fingertip), which is the subject, while focusing on it more accurately, and the control unit can more accurately generate the reference data, perform the character input processing, and so on, based on the image data and the distance data obtained by the imaging.
- FIG. 27 is a schematic block diagram of a terminal device according to the eighth embodiment of the invention.
- parts having the same functions as those in the above-described seventh embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- The terminal device 4B of the eighth embodiment includes glasses 10, a display device 20 provided on the glasses 10, a terminal 30B configured separately from the glasses 10 and having a display unit 31, a cable 50, and an imaging device 60A for imaging the front of the user.
- the cable 50 connects between the display device 20 and the imaging device 60A and the terminal 30B.
- The main difference between the terminal device 4B of the eighth embodiment and the terminal device 4A of the seventh embodiment is that the display device 20 and the imaging device 60A are connected to the terminal 30B not wirelessly but by wire using the cable 50. Other configurations of the terminal device 4B of the eighth embodiment are the same as those of the terminal device 4A of the seventh embodiment.
- the terminal 30B has a connection terminal (not shown) as an interface.
- The cable 50 is connected to this connection terminal. Power to the display device 20 and the imaging device 60A is supplied from the terminal 30B via this cable 50.
- the terminal device of the eighth embodiment has the same functions and effects as those of the seventh embodiment.
- FIG. 28 is a schematic block diagram of a terminal device according to the ninth embodiment of the present invention.
- The schematic perspective view of the terminal device of the ninth embodiment is substantially the same as the schematic perspective view of the terminal device of the third embodiment shown in FIG. 13. Therefore, FIG. 13 is also used here as a schematic perspective view of the terminal device of the ninth embodiment.
- parts having the same functions as those in the seventh embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- The terminal device 5A of the ninth embodiment includes glasses 10, a display device 20 provided on the glasses 10, a terminal 30C configured separately from the glasses 10 and having a display unit 31, a communication unit 40, and an imaging device 60A for imaging the front of the user. The terminal 30C includes the display unit 31, a communication unit 32, a storage unit 33, and a control unit 34C; the control unit 34C includes a display control unit 341, an image data extraction unit 342B, an operation determination unit 343, a position data generation unit 344C, a reference data generation unit 345, an input control unit 346, and a deviation correction unit 347C.
- That is, the control unit 34C differs from the control unit 34B of the seventh embodiment in that it includes a deviation correction unit 347C.
- the plane corresponding to the visible screen S obtained based on the reference data regarding the visible screen S stored in the storage unit 33 is called the "reference screen K".
- When the user actually performs an operation, the screen being operated (hereinafter also referred to as the "operation screen T") may be recognized by the user as being positioned in front of or behind the reference screen K obtained based on the reference data, and the user may operate it with a finger at that position.
- The deviation correction unit 347C performs a process of converting the finger position data obtained by the user's finger operation on the operation screen T into position data on the reference screen K.
- The finger position data obtained by the user's finger operation on the operation screen T is generated by the position data generation unit 344C.
- FIG. 29 is a diagram for explaining the process of converting the X coordinate of the position data to the X coordinate of the position data on the reference screen K by the deviation correction unit 347C in the ninth embodiment.
- FIG. 30 is a diagram for explaining the process of converting the Y coordinate of the position data into the Y coordinate of the position data on the reference screen K by the deviation correction unit 347C in the ninth embodiment.
- FIGS. 29 and 30 show the case where the user recognizes that the operation screen T is located behind the reference screen K.
- In FIGS. 29 and 30, the point Cc is the center position of the camera unit 61, the point Mc is the center position of the original screen M, the point Ec is the center position of the user's pupil, the point pc is the center position of the reference screen K, and the point Pc is the center position of the operation screen T. The point Pc, the point pc, the point Mc, and the point Ec are on the same straight line.
- Moreover, W is the distance in the X-axis direction between the center position of the camera unit 61 and the center position of the original screen M, H is the distance in the Y-axis direction between the center position of the camera unit 61 and the center position of the original screen M, L is the distance in the Z-axis direction between the original screen M and the reference screen K, and α is the distance in the Z-axis direction between the user's pupil and the original screen M. The values of W, H, and α are stored in the storage unit 33 in advance, and the value of L is obtained when the reference data is generated and stored in the storage unit 33.
- Here, the position data generation unit 344C acquires, as the finger position data, the XY coordinates obtained when the actual finger position is projected onto the reference screen K. Therefore, the position data generation unit 344C calculates the position data of the point p1 as the position data of the point P.
- Note that the distance in the Z-axis direction between the point P and the original screen M is obtained from the distance data associated with the image data used when the position data of this point P was generated. Since the point p0 is the position on the reference screen K corresponding to the point P on the operation screen T, what the deviation correction unit 347C should do is obtain the position data of the point p0 from the position data of the point p1.
- Let the position coordinates of the point P be (X, Y), those of the point p0 be (x0, y0), those of the point pc be (xc, yc), those of the point Pc be (Xc, Yc), and those of the point p1 be (x1, y1).
- this position coordinate (xc, yc) is known and stored in the storage unit 33 .
- Let dX be the distance in the X-axis direction between the point Pd and the point P, and let dY be the distance in the Y-axis direction between the point Pd and the point P.
- The deviation correction unit 347C can obtain the position data (x0, y0) of the point p0 by substituting the value of the position data (x1, y1) of the point p1 generated by the position data generation unit 344C and the value of the distance Z in the Z-axis direction between the point P and the original screen M into the above formulas (1) and (2).
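Formulas (1) and (2) themselves are not reproduced in this excerpt, so the following is only an illustrative reconstruction of such a conversion under assumed geometry: the original screen M and the camera unit lie in the plane z = 0, the user's pupil lies at z = −α on the line through the screen centers, the reference screen K lies at z = L, and the finger point P lies at z = Z. Under those assumptions, p1 is the camera projection of P onto K and p0 is the pupil projection of P onto K:

```python
# Illustrative geometry only -- not the patent's formulas (1) and (2).
# Assumptions: camera center and original screen M in the plane z = 0,
# pupil at z = -alpha, reference screen K at z = L, finger point P at z = Z.

def to_reference_screen(x1, xc_cam, xe, Z, L, alpha):
    """Convert one coordinate of p1 (camera projection of P onto K) into
    the corresponding coordinate of p0 (pupil projection of P onto K).
    The same formula applies to the X and Y coordinates."""
    # Undo the camera projection to recover the finger coordinate X:
    #   p1 = Ccam + (L / Z) * (P - Ccam)  =>  X = Ccam + (Z / L) * (x1 - Ccam)
    X = xc_cam + (Z / L) * (x1 - xc_cam)
    # Project P through the pupil (at z = -alpha) onto the plane z = L:
    #   p0 = E + ((L + alpha) / (Z + alpha)) * (P - E)
    return xe + (L + alpha) / (Z + alpha) * (X - xe)
```

When Z equals L, the operation screen coincides with the reference screen and the conversion reduces to the identity, which provides a quick sanity check on the assumed geometry.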
- When the user performs an operation with a finger, the input control unit 346 recognizes the content of the input instruction corresponding to the finger operation based on the data regarding the content of the finger operation determined by the operation determination unit 343, the position data (x0, y0), the reference data relating to the reference screen K (visible screen S) stored in the storage unit 33, and the data relating to the original screen M corresponding to the visible screen S stored in the storage unit 33, and controls the original screen M displayed on the display device 20 according to the content of the recognized input instruction.
- the terminal device of the ninth embodiment has the same functions and effects as the terminal device of the seventh embodiment.
- Moreover, in the ninth embodiment, when the user gives an instruction on the visible screen with a finger, even if the user recognizes the position of the finger as being in front of or behind the reference screen K, that is, even if there is a deviation between the operation screen T and the reference screen K, the deviation correction unit obtains the position of the user's finger on the reference screen K, so the input control unit can accurately recognize the content of the instruction.
- In the ninth embodiment, the position data generation unit 344C acquires, as the finger position data, the XY coordinates obtained when the position of the finger actually operated by the user is projected onto the reference screen K. Similarly, the position data generation unit 344 may also acquire, as the finger position data, the XY coordinates obtained when the position of the finger actually operated by the user is projected onto the reference screen K.
- FIG. 31 is a schematic block diagram of a terminal device according to the tenth embodiment of the present invention.
- parts having the same functions as those in the above-described ninth embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- The terminal device 5B of the tenth embodiment includes spectacles 10, a display device 20 provided on the spectacles 10, a terminal 30C configured separately from the spectacles 10 and having a display unit 31, a cable 50, and an imaging device 60A for imaging the front of the user.
- the cable 50 connects between the display device 20 and the imaging device 60A and the terminal 30C.
- The main difference between the terminal device 5B of the tenth embodiment and the terminal device 5A of the ninth embodiment is that the display device 20 and the imaging device 60A are connected to the terminal 30C not wirelessly but by wire using the cable 50.
- Other configurations of the terminal device 5B of the tenth embodiment are the same as those of the terminal device 5A of the ninth embodiment.
- the terminal 30C has a connection terminal (not shown) as an interface.
- The cable 50 is connected to this connection terminal. Power to the display device 20 and the imaging device 60A is supplied from the terminal 30C via this cable 50.
- the terminal device of the tenth embodiment has the same functions and effects as those of the ninth embodiment.
- FIG. 32 is a schematic block diagram of a terminal device according to the eleventh embodiment of the present invention.
- parts having the same functions as those in the ninth embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- A terminal device 6A according to the eleventh embodiment includes spectacles 10, a display device 20 provided on the spectacles 10, a terminal 30C having a display unit 31 and configured separately from the spectacles 10, a communication unit 40, an imaging device 60A for imaging the front of the user, a microphone unit (sound input device) 70, and a speaker unit (sound output device) 80.
- the microphone section 70 and the speaker section 80 are the same as those in the fifth embodiment.
- The terminal device 6A of the eleventh embodiment differs from the terminal device 5A of the ninth embodiment mainly in that the terminal device 6A includes the microphone section 70 and the speaker section 80, that the control section 34C of the terminal 30C performs processing according to the content of the electric signal representing the voice input from the microphone section 70, and that the control section 34C controls the sound emitted from the speaker section 80.
- Other configurations of the terminal device 6A of the eleventh embodiment are the same as those of the terminal device 5A of the ninth embodiment.
- the terminal device of the eleventh embodiment has the same functions and effects as those of the ninth embodiment.
- Since the microphone section and the speaker section are provided on the spectacles, the user can give instructions to the terminal through the microphone section without operating the terminal directly, and can receive information from the terminal as sound from the speaker section.
- FIG. 33 is a schematic block diagram of a terminal device according to the twelfth embodiment of the present invention.
- parts having the same functions as those in the eleventh embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- As shown in FIG. 33, a terminal device 6B according to the twelfth embodiment includes spectacles 10, a display device 20 provided on the spectacles 10, a terminal 30C, a cable 50, an imaging device 60A for imaging the front of the user, a microphone section 70, and a speaker section 80.
- the cable 50 connects the display device 20, the imaging device 60A, the microphone section 70, the speaker section 80, and the terminal 30C.
- The main difference between the terminal device 6B of the twelfth embodiment and the terminal device 6A of the eleventh embodiment is that the display device 20, the imaging device 60A, the microphone unit 70, and the speaker unit 80 are connected to the terminal 30C not wirelessly but by wire using the cable 50.
- Other configurations of the terminal device 6B of the twelfth embodiment are the same as those of the terminal device 6A of the eleventh embodiment.
- the terminal 30C has a connection terminal (not shown) as an interface.
- a cable 50 is connected to this connection terminal. Power to the display device 20, the imaging device 60A, the microphone section 70, and the speaker section 80 is supplied from the terminal 30C via the cable 50.
- the terminal device of the twelfth embodiment has the same functions and effects as those of the eleventh embodiment.
- FIG. 34 is a schematic block diagram of a terminal device according to the thirteenth embodiment of the present invention.
- The schematic perspective view of the terminal device of the thirteenth embodiment is substantially the same as that of the terminal device of the third embodiment shown in FIG. 13. Therefore, FIG. 13 is also used here as the schematic perspective view of the terminal device of the thirteenth embodiment.
- parts having the same functions as those in the third embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- As shown in FIGS. 13 and 34, the terminal device 7A of the thirteenth embodiment includes spectacles 10, a display device 20 provided on the spectacles 10, a terminal 30D, a communication unit 40, and an imaging device 60A for imaging the front of the user.
- The terminal 30D includes a display unit 31, a communication unit 32, a storage unit 33, and a control unit 34D. The control unit 34D includes a display control unit 341, an image data extraction unit 342, an operation determination unit 343, a position data generation unit 344, a reference data generation unit 345D, an input control unit 346D, and a distance determination unit 348D.
- the imaging device 60A has a camera section 61, an image processing section 62, and a camera control section 63A. This imaging device 60A is the same as that in the seventh embodiment.
- The main differences between the terminal device 7A of the thirteenth embodiment and the terminal device 2A of the third embodiment are that the camera control section 63A includes an autofocus control section 631, that the reference data generation section 345D of the control unit 34D generates, as data relating to the visible screen (reference data), data that can specify its position and size in space, and that the control unit 34D includes a distance determination unit 348D which, when the user performs an operation on the visible screen S with a finger, determines whether the position of the finger is within a substantially constant distance from the plane representing the visible screen S obtained using the reference data.
- Other configurations of the terminal device 7A of the thirteenth embodiment are the same as those of the terminal device 2A of the third embodiment.
- The autofocus control section 631 is the same as that in the seventh embodiment, and controls the camera section 61 so as to automatically focus on a subject at a predetermined position within the imaging range. Therefore, when the user performs an operation on the visible screen S with a finger, the imaging device 60A can automatically focus on the finger performing the operation and image the finger in focus. Also, when an image of the automatically focused subject is captured, the autofocus control unit 631 calculates distance data to the imaged subject, and this calculated distance data is associated with the image data. The image data obtained by imaging with the imaging device 60A and the distance data associated with it are sent from the imaging device 60A to the control unit 34D by wireless communication.
- The reference data generation unit 345D generates, as data relating to the visible screen S, data specifying its position and size in three-dimensional space, using the position data of the finger at each predetermined position, generated by the position data generation unit 344 based on the image data for which the operation determination unit 343 determined that the operation at that position was the predetermined operation, together with the distance data associated with the image data used when generating the finger position data. This data is stored in the storage unit 33 as reference data.
- Specifically, for each of the three predetermined positions, coordinate information in the XYZ coordinate system (three-dimensional data) is constructed from the finger position data (two-dimensional position data) and the distance data (one-dimensional position data). The XYZ coordinate information (three-dimensional data) at the three positions can then be used as reference data. Further, using such reference data, the equation of the plane representing the visible screen S in the XYZ coordinate system can be calculated. In general, the plane representing the visible screen S specified in this way is not necessarily parallel to the XY plane. In the thirteenth embodiment, the plane corresponding to the visible screen S obtained based on the reference data regarding the visible screen S is referred to as the "reference screen".
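As an illustration of the calculation described above, the plane equation can be derived from three non-collinear (X, Y, Z) reference points with a cross product. This sketch is not from the patent; the sample coordinates are hypothetical and chosen so that the resulting plane is, as noted above, not parallel to the XY plane.

```python
import numpy as np

def plane_from_points(p1, p2, p3):
    """Return (n, d) for the plane n . x = d passing through three
    non-collinear 3D points, with n a unit normal vector."""
    p1, p2, p3 = (np.asarray(p, float) for p in (p1, p2, p3))
    n = np.cross(p2 - p1, p3 - p1)   # normal of the plane
    n = n / np.linalg.norm(n)        # normalize to unit length
    return n, float(np.dot(n, p1))

# Three hypothetical reference points measured at corners of the
# visible screen S (finger XY position + associated distance data).
n, d = plane_from_points((0, 0, 50), (30, 0, 52), (0, 20, 50))
# Each of the three reference points satisfies n . p = d.
```

Once `(n, d)` is stored as part of the reference data, any later finger position can be compared against this plane without re-measuring the screen.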
- Based on the finger position data generated by the position data generation unit 344 from the image data for which the operation determination unit 343 determined that the finger operation was performed, the distance data associated with that image data, and the plane (reference screen) corresponding to the visible screen S obtained from the reference data regarding the visible screen S, the distance determination unit 348D determines whether or not the finger is within a predetermined substantially constant distance from the reference screen.
- The substantially constant distance used when determining whether the finger is within that distance from the reference screen is a distance at which it can be recognized that the user is properly operating the visible screen S.
- the substantially constant distance is set to about 5 cm, for example.
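The check performed by the distance determination unit 348D then reduces to a point-to-plane distance comparison. The sketch below is illustrative only; the 5 cm threshold follows the example above, and representing the reference screen as a unit normal `n` and offset `d` is an assumption of this sketch, not a requirement of the patent.

```python
import numpy as np

THRESHOLD_CM = 5.0  # "substantially constant distance" from the example above

def finger_near_screen(finger, n, d, threshold=THRESHOLD_CM):
    """Return True if 3D point `finger` lies within `threshold` of the
    plane n . x = d, where n is a unit normal vector."""
    dist = abs(float(np.dot(np.asarray(finger, float), n)) - d)
    return dist <= threshold

# Hypothetical reference screen parallel to the XY plane, 50 cm away.
n = np.array([0.0, 0.0, 1.0])
d = 50.0

print(finger_near_screen((10.0, 5.0, 47.0), n, d))  # 3 cm away -> True
print(finger_near_screen((10.0, 5.0, 41.0), n, d))  # 9 cm away -> False
```

A finger within the threshold is treated as a valid operation on the visible screen S; one outside it is rejected, which is the branch leading back to the red-lamp step in the flowcharts below.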
- When the user performs an operation on the visible screen S with a finger and the distance determination unit 348D determines that the finger is within the substantially constant distance from the reference screen, the input control unit 346D recognizes the content of the input instruction corresponding to the finger operation based on the data regarding the content of the finger operation obtained by the operation determination unit 343, the finger position data generated by the position data generation unit 344 from the image data used in that determination, the distance data associated with that image data, the reference data relating to the visible screen S stored in the storage unit 33, and the data of the original screen M corresponding to the visible screen S stored in the storage unit 33. It then controls the original screen M displayed on the display device 20 according to the content of the recognized input instruction.
- the flowchart for explaining the procedure of the reference data setting process in the terminal device 7A of the thirteenth embodiment is substantially the same as that of the third embodiment shown in FIG.
- The reference data setting process in the thirteenth embodiment differs from that in the third embodiment in the processing in the imaging device 60A (steps S32 and S33) and in the reference data generation processing by the reference data generation unit 345D (step S38). Therefore, only these points of difference from the reference data setting process of the third embodiment are described below, using the flowchart shown in FIG.
- FIG. 35 is a diagram showing an example of the original screen M for setting reference data displayed during the process of setting reference data in the thirteenth embodiment.
- the original screen M for setting the reference data is the character input screen 201, and images showing circles and numbers are added at three predetermined positions among the four corner positions.
- In the example of FIG. 35(a), circle images are displayed at three predetermined positions among the four corners of the character input screen 201; as shown in FIG. 35(b), circle images may instead be displayed at three predetermined positions among the four corners of the keyboard image 210 of the character input screen 201.
- the control unit 34D starts the imaging operation of the imaging device 60A (S32).
- As shown in FIG. 35(a), the user sees, as the visible screen S for setting reference data, the character input screen 201 to which images showing circles and numbers have been added at three predetermined positions among the four corner positions, and performs a predetermined operation, such as a tap operation, with a finger on each numbered circle in numerical order.
- the user's operation is imaged by the imaging device 60A.
- the autofocus control unit 631 controls the camera unit 61 so as to automatically focus on the subject within the imaging range, and the imaging device 60A images the focused subject. Also, when the autofocus control unit 631 captures an image of an automatically focused subject, the autofocus control unit 631 calculates distance data to the captured subject, and associates the calculated distance data with the image data. The image data obtained by this imaging is sent to the image processing section 62, and the image processing section 62 performs predetermined image processing on the image data. Then, the image data subjected to the image processing and the distance data associated therewith are sent from the imaging device 60A to the control section 34D by wireless communication (S33).
- In step S38, the reference data generation unit 345D generates reference data relating to the currently displayed visible screen S using the finger position data at the three predetermined positions generated by the position data generation unit 344 in step S37 and the distance data associated with the image data used when generating that position data, and stores the reference data in the storage unit 33.
- FIG. 36 is a flow chart for explaining the character input processing procedure using the visible screen S in the terminal device 7A of the thirteenth embodiment.
- It is assumed that the display device control program is being executed in the terminal 30D; that is, the display device 20, the communication unit 40, and the imaging device 60A are powered on, and communication among the terminal 30D, the display device 20, and the imaging device 60A is enabled.
- the user operates the terminal 30D to display the menu screen on the display unit 31. Then, the user taps the icon of the character input processing program on the menu screen to select the character input processing program.
- The control unit 34D of the terminal 30D then reads the character input processing program from the storage unit 33 and performs character input processing using the visible screen S according to the processing flow shown in FIG. 36. This character input processing may also be executed automatically when the character input screen 200 as the original screen M is displayed on the display device 20.
- First, the control unit 34D displays the character input screen 200 as the original screen M on the display device 20, and determines whether or not reference data regarding the visible screen S corresponding to the original screen M is stored in the storage unit 33 (S121). If the reference data relating to the visible screen S is not stored in the storage unit 33, the control unit 34D reads the reference data setting processing program from the storage unit 33 and performs the reference data setting process (S122). After that, the process proceeds to step S121.
- Here, the reference data setting process is executed when reference data relating to the visible screen S is not stored in the storage unit 33; however, even when the reference data is already stored, the reference data may be generated again by executing the reference data setting process upon receiving an instruction from the user.
- Next, the control section 34D starts the imaging operation of the imaging device 60A (S123).
- the user performs a predetermined operation, such as a tap operation, with a finger on the keyboard image 210 of the character input screen 200 which is the visible screen S.
- the user's operation is imaged by the imaging device 60A.
- the autofocus control unit 631 controls the camera unit 61 so as to automatically focus on the subject within the imaging range, and the imaging device 60A images the focused subject. Also, when the autofocus control unit 631 captures an image of an automatically focused subject, it calculates distance data to the captured subject, and associates the calculated distance data with the image data.
- the image data obtained by this imaging is sent to the image processing section 62, and the image processing section 62 performs predetermined image processing on the image data. Then, the image data subjected to the image processing and the distance data associated therewith are sent from the imaging device 60A to the control section 34D by wireless communication (S124).
- Next, the image data extraction unit 342 determines whether the subject is a finger based on the image data obtained by imaging with the imaging device 60A, and extracts image data in which the finger is present (S125).
- Next, the operation determination unit 343 determines whether the finger operation is the predetermined operation (here, a tap operation). This determination is made within a predetermined time. If the finger operation is the tap operation, the operation determination unit 343 determines that the operation for character input has been recognized normally; otherwise, it determines that the operation for character input was not recognized normally (S126).
- When the operation determination unit 343 determines that the character input operation has been recognized normally, it stores data regarding the content of the finger operation in the storage unit 33 and sends a signal to that effect to the display control unit 341.
- When the display control unit 341 receives the signal, it adds to the original screen M an image showing a green lamp, which means that the character input operation has been recognized normally, and displays the screen on the display device 20 (S128).
- Note that the display control unit 341 may add to the original screen M, together with the image showing the green lamp or in place of that image, an image showing characters or a figure indicating that the character input operation has been properly recognized.
- On the other hand, if the operation determination unit 343 determines in step S126 that the character input operation was not correctly recognized within the predetermined time (for example, because the finger operation was not the tap operation), it sends a signal to that effect to the display control unit 341.
- When the display control unit 341 receives the signal, it adds to the original screen M an image showing a red lamp, which means that the character input operation was not recognized normally, and displays the screen on the display device 20 (S127). After that, the process proceeds to step S133.
- Note that the display control unit 341 may add to the original screen M, together with the image showing the red lamp or in place of that image, an image showing characters or a figure indicating that the character input operation was not properly recognized.
- Next, the position data generation unit 344 generates position data of the finger (fingertip) within the imaging range of the imaging device 60A based on the image data for which the operation determination unit 343 determined that the finger operation was the tap operation (S129). The generated finger position data is stored in the storage unit 33.
- Next, the distance determination unit 348D determines, based on the finger position data generated by the position data generation unit 344, the distance data associated with the image data used to generate that position data, and the reference data relating to the visible screen S stored in the storage unit 33, whether or not the finger is within the predetermined substantially constant distance from the plane (reference screen) corresponding to the visible screen S (S130).
- If the distance determination unit 348D determines that the finger is farther from the reference screen than the substantially constant distance, it determines that the user is not properly operating the visible screen S, and the process proceeds to step S127.
- In step S130, when the distance determination unit 348D determines that the finger is within the substantially constant distance from the reference screen, it recognizes that the user is properly operating the visible screen S, and the process proceeds to step S131.
- In step S131, the input control unit 346D recognizes the content of the input instruction corresponding to the finger operation based on the data regarding the content of the finger operation determined by the operation determination unit 343, the finger position data generated by the position data generation unit 344, the distance data associated with the image data used in generating that position data, the reference data relating to the visible screen S stored in the storage unit 33, and the data of the original screen M corresponding to the visible screen S stored in the storage unit 33. For example, when the user taps a character key image on the keyboard image 210 with a finger, the input control unit 346D identifies which character key on the keyboard image 210 corresponds to the finger position obtained from the finger position data.
- Then, the input control unit 346D sends a signal regarding the content of the recognized input instruction to the display control unit 341, and the display control unit 341 displays on the display device 20 the original screen M corresponding to the content of the input instruction (S132).
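The key identification in step S131 amounts to a hit test of the finger position (in reference-screen coordinates) against the key rectangles of the keyboard image 210. The patent does not specify this logic; the layout, coordinate units, and function names below are hypothetical.

```python
# Hypothetical key layout: each key maps to its (x, y, width, height)
# rectangle in the reference-screen coordinate system.
KEYS = {
    "a": (0, 0, 10, 10),
    "b": (10, 0, 10, 10),
    "c": (20, 0, 10, 10),
}

def key_at(x, y, keys=KEYS):
    """Return the character whose key rectangle contains (x, y),
    or None if the position falls outside every key."""
    for ch, (kx, ky, w, h) in keys.items():
        if kx <= x < kx + w and ky <= y < ky + h:
            return ch
    return None

print(key_at(14.0, 3.0))   # finger over the second key -> "b"
print(key_at(50.0, 50.0))  # outside the keyboard -> None
```

Combining this hit test with the distance check sketched earlier gives the full decision: the tap is accepted only if the finger is near the reference screen, and the tapped character is whatever key rectangle contains the projected position.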
- After step S132 or step S127, the control unit 34D determines whether or not an instruction to end character input using the visible screen S has been received from the user (S133). If the instruction to end character input has been received, the character input process ends. Otherwise, the process proceeds to step S123, and character input processing using the visible screen S continues. Note that the user gives the instruction to end character input by operating the terminal 30D.
- FIG. 37 is a flowchart for explaining the procedure of screen display processing using the visible screen S in the terminal device 7A of the thirteenth embodiment.
- It is assumed that the display device control program is being executed in the terminal 30D; that is, the display device 20, the communication unit 40, and the imaging device 60A are powered on, and communication among the terminal 30D, the display device 20, and the imaging device 60A is enabled.
- the user operates the terminal 30D to display the menu screen on the display unit 31. Then, the user taps the icon of the screen display processing program on the menu screen to select the screen display processing program.
- The control unit 34D of the terminal 30D then reads the screen display processing program from the storage unit 33 and performs screen display processing using the visible screen S according to the processing flow shown in FIG. 37. This screen display processing may also be executed automatically when the original screen M is displayed on the display device 20.
- the control unit 34D determines whether or not reference data regarding the visible screen S corresponding to the displayed screen (original screen M) is stored in the storage unit 33 (S141). If the reference data regarding the visible screen S is not stored in the storage unit 33, the control unit 34D reads out the reference data setting processing program from the storage unit 33, and performs the reference data setting processing (S142). After that, the process proceeds to step S141.
- Here, the reference data setting process is executed when reference data relating to the visible screen S is not stored in the storage unit 33; however, even when the reference data is already stored, the reference data may be generated again by executing the reference data setting process upon receiving an instruction from the user.
- the control section 34D starts the imaging operation of the imaging device 60A (S143).
- the user performs a predetermined operation, such as a tap operation, with a finger on the keyboard image 210 of the character input screen 200 which is the visible screen S.
- the user's operation is imaged by the imaging device 60A.
- the autofocus control unit 631 controls the camera unit 61 so as to automatically focus on the subject within the imaging range, and the imaging device 60A images the focused subject. Also, when the autofocus control unit 631 captures an image of an automatically focused subject, it calculates distance data to the captured subject, and associates the calculated distance data with the image data.
- the image data obtained by this imaging is sent to the image processing section 62, and the image processing section 62 performs predetermined image processing on the image data. Then, the image data subjected to the image processing and the distance data associated therewith are sent from the imaging device 60A to the control section 34D by wireless communication (S144).
- Next, the image data extraction unit 342 determines whether the subject is a finger based on the image data obtained by imaging with the imaging device 60A, and extracts image data in which the finger is present (S145).
- the operation determination unit 343 determines the details of the operation performed by the finger. Then, the operation determination unit 343 determines whether or not the finger operation is normally recognized (S146).
- When the operation determination unit 343 determines that the finger operation has been recognized normally, it stores data regarding the content of the finger operation in the storage unit 33 and sends a signal indicating that the finger operation has been recognized normally to the display control unit 341.
- When the display control unit 341 receives the signal, it adds to the original screen M an image showing a green lamp indicating that the finger operation has been properly recognized, and displays the screen on the display device 20 (S148). Note that the display control unit 341 may add to the original screen M, together with the image showing the green lamp or in place of that image, an image showing characters or a figure indicating that the finger operation has been properly recognized.
- On the other hand, if the operation determination unit 343 determines in step S146 that the finger operation was not properly recognized (for example, because the tap operation could not be recognized normally), it sends a signal to that effect to the display control unit 341.
- the display control unit 341 adds to the original screen M an image showing a red lamp indicating that the finger operation was not properly recognized, and displays it on the display device 20 (S147).
- After that, the process proceeds to step S153.
- Note that the display control unit 341 may add to the original screen M, together with the image showing the red lamp or in place of that image, an image showing characters or a figure indicating that the finger operation was not properly recognized.
- After step S148, the position data generation unit 344 generates position data of the finger (fingertip) within the imaging range of the imaging device 60A based on the image data for which the operation determination unit 343 determined the content of the finger operation (S149). The generated finger position data is stored in the storage unit 33.
- Next, the distance determination unit 348D determines, based on the finger position data generated by the position data generation unit 344, the distance data associated with the image data used to generate that position data, and the reference data relating to the visible screen S stored in the storage unit 33, whether or not the finger is within the predetermined substantially constant distance from the plane (reference screen) corresponding to the visible screen S (S150).
- If the distance determination unit 348D determines that the finger is farther from the reference screen than the substantially constant distance, it determines that the user is not properly operating the visible screen S, and the process proceeds to step S147.
- In step S150, when the distance determination unit 348D determines that the finger is within the substantially constant distance from the reference screen, it recognizes that the user is properly operating the visible screen S, and the process proceeds to step S151.
- In step S151, the input control unit 346D recognizes the content of the input instruction corresponding to the finger operation based on the data regarding the content of the finger operation determined by the operation determination unit 343, the finger position data generated by the position data generation unit 344, the distance data associated with the image data used in generating that position data, the reference data relating to the visible screen S stored in the storage unit 33, and the data of the original screen M corresponding to the visible screen S stored in the storage unit 33. For example, when the user performs a double-tap operation with a finger on the visible screen S, the input control unit 346D identifies that the current operation is a double-tap operation and recognizes that an instruction to enlarge (or reduce) the original screen M has been given. After that, the input control unit 346D sends a signal regarding the content of the recognized instruction to the display control unit 341, and the display control unit 341 displays on the display device 20 the original screen M corresponding to the content of the instruction (S152).
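Distinguishing the double-tap mentioned in step S151 from single taps can be sketched as grouping tap timestamps by a maximum inter-tap interval. This is illustrative only; the 0.4 s window is an assumption for the sketch, not a value from the patent.

```python
DOUBLE_TAP_WINDOW_S = 0.4  # assumed maximum interval between two taps

def classify_taps(timestamps, window=DOUBLE_TAP_WINDOW_S):
    """Collapse a sorted list of tap timestamps (seconds) into a list of
    'double_tap' / 'tap' events, pairing taps closer than `window`."""
    events, i = [], 0
    while i < len(timestamps):
        if i + 1 < len(timestamps) and timestamps[i + 1] - timestamps[i] <= window:
            events.append("double_tap")  # two taps close together
            i += 2
        else:
            events.append("tap")         # isolated tap
            i += 1
    return events

print(classify_taps([0.0, 0.3, 2.0]))  # -> ['double_tap', 'tap']
```

A recognized `double_tap` event would then be mapped to the enlarge (or reduce) instruction for the original screen M, while a single `tap` is handled as an ordinary selection.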
- After step S152 or step S147, the control unit 34D determines whether or not an instruction to end the screen display operation using the visible screen S has been received from the user (S153). If the instruction to end the operation has been received, the screen display processing ends. Otherwise, the process proceeds to step S143, and screen display processing using the visible screen S continues. Note that the user gives the instruction to end the screen display operation by operating the terminal 30D.
- the terminal device of the thirteenth embodiment has the same functions and effects as the terminal device of the third embodiment.
- In addition, since the reference data generation unit generates, as the reference data, data that can specify the position and size of the visible screen S in space, reference data matching the user's habits can be generated even for a user who, when specifying the visible screen S with a finger, for example, operates the two corners on the left side of the visible screen S on the near side and the two corners on the right side on the far side.
- FIG. 38 is a schematic block diagram of a terminal device according to the fourteenth embodiment of the present invention.
- parts having the same functions as those in the thirteenth embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- The terminal device 7B of the fourteenth embodiment includes spectacles 10, a display device 20 provided on the spectacles 10, a terminal 30D having a display unit 31 and configured separately from the spectacles 10, a cable 50, and an imaging device 60A for imaging the front of the user.
- the cable 50 connects between the display device 20 and the imaging device 60A and the terminal 30D.
- The main difference between the terminal device 7B of the fourteenth embodiment and the terminal device 7A of the thirteenth embodiment is that the display device 20 and the imaging device 60A are connected to the terminal 30D not wirelessly but by wire using the cable 50.
- Other configurations of the terminal device 7B of the fourteenth embodiment are the same as those of the terminal device 7A of the thirteenth embodiment.
- the terminal 30D has a connection terminal (not shown) as an interface.
- a cable 50 is connected to this connection terminal. Power to the display device 20 and the imaging device 60A is supplied from the terminal 30D via this cable 50.
- the terminal device of the fourteenth embodiment has the same actions and effects as those of the thirteenth embodiment.
- FIG. 39 is a schematic block diagram of a terminal device according to the fifteenth embodiment of the present invention.
- the same reference numerals are assigned to elements having the same functions as those of the thirteenth embodiment, and detailed description thereof will be omitted.
- a terminal device 8A includes spectacles 10, a display device 20 provided on the spectacles 10, a terminal 30D that has a display unit 31 and is configured separately from the spectacles 10, a communication unit 40, an imaging device 60A for imaging the front of the user, a microphone unit 70, and a speaker unit 80.
- the microphone section 70 and the speaker section 80 are the same as those in the fifth embodiment.
- the terminal device 8A of the fifteenth embodiment differs from the terminal device 7A of the thirteenth embodiment mainly in that the terminal device 8A includes a microphone unit 70 and a speaker unit 80, that the control unit 34D of the terminal 30D executes processing according to the content of the electrical signal representing the voice input from the microphone unit 70, and that the control unit 34D controls the sound emitted from the speaker unit 80.
- Other configurations of the terminal device 8A of the fifteenth embodiment are the same as those of the terminal device 7A of the thirteenth embodiment.
- the terminal device of the fifteenth embodiment has the same functions and effects as those of the thirteenth embodiment.
- since the microphone and the speaker are provided on the glasses, the user can give instructions to the terminal through the microphone without operating the terminal directly, and can receive the terminal's output as sound from the speaker unit.
- FIG. 40 is a schematic block diagram of a terminal device according to the sixteenth embodiment of the present invention.
- the same reference numerals are given to the parts having the same functions as those of the fifteenth embodiment described above, and detailed description thereof will be omitted.
- a terminal device 8B includes spectacles 10, a display device 20 provided on the spectacles 10, a terminal 30D that has a display unit 31 and is configured separately from the spectacles 10, a cable 50, an imaging device 60A for imaging the front of the user, a microphone unit 70, and a speaker unit 80.
- the cable 50 connects the display device 20, the imaging device 60A, the microphone section 70, the speaker section 80, and the terminal 30D.
- the main difference between the terminal device 8B of the sixteenth embodiment and the terminal device 8A of the fifteenth embodiment is that the display device 20, the imaging device 60A, the microphone unit 70, and the speaker unit 80 are connected to the terminal 30D not wirelessly but by wire, using the cable 50.
- Other configurations of the terminal device 8B of the sixteenth embodiment are the same as those of the terminal device 8A of the fifteenth embodiment.
- the terminal 30D has a connection terminal (not shown) as an interface.
- a cable 50 is connected to this connection terminal. Power to the display device 20, the imaging device 60A, the microphone section 70, and the speaker section 80 is supplied from the terminal 30D via the cable 50.
- the terminal device of the sixteenth embodiment has the same functions and effects as those of the fifteenth embodiment.
- FIG. 41(a) is a schematic plan view of the terminal device according to the seventeenth embodiment of the present invention
- FIG. 41(b) is a schematic right side view of the terminal device.
- FIG. 42 is a schematic perspective view of the terminal device of the seventeenth embodiment.
- FIG. 43 is a schematic block diagram of the terminal device of the seventeenth embodiment.
- parts having the same functions as those in the above-described third embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- the terminal device 9 includes spectacles 10 as an attachment worn on the user's head, and a display device 20 provided on the spectacles 10.
- the terminal device 9 also includes a terminal 30E that has a display unit 31 and is configured separately from the glasses 10, a communication unit 40, an imaging device 60B for imaging the user's eyes, a microphone unit (sound input device) 70, a speaker unit (sound output device) 80, and a touch pad unit 90.
- the main differences between the terminal device 9 of the seventeenth embodiment and the terminal device 2A of the third embodiment are that the terminal device 9 uses not the imaging device 60 for imaging the front of the user but the imaging device 60B for imaging the user's eyes, that the terminal 30E recognizes the content of the operation performed by the user on the visible screen S based on the image data captured by the imaging device 60B, and that the microphone unit 70, the speaker unit 80, and the touch pad unit 90 are provided.
- Other configurations of the terminal device 9 of the seventeenth embodiment are the same as those of the terminal device 2A of the third embodiment.
- the imaging device 60B is provided on the handle portion of the spectacles 10, as shown in FIGS.
- the imaging device 60B is arranged in the housing 100 together with the small projector 21 and the optical system 22 of the display device 20.
- the imaging device 60B includes a camera section 61, an image processing section 62, and a camera control section 63, as shown in FIG.
- the imaging device 60B captures an image of (at least part of) the user's eye, and thereby obtains images of the original screen reflected in the user's eye and of the finger when the user operates the visible screen S with the finger.
- the original screen displayed on the display device 20 and the fingers of the user operating on the visible screen are usually reflected in the part of the eye that can be seen from the outside, that is, the part composed of the colored iris and the pupil.
- the imaging device 60B therefore actually captures an image of the iris and pupil (or part thereof) of the user's eye, and thereby obtains the images of the original screen and the finger.
- the imaging device 60B is configured to bring the user's eyes into focus in advance.
- the imaging device 60B is attached to the spectacles 10 so that the user's eyes are positioned on the optical axis of the lens of the camera unit 61, and the focus position is adjusted so that the eyes are in focus.
- the imaging range which is the range that can be imaged by the imaging device 60B, is a range that includes at least part of the eye.
- FIG. 44(a) is a diagram showing an example of the original screen reflected in the eyes
- FIG. 44(b) is a diagram showing an example of the imaging range of the imaging device 60B in the terminal device 9 of the seventeenth embodiment. Since the imaging device 60B is focused on the user's eyes in this way, when the user operates the visible screen S with a finger, the imaging device 60B can capture an in-focus image of the original screen reflected in the user's eyes and of the finger.
- Image data obtained by imaging with the imaging device 60B is transmitted from the imaging device 60B to the terminal 30E by wireless communication and stored in the terminal 30E.
- the imaging device 60B in the seventeenth embodiment has a still image shooting function and a moving image shooting function, so that both still image data and moving image data can be obtained as needed.
- the glasses 10 are provided with a microphone section 70, a bone conduction speaker section 80, and a touch pad section 90, as shown in FIG. In FIGS. 41 and 42, these parts are omitted for the sake of simplification.
- the microphone unit 70 is an input device for the terminal 30E, converts the user's voice into an electrical signal, and outputs the electrical signal to the terminal 30E.
- the terminal 30E can be operated by the user's voice instruction.
- An electrical signal representing voice input from the microphone section 70 is sent to the control section 34E of the terminal 30E via the communication section 40, and the control section 34E analyzes the content of the electrical signal.
- the speaker unit 80 is an output device of the terminal 30E, and is of a bone conduction type that converts an electrical signal output from the terminal 30E into sound and transmits the sound to the user by vibration of bones.
- the speaker unit 80 is not limited to one that uses bone vibration to transmit sound to the user, and it is also possible to use a normal speaker, earphone, headphone, or the like that transmits sound to the user through the user's ears.
- the touch pad unit 90 is an input device for the terminal 30E, and gives various instructions to the terminal 30E by the user's touch operation.
- FIG. 45 is a diagram for explaining the mounting location of the touch pad section 90
- FIG. 46 is a diagram for explaining the configuration of the touch pad section 90.
- the touch pad unit 90 is attached to the handle on the right side of the spectacles 10 as shown in FIGS. 45(a) and 45(b), or to the lower portion of the right eye lens portion 11 as shown in FIG. 45(c).
- FIG. 45(a) is a schematic perspective view of a terminal device 9 having a touch pad section 90 attached to a handle
- FIG. 45(b) is a schematic side view of the terminal device 9.
- FIG. 45(c) is a schematic perspective view of the terminal device 9 having the touch pad section 90 attached to the lens section.
- the touch pad unit 90 is fixedly attached to the relevant portion, but may instead be detachably attached to the spectacles (wear) 10.
- as the touch pad unit 90, one having a mouse function as shown in FIG. 46(a) may be used; alternatively, one having a simple keyboard function as shown in FIG. 46(b), or one having both a mouse function and a simple keyboard function as shown in FIG. 46(c), may be used.
- the touch pad section 90 is not limited to these, and may be one having a number panel, an operation panel, or the like.
- the terminal 30E has a display section 31, a communication section 32, a storage section 33, and a control section 34E.
- the storage unit 33 stores, for example, a display device control program for controlling the display device 20 so that the content of the screen displayed on the display unit 31 is displayed on the display device 20, a character input processing program for performing character input processing based on operations performed on the character input screen 200, and a screen display processing program for performing screen display processing such as enlargement, reduction, and switching of the corresponding original screen M.
- the data stored in the storage unit 33 includes, for example, image data of the various original screens M and data related to each original screen M (specifically, the size, shape, content, configuration, and the like of the original screen M).
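As an illustration only, the per-screen data held in the storage unit 33 could be modeled as a small record type; the field names and the example region are assumptions, not part of the patent.

```python
from dataclasses import dataclass, field

# Hypothetical sketch of the per-screen data the storage unit 33 might hold:
# size, shape, content, and configuration of each original screen M.
@dataclass
class OriginalScreenData:
    screen_id: str
    width_px: int          # size of the original screen M
    height_px: int
    shape: str = "rect"    # shape of the screen region
    content: str = ""      # e.g. "character_input_screen"
    # configuration: named sub-regions (e.g. key images) as (x, y, w, h) rectangles
    regions: dict = field(default_factory=dict)

storage = {}  # stands in for the storage unit 33

def store_screen(data: OriginalScreenData) -> None:
    storage[data.screen_id] = data

store_screen(OriginalScreenData("character_input_200", 640, 360,
                                content="character_input_screen",
                                regions={"key_A": (10, 200, 40, 40)}))
print(storage["character_input_200"].regions["key_A"])
```

The `regions` dictionary is what later lets an input controller map a tapped position back to a named key image.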
- the control unit 34E includes a central processing unit (CPU) and the like, performs overall control of the terminal 30E, and also controls the display device 20 and the imaging device 60B. For example, when the user performs a touch operation on the display unit 31 or operates the touch pad unit 90, the control unit 34E recognizes the content of the instruction given by the operation and executes processing according to the recognized content. When voice is input from the microphone unit 70, the control unit 34E recognizes the content of the electrical signal representing the input voice and executes processing according to the recognized content. The control unit 34E also controls the sound emitted by the speaker unit 80.
- by executing the display device control program, the control unit 34E controls the display device 20 so that the content of the screen displayed on the display unit 31 is displayed on the display device 20 as the content of the original screen M.
- the control unit 34E includes a display control unit 341, an image data extraction unit 342, an operation determination unit 343, an input control unit 345E, and an operation position specifying unit 349E.
- the display control unit 341 controls display on the display unit 31 and the display device 20. Specifically, when the user instructs activation of the display device control program, the display control unit 341 executes the display device control program stored in the storage unit 33, so that the content of the screen displayed on the display unit 31 is displayed on the display device 20 as the content of the original screen M. Thereby, the user wearing the spectacles 10 can see the visible screen S corresponding to the original screen M as if it were floating in the air.
- when the imaging device 60B captures an image of the user's eyes, the image data extraction unit 342 extracts, from the series of image data obtained by the imaging, the image data in which a finger is present. A general image recognition method is used to determine whether a finger exists in the image data. The operation determination unit 343 and the operation position specifying unit 349E then perform processing based on the image data extracted by the image data extraction unit 342.
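The extraction step above can be sketched as a simple filter over captured frames. This is a minimal sketch: `contains_finger` stands in for the "general image recognition method" the text mentions and is a hypothetical stub, not a real detector.

```python
# Minimal sketch of the image data extraction unit 342: from a series of
# captured frames, keep only those in which a finger is detected.
# `contains_finger` is a hypothetical stub standing in for a general
# image-recognition method (e.g. a learned hand detector).

def contains_finger(frame) -> bool:
    # Stub: a real implementation would run image recognition on the frame.
    return frame.get("finger", False)

def extract_finger_frames(frames):
    """Return only the frames in which a finger is present."""
    return [f for f in frames if contains_finger(f)]

frames = [{"t": 0.0}, {"t": 0.1, "finger": True},
          {"t": 0.2, "finger": True}, {"t": 0.3}]
print(len(extract_finger_frames(frames)))
```

Downstream units then operate only on the filtered frames, which keeps the operation determination and position-specifying steps cheap.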
- when the imaging device 60B captures images of the original screen M and the finger reflected in the user's eyes, the operation determination unit 343 determines, based on the series of image data extracted by the image data extraction unit 342, what kind of operation among the various kinds of operations the operation with the finger is. For this determination, for example, a general image recognition method is used. Thereby, the operation determination unit 343 can recognize which operation is performed by the finger, such as a tap operation, a double-tap operation, or a long-press operation. Data on the content of the recognized finger operation is stored in the storage unit 33. In order for the operation determination unit 343 to recognize the operation content accurately, it is desirable that the user perform the various touch operations on the visible screen S slowly and with slightly exaggerated motions.
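One way such a determination could work is to classify the timing of finger-contact intervals recovered from the extracted frames. The sketch below is an assumption: the patent does not specify thresholds, so `TAP_MAX` and `DOUBLE_GAP` are illustrative values only.

```python
# Hedged sketch of the operation determination unit 343: classify the finger
# operation (tap, double-tap, long press) from finger-contact intervals
# derived from the frames in which the finger appears. Thresholds assumed.

TAP_MAX = 0.3      # contact shorter than this counts as a tap (assumed)
DOUBLE_GAP = 0.4   # two taps closer than this form a double-tap (assumed)

def classify_operation(contacts):
    """contacts: list of (start_time, end_time) finger-contact intervals."""
    if not contacts:
        return "none"
    first = contacts[0]
    duration = first[1] - first[0]
    if duration >= TAP_MAX:
        return "long_press"
    if len(contacts) >= 2 and contacts[1][0] - first[1] < DOUBLE_GAP:
        return "double_tap"
    return "tap"

print(classify_operation([(0.0, 0.1)]))              # a single short contact
print(classify_operation([(0.0, 0.1), (0.3, 0.4)]))  # two close contacts
print(classify_operation([(0.0, 0.8)]))              # one long contact
```

This timing-based view also explains why the text asks the user to operate "slowly and with slightly exaggerated motions": larger, slower gestures make the contact intervals easier to recover from eye-reflection images.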
- when the imaging device 60B captures images of the original screen M and the finger reflected in the user's eyes, the operation position specifying unit 349E specifies, based on the series of image data extracted by the image data extraction unit 342, which position within the original screen M is the target of the operation with the finger. Specifically, in the seventeenth embodiment, the operation position specifying unit 349E first specifies the original screen M and the finger included in the image, based on the image data, using a general image recognition technique. Next, by checking where the finger is positioned within the specified range of the original screen M, it creates data representing the position within the original screen M targeted by the finger operation. The created data is stored in the storage unit 33.
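The "check where the finger is within the specified range" step can be sketched as a coordinate normalization, assuming the screen region and fingertip have already been located in the eye image by the recognition step; the function and its arguments are illustrative names.

```python
# Sketch of the operation position specifying unit 349E: given the bounding
# box of the original screen M as it appears in the eye image and the
# fingertip position in the same image, compute where within the screen
# the finger points, as normalized (0..1) coordinates.

def position_in_screen(screen_box, fingertip):
    """screen_box: (x, y, w, h) of the screen region in the captured image.
    fingertip: (px, py) pixel position of the fingertip.
    Returns normalized (u, v) within the screen, or None if outside."""
    x, y, w, h = screen_box
    px, py = fingertip
    if not (x <= px <= x + w and y <= py <= y + h):
        return None  # the finger lies outside the reflected screen
    return ((px - x) / w, (py - y) / h)

print(position_in_screen((100, 50, 200, 100), (200, 100)))  # centre of box
print(position_in_screen((100, 50, 200, 100), (10, 10)))    # outside box
```

Normalized coordinates make the result independent of how large the screen reflection happens to be in any given frame.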
- when the user performs an operation on the visible screen S with a finger, the input control unit 345E recognizes the content of the input instruction corresponding to that operation, based on the data on the content of the finger operation determined by the operation determination unit 343, the data representing the position targeted by the finger within the original screen M obtained by the operation position specifying unit 349E, and the data related to the original screen M stored in the storage unit 33. The input control unit 345E then controls the screen displayed on the display unit 31 and the original screen M displayed on the display device 20 according to the content of the recognized input instruction.
- the input control unit 345E recognizes an instruction corresponding to the touch operation in the same manner as when the visible screen S is displayed on the touch panel.
- for example, when the visible screen S is the character input screen 200 shown in FIG. 15, assume that the user has performed an operation of tapping a desired character key image on the character input screen 200 with a finger.
- the input control unit 345E recognizes that the current operation is a tap operation based on the data regarding the content of the operation with the finger obtained by the operation determination unit 343.
- the input control unit 345E can recognize the content of the character input screen 200 (for example, the configuration of the keyboard image 210 in the character input screen 200, the arrangement of the character key images, and so on) from the data related to the original screen M stored in the storage unit 33. Therefore, based on the data representing the position targeted by the finger within the original screen M obtained by the operation position specifying unit 349E, the input control unit 345E can specify the operated character key by checking which position on the character input screen 200 that data corresponds to. Since the user has performed a tap operation and the target of the operation is the specified character key, the input control unit 345E can recognize that an instruction to input the character represented by that character key has been given by this operation. As a result, the input control unit 345E displays on the display device 20 the original screen M in which the character represented by the character key is input in the display area 220 of the character input screen 200.
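The key-resolution step described above amounts to a point-in-rectangle lookup against the stored keyboard layout. The sketch below assumes a hypothetical layout; the key labels and rectangles are illustrative, not taken from the patent.

```python
# Illustrative sketch of how the input control unit 345E might resolve a tap
# position to a character key, using a key layout stored for the character
# input screen 200. Layout and key names are hypothetical.

KEY_LAYOUT = {
    # key label -> (x, y, w, h) in original-screen pixel coordinates
    "A": (0, 0, 50, 50),
    "B": (50, 0, 50, 50),
    "Enter": (100, 0, 80, 50),
}

def key_at(position):
    """Return the key whose rectangle contains the tapped position."""
    px, py = position
    for label, (x, y, w, h) in KEY_LAYOUT.items():
        if x <= px < x + w and y <= py < y + h:
            return label
    return None

display_area = []  # stands in for the display area 220

def handle_tap(position):
    key = key_at(position)
    if key is not None:
        display_area.append(key)  # echo the input on the original screen M
    return key

print(handle_tap((60, 10)))
print(display_area)
```

Appending the resolved key to `display_area` mirrors the final step in the text, where the entered character appears in the display area 220 of the redrawn original screen M.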
- similarly, when the user performs a double-tap operation on the visible screen S, the input control unit 345E recognizes an instruction to enlarge or reduce the original screen M corresponding to the visible screen S. The input control unit 345E likewise recognizes, from the corresponding operations, an instruction to display an option menu screen as the original screen M and an instruction to scroll the display of the original screen M.
- in this way, the user can input an instruction corresponding to an operation simply by performing, on the visible screen S, the same operation as he or she would perform on a screen displayed on a normal touch panel.
- since the user performs touch operations with a finger on the visible screen S that appears to be floating in the air, the user can also perform touch operations in ways that are impossible on a normal touch panel. Normally, a user performs a touch operation with one finger from the front side of the visible screen S; however, the user can also perform a touch operation with one finger from the back side of the visible screen S. Likewise, the user can perform a touch operation with a plurality of fingers from the front side of the visible screen S, or with a plurality of fingers from the back side.
- FIG. 47 is a flow chart for explaining the character input processing procedure using the visible screen S in the terminal device 9 of the seventeenth embodiment.
- it is assumed that the display device control program is being executed in the terminal 30E; that is, the display device 20, the communication unit 40, and the imaging device 60B are powered on, and communication among the terminal 30E, the display device 20, and the imaging device 60B is enabled.
- the user operates the terminal 30E to display the menu screen on the display unit 31. Then, the user taps the icon of the character input processing program on the menu screen to select the character input processing program.
- then, the control unit 34E of the terminal 30E reads out the character input processing program from the storage unit 33 and performs character input processing using the visible screen S according to the processing flow shown in FIG. 47. This character input processing may be executed automatically when the character input screen 200 serving as the original screen M is displayed on the display device 20.
- the control unit 34E displays the character input screen 200 as the original screen M on the display device 20, and also controls the imaging device 60B to start the imaging operation of imaging the user's eyes (S221).
- the user performs a predetermined operation, for example, a tap operation, with a finger on the keyboard image 210 of the character input screen 200 which is the visible screen S corresponding to the original screen M currently displayed on the display device 20 .
- the user performs a predetermined operation in order to notify the control unit 34E of the position where the user is operating.
- the original screen M or the original screen M and the finger performing the operation are reflected in the user's eyes.
- the image of the user's eyes is captured by the imaging device 60B, and the image data obtained by the imaging device 60B is sent to the image processing section 62.
- the image processing unit 62 performs predetermined image processing on the image data, and the image data subjected to the image processing is sent from the imaging device 60B to the control unit 34E by wireless communication (S222).
- the image data extraction unit 342 determines, using a general image recognition technique, whether or not a finger is present in each piece of the sent image data, and extracts the image data in which the finger is present from the series of image data obtained by imaging with the imaging device 60B (S223). That is, the image data extraction unit 342 extracts only the image data representing the content of the user's finger operation.
- next, the operation determination unit 343 determines whether or not the operation with the finger is the predetermined operation (here, a tap operation) (S224). This determination is made within a predetermined time. If the operation with the finger is the tap operation, the operation determination unit 343 determines that the operation for character input has been recognized normally.
- when the operation determination unit 343 determines that the character input operation has been recognized normally, it stores data on the content of the finger operation in the storage unit 33 and sends a signal to that effect to the display control unit 341.
- when the display control unit 341 receives the signal, it adds to the original screen M an image showing a green lamp, which means that the character input operation has been recognized normally, and displays the result on the display device 20 (S226).
- the display control unit 341 may add to the original screen M, together with the image showing the green lamp or in place of that image, an image showing a character or figure indicating that the character input operation has been recognized normally. Alternatively, the control unit 34E may emit a specific notification sound from the speaker unit 80 together with, or in place of, the display of an image indicating that the character input operation has been recognized normally.
- on the other hand, when the operation determination unit 343 determines in step S224 that the character input operation was not recognized normally within the predetermined time, it sends a signal to that effect to the display control unit 341. For example, if no image data in which the finger is present is sent from the image data extraction unit 342 within the predetermined time, the operation determination unit 343 determines that the tap operation was not recognized normally. When the display control unit 341 receives the signal, it adds to the original screen M an image showing a red lamp, which means that the character input operation was not recognized normally, and displays the result on the display device 20 (S225). After that, the process proceeds to step S230.
- the display control unit 341 may add to the original screen M, together with the image showing the red lamp or in place of that image, an image showing a character or figure indicating that the character input operation was not recognized normally.
- alternatively, the control unit 34E may emit a specific notification sound from the speaker unit 80 together with, or in place of, the display of an image indicating that the character input operation was not recognized normally.
- next, based on the image data for which the operation determination unit 343 has determined that the operation with the finger is the tap operation, the operation position specifying unit 349E specifies which position within the original screen M is the target of the finger operation, and generates data representing the position targeted by the finger within the original screen M (S227).
- the data representing the position to be operated thus generated is stored in the storage unit 33 .
- the input control unit 345E then recognizes the content of the input instruction corresponding to the finger operation, based on the data on the content of the finger operation determined by the operation determination unit 343, the data representing the position targeted by the finger within the original screen M obtained by the operation position specifying unit 349E, and the data on the original screen M stored in the storage unit 33 (S228). For example, when the user taps a character key image in the keyboard image 210 with a finger, the input control unit 345E checks which character key image area in the keyboard image 210 the finger position obtained from the data representing the target position corresponds to, thereby specifies the character key on which the tap operation was performed, and recognizes the input of the specified character key.
- the input control unit 345E sends a signal regarding the content of the recognized input instruction to the display control unit 341, and the display control unit 341 displays the original screen M according to the content of the input instruction on the display device 20 (S229). .
- after step S229 or step S225, the control unit 34E determines whether or not an instruction to end character input using the visible screen S has been received from the user (S230). If an instruction to end character input has been received, the character input process using the visible screen S ends. Otherwise, the process returns to step S221 and the character input process using the visible screen S continues. The user gives the instruction to end character input by, for example, operating the terminal 30E, giving a voice instruction, or performing a touch operation on the touch pad unit 90.
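The character input flow just described (steps S221 through S230) can be sketched as a loop over pluggable helpers. Each helper stands in for one of the units described above (342, 343, 349E, 345E); their names and the one-pass driver below are assumptions for illustration.

```python
# Hedged sketch of the character input loop (S221-S230). Each helper stands
# in for a unit of the terminal 30E; their implementations are assumed.

def character_input_loop(capture, extract, determine, locate, apply, should_end):
    while True:
        frames = capture()               # S221-S222: image the user's eyes
        finger_frames = extract(frames)  # S223: keep frames with a finger
        op = determine(finger_frames)    # S224: was the tap recognized?
        if op != "tap":
            notify = "red_lamp"          # S225: operation not recognized
        else:
            notify = "green_lamp"        # S226: operation recognized
            pos = locate(finger_frames)  # S227: position within screen M
            apply(op, pos)               # S228-S229: input the character
        if should_end():                 # S230: end instruction received?
            return notify

# One illustrative pass through the loop with trivial stand-in helpers:
log = []
result = character_input_loop(
    capture=lambda: ["frame"],
    extract=lambda fs: fs,
    determine=lambda fs: "tap",
    locate=lambda fs: (0.5, 0.5),
    apply=lambda op, pos: log.append((op, pos)),
    should_end=lambda: True,
)
print(result, log)
```

The green/red lamp return value mirrors the feedback images the display control unit 341 adds to the original screen M.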
- FIG. 48 is a flowchart for explaining the procedure of screen display processing using the visible screen S in the terminal device 9 of the seventeenth embodiment.
- it is assumed that the display device control program is being executed in the terminal 30E; that is, the display device 20, the communication unit 40, and the imaging device 60B are powered on, and communication among the terminal 30E, the display device 20, and the imaging device 60B is enabled.
- the user operates the terminal 30E to display the menu screen on the display unit 31. Then, the user taps the icon of the screen display processing program on the menu screen to select the screen display processing program.
- then, the control unit 34E of the terminal 30E reads out the screen display processing program from the storage unit 33 and performs screen display processing using the visible screen S according to the processing flow shown in FIG. 48. This screen display processing may be executed automatically when the original screen M is displayed on the display device 20.
- the control unit 34E controls the imaging device 60B to start an imaging operation of imaging the user's eyes (S241).
- the user performs a desired operation with a finger on the visible screen S corresponding to the original screen M currently displayed on the display device 20 .
- the original screen M or the original screen M and the finger performing the operation are reflected in the user's eyes.
- the image of the user's eyes is captured by the imaging device 60B, and the image data obtained by the imaging device 60B is sent to the image processing section 62.
- the image processing unit 62 performs predetermined image processing on the image data, and the image data subjected to the image processing is sent from the imaging device 60B to the control unit 34E by wireless communication (S242).
- the image data extraction unit 342 determines, using a general image recognition technique, whether or not a finger is present in each piece of the sent image data, and extracts the image data in which the finger is present from the series of image data obtained by imaging with the imaging device 60B (S243). That is, the image data extraction unit 342 extracts only the image data representing the content of the user's finger operation.
- the operation determination unit 343 determines the details of the operation performed by the finger. This determination is made within a predetermined time. Then, the operation determination unit 343 determines whether or not the operation with the finger is normally recognized (S244).
- when the operation determination unit 343 determines that the operation with the finger has been recognized normally, it stores data on the content of the finger operation in the storage unit 33 and sends a signal to that effect to the display control unit 341.
- when the display control unit 341 receives the signal, it adds to the original screen M an image showing a green lamp, which means that the finger operation has been recognized normally, and displays the result on the display device 20 (S246).
- the display control unit 341 may add to the original screen M, together with the image showing the green lamp or in place of that image, an image showing a character or figure indicating that the operation with the finger has been recognized normally.
- the control unit 34E may emit a specific notification sound from the speaker unit 80 along with the display of the image indicating that the finger operation has been properly recognized, or instead of displaying the image.
- on the other hand, when the operation determination unit 343 determines in step S244 that the finger operation was not recognized normally, it sends a signal to that effect to the display control unit 341. For example, if no image data in which the finger is present is sent from the image data extraction unit 342 within the predetermined time, the operation determination unit 343 determines that the finger operation was not recognized normally. Upon receiving the signal, the display control unit 341 adds to the original screen M an image showing a red lamp indicating that the finger operation was not recognized normally, and displays the result on the display device 20 (S245). After that, the process proceeds to step S250.
- the display control unit 341 may add to the original screen M, together with the image showing the red lamp or in place of that image, an image showing a character or figure indicating that the finger operation was not recognized normally.
- the control unit 34E may emit a specific notification sound from the speaker unit 80 along with the display of an image indicating that the finger operation has not been properly recognized, or instead of displaying the image. .
- next, based on the image data for which the operation determination unit 343 has determined the content of the finger operation, the operation position specifying unit 349E specifies which position within the original screen M is the target of the finger operation, and generates data representing the position targeted by the finger within the original screen M (S247).
- the data representing the position to be operated thus generated is stored in the storage unit 33 .
- the input control unit 345E then recognizes the content of the instruction corresponding to the finger operation, based on the data on the content of the finger operation determined by the operation determination unit 343, the data representing the position targeted by the finger within the original screen M obtained by the operation position specifying unit 349E, and the data on the original screen M stored in the storage unit 33 (S248). For example, when the user performs a double-tap operation with a finger on the visible screen S, the input control unit 345E identifies that the current operation is a double-tap operation and recognizes that enlargement (or reduction) of the original screen M has been instructed. After that, the input control unit 345E sends a signal regarding the content of the recognized instruction to the display control unit 341, and the display control unit 341 displays the original screen M on the display device 20 according to the content of the instruction (S249).
- After step S249 or step S245, the control unit 34E determines whether or not an instruction to end the operation for displaying the screen has been received from the user (S250). If such an instruction has been received, the screen display processing ends. On the other hand, if it has not been received, the process returns to step S241 and the screen display processing continues.
- The user gives an instruction to end the operation for displaying the screen using the visible screen S by, for example, operating the terminal 30E, giving a voice instruction, or performing a touch operation on the touch pad unit 90.
- The control unit 34E performs character input processing when the user instructs character input, and performs screen display processing when the user instructs an operation for screen display.
- control unit 34E may automatically switch between character input processing and screen display processing.
- the terminal device of the seventeenth embodiment has the same functions and effects as those of the first embodiment. That is, in the terminal device of the seventeenth embodiment, since the terminal and the glasses as wearables are configured separately, existing mobile terminals such as smartphones and tablet terminals can be used as the terminal. By using an existing mobile terminal or the like as a terminal in this manner, the number of parts of the spectacles used as a wearable object can be reduced, and the structure of the spectacles can be simplified. Further, by using a commercially available smart phone or the like as a terminal, it is possible to operate using a familiar smart phone or the like, so that operability can be improved.
- The operation determination unit determines, based on a series of image data obtained when the imaging device images the original screen and the finger reflected in the user's eyes, what kind of operation among the various operations the finger operation is, and the operation position specifying unit specifies, based on that series of image data, which position within the original screen is the target of the finger operation.
- The input control unit recognizes the content of the input instruction corresponding to the operation performed by the finger on the visible screen, based on the data regarding the content of the finger operation obtained by the operation determination unit, the data obtained by the operation position specifying unit representing the position targeted by the finger operation within the original screen, and the data related to the original screen stored in the storage unit, and controls the original screen displayed on the display device according to the content of the recognized input instruction. For this reason, the user can input an instruction corresponding to an operation by performing that operation on the visible screen, which appears to be floating in the air, in the same manner as operating a screen displayed on an ordinary touch panel. Therefore, by using the terminal device of the seventeenth embodiment, the user can perform various operations such as character input and enlargement/reduction by performing operations on the visible screen.
- the display device and imaging device are wirelessly connected to the terminal.
- Although the terminal device described above includes the microphone unit, the speaker unit, and the touch pad unit, it does not have to be equipped with some or all of these units.
- Although the imaging device described above is configured to be focused on the user's eyes in advance, an imaging device having an autofocus function may be used instead.
- the camera control unit of the imaging device has an autofocus control unit that automatically focuses on the user's eyes.
- This autofocus control section generally controls the camera section so as to automatically focus on a subject at a predetermined position within the imaging range. For example, as the camera section of the imaging device, one having one focus point in the center of the imaging range is used. Then, the imaging device is attached to the spectacles so that the user's eyes are positioned on the optical axis of the lens of the camera unit.
- When imaging is started, the autofocus control unit focuses so that the subject on the focus point, that is, the user's eyes, is in focus. The imaging device can therefore acquire images with the user's eyes in focus. The autofocus control unit may also recognize the eyes of the subject and automatically focus on the recognized eyes.
- In the above, a case has been described in which the operation position specifying unit specifies the original screen and the finger included in the image based on the image data using a general image recognition technique, and then creates data representing the position targeted by the finger operation within the original screen by checking where the finger is positioned within the range of the screen.
- As the operation position specifying unit, one that performs the following processing can also be used. That is, when the imaging device images the original screen and the finger reflected in the user's eyes, the operation position specifying unit obtains, based on the series of image data obtained by the imaging, the range of the original screen within the imaging range and the position of the finger within the imaging range, and specifies the position on the original screen targeted by the finger operation based on the obtained range and position.
- an XY coordinate system is set with the horizontal direction as the X-axis direction and the vertical direction as the Y-axis direction within the imaging range of the imaging device.
- the origin of this XY coordinate system is set at an arbitrary position.
- the operation position specifying unit first recognizes the image of the original screen based on the image data, and acquires the position data for each position of the four corners of the image of the original screen in the XY coordinate system.
- the acquired position data for each position of the four corners becomes data representing the range of the original screen within the imaging range.
- FIG. 44(b) shows this XY coordinate system, set within the imaging range of the imaging device with the horizontal direction as the X-axis direction and the vertical direction as the Y-axis direction.
- FIG. 49 shows examples of the original screen that make it easier for the operation position specifying unit to specify the range of the original screen.
- On the original screen, large circular marks or the like may be displayed in advance at the positions of the four corners as shown in FIG. 49(a), or in another form as shown in FIG. 49(b). This makes it easier for the operation position specifying unit to recognize the image of the original screen.
- the operation position identifying unit recognizes the user's finger based on the image data and acquires position data of the user's finger in the XY coordinate system.
- The operation position specifying unit then calculates the position data of the user's finger in an xy coordinate system whose origin is, for example, the lower left point of the original screen, based on the position data of the four corner positions in the XY coordinate system and the position data of the user's finger in the XY coordinate system.
- the position data of the user's finger in the xy coordinate system calculated in this manner becomes data specifying the position of the target of the finger operation in the original screen.
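The conversion described above, from the camera's XY coordinate system to an xy coordinate system whose origin is the lower left point of the original screen, can be sketched as follows. The helper name and the simplifying assumption that the screen image is roughly axis-aligned are illustrative; a real implementation handling a tilted reflection would use a perspective (homography) transform instead.

```python
# Hedged sketch of the XY -> xy conversion: given the corner positions
# of the original screen's image in the camera's XY coordinate system
# and the finger's XY position, return the finger's normalized position
# (0..1 in each axis) measured from the screen's lower-left corner.

def finger_position_on_screen(corners, finger):
    """corners: dict with 'lower_left', 'lower_right', 'upper_left' (X, Y) pairs."""
    ox, oy = corners["lower_left"]              # xy origin in XY coordinates
    width = corners["lower_right"][0] - ox      # screen width in the image
    height = corners["upper_left"][1] - oy      # screen height in the image
    fx, fy = finger
    return ((fx - ox) / width, (fy - oy) / height)

corners = {"lower_left": (100, 50), "lower_right": (300, 50), "upper_left": (100, 200)}
print(finger_position_on_screen(corners, (200, 125)))  # (0.5, 0.5)
```

The normalized result can then be scaled by the known pixel dimensions of the original screen M to obtain the data specifying the operated position within the screen.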
- FIG. 50 is a schematic block diagram of a terminal device according to the eighteenth embodiment of the present invention.
- parts having the same functions as those in the seventeenth embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- The terminal device 9A of the eighteenth embodiment includes spectacles 10, a display device 20 provided on the spectacles 10, a terminal 30F that has a display unit 31 and is configured separately from the spectacles 10, a communication unit 40, an imaging device 60B for imaging the user's eyes, a microphone unit 70, a speaker unit 80, and a touch pad unit 90.
- the terminal 30F includes a display unit 31, a communication unit 32, a storage unit 33, and a control unit 34F.
- The control unit 34F includes a display control unit 341, an iris/pupil image data generation unit 351, an image difference extraction unit 352, an image data extraction unit 342, an operation determination unit 343, an input control unit 346E, and an operation position specifying unit 349E.
- the main difference between the terminal device 9A of the eighteenth embodiment and the terminal device 9 of the seventeenth embodiment is that the control unit 34F of the terminal 30F includes an iris/pupil image data generation unit 351 and an image difference extraction unit 352.
- Other configurations of the terminal device 9A of the eighteenth embodiment are the same as those of the terminal device 9 of the seventeenth embodiment.
- When an image of the user's eye is captured by the imaging device 60B before the original screen M is displayed on the display device 20, the iris/pupil image data generation unit 351 generates image data of the iris and pupil based on the image data obtained by the imaging and stores it in the storage unit 33.
- It is desirable that the timing at which the imaging device 60B captures this image data be immediately before the original screen M is displayed on the display device 20.
- This is because the state of the user's iris and pupil included in the image data of the iris and pupil generated by the iris/pupil image data generation unit 351 is then substantially the same as their state when the user operates the visible screen S.
- When the imaging device 60B images the original screen M and the finger reflected in the user's eyes, the image difference extraction unit 352 performs a process of extracting the difference between the image data obtained by the imaging and the image data of the iris and pupil stored in the storage unit 33, thereby generating image data from which the images of the iris and pupil have been removed. That is, the image data resulting from the difference extraction is image data of only the original screen M and the finger, with the images of the user's iris and pupil, which are unnecessary for image recognition, removed.
- In the eighteenth embodiment, the image data extraction unit 342 performs image data extraction processing using the series of image data generated by the image difference extraction unit 352.
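The difference extraction performed by the image difference extraction unit 352 can be sketched at the pixel level roughly as follows. Frames are modeled here as flat grayscale lists, and the function name and threshold value are illustrative assumptions rather than part of the embodiment.

```python
# Hedged sketch of the image difference extraction: an eye-only
# reference frame (iris/pupil, captured before the original screen is
# shown) is compared with a later frame so that only the reflected
# screen and finger remain; matching background pixels are zeroed.

def remove_eye_background(frame, reference, threshold=10):
    """Zero out pixels that match the stored iris/pupil reference frame."""
    return [p if abs(p - r) > threshold else 0
            for p, r in zip(frame, reference)]

reference = [50, 52, 48, 51]    # iris/pupil image data held in storage
frame     = [50, 200, 47, 180]  # later frame: reflection of screen M and finger
print(remove_eye_background(frame, reference))  # [0, 200, 0, 180]
```

Real code would of course operate on two-dimensional camera images (for example with an image library's absolute-difference operation), but the principle of subtracting the stored eye-only frame is the same.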
- FIG. 51 is a flow chart for explaining the character input processing procedure using the visible screen S in the terminal device 9A of the eighteenth embodiment.
- the same step numbers are assigned to the same processes as those of the flowchart of FIG. 47 in the seventeenth embodiment described above, and detailed description thereof will be omitted.
- the display device control program is being executed on the terminal 30F. That is, the display device 20, the communication unit 40, and the imaging device 60B are in the power-on state, and the communication between the terminal 30F, the display device 20, and the imaging device 60B is valid.
- the user operates the terminal 30F to display the menu screen on the display unit 31. Then, the user taps the icon of the character input processing program on the menu screen to select the character input processing program.
- The control unit 34F of the terminal 30F then reads the character input processing program from the storage unit 33 and performs character input processing using the visible screen S according to the processing flow shown in FIG. 51.
- the control unit 34F causes, for example, the speaker unit 80 to generate a voice saying "Please look in the direction in which the original screen is displayed for a few seconds.” This allows the user to look in the direction in which the original screen is displayed according to the voice instruction.
- the control unit 34F temporarily puts the display device 20 into a state in which nothing is displayed, and controls the imaging device 60B to capture an image of the user's eyes with the imaging device 60B.
- the iris/pupil image data generation unit 351 generates image data of the iris and pupil based on the image data obtained by imaging the user's eyes with the imaging device 60B (S2201).
- The generated image data of the iris and pupil is stored in the storage unit 33.
- the display control unit 341 displays the character input screen 200 as the original screen M on the display device 20 (S2202). Thereby, the user can see the visible screen S corresponding to the original screen M as if it were floating in the air.
- the control unit 34F starts an imaging operation for imaging the user's eyes by controlling the imaging device 60B (S221). The user performs a predetermined operation, for example, a tap operation, with a finger on the keyboard image 210 of the character input screen 200 that is the visible screen S.
- The original screen M, or the original screen M and the finger performing the operation, is reflected in the user's eyes. The image of the user's eyes is then captured by the imaging device 60B, and the image data obtained by the imaging is sent to the image processing unit 62. The image processing unit 62 performs predetermined image processing on the image data, and the processed image data is sent from the imaging device 60B to the control unit 34F by wireless communication (S222).
- the image difference extraction unit 352 extracts the difference between the image data obtained by imaging with the imaging device 60B and the image data of the iris and pupil stored in the storage unit 33. (S2203). This provides image data from which the iris and pupil images have been removed.
- The image data extraction unit 342 uses a general image recognition technique to determine whether or not a finger is present in the image data created by the image difference extraction unit 352, and extracts the image data in which a finger is present from the series of image data obtained by imaging with the imaging device 60B (S223).
- the image data includes an image of the finger and the original screen M reflected in the user's eyes.
- the process proceeds to step S224.
- the processing after step S224 is the same as the processing according to the flowchart of FIG. 47 in the seventeenth embodiment.
- FIG. 52 is a flowchart for explaining the procedure of screen display processing using the visible screen S in the terminal device 9A of the eighteenth embodiment.
- the same step numbers are assigned to the same steps as those of the flowchart of FIG. 48 in the seventeenth embodiment, and detailed description thereof will be omitted.
- the display device control program is being executed on the terminal 30F. That is, the display device 20, the communication unit 40, and the imaging device 60B are in the power-on state, and the communication between the terminal 30F, the display device 20, and the imaging device 60B is valid.
- the user operates the terminal 30F to display the menu screen on the display unit 31. Then, the user taps the icon of the screen display processing program on the menu screen to select the screen display processing program.
- The control unit 34F of the terminal 30F then reads the screen display processing program from the storage unit 33 and performs screen display processing using the visible screen S according to the processing flow shown in FIG. 52.
- The control unit 34F causes the speaker unit 80 to output, for example, a voice saying "Please look in the direction in which the original screen is displayed for a few seconds." This allows the user to look in that direction according to the voice instruction.
- the control unit 34F temporarily puts the display device 20 into a state in which nothing is displayed, and controls the imaging device 60B to capture an image of the user's eyes with the imaging device 60B.
- the iris/pupil image data generation unit 351 generates image data of the iris and pupil based on the image data obtained by imaging the user's eyes with the imaging device 60B (S2401).
- The generated image data of the iris and pupil is stored in the storage unit 33.
- the user operates the terminal 30F to display a desired screen on the display device 20.
- The display control unit 341 then displays that screen as the original screen M on the display device 20 (S2402), and the user can see the visible screen S corresponding to the original screen M as if it were floating in the air.
- the control unit 34F starts an imaging operation for imaging the user's eyes by controlling the imaging device 60B (S241).
- a user performs a desired operation on the visible screen S with a finger.
- the original screen M or the original screen M and the finger performing the operation are reflected in the user's eyes.
- the image of the user's eyes is captured by the imaging device 60B, and the image data obtained by the imaging device 60B is sent to the image processing section 62.
- the image processing unit 62 performs predetermined image processing on the image data, and the image data subjected to the image processing is sent from the imaging device 60B to the control unit 34F by wireless communication (S242).
- the image difference extraction unit 352 extracts the difference between the image data obtained by imaging with the imaging device 60B and the image data of the iris and pupil stored in the storage unit 33. (S2403). This provides image data from which the iris and pupil images have been removed.
- The image data extraction unit 342 uses a general image recognition technique to determine whether or not a finger is present in the image data created by the image difference extraction unit 352, and extracts the image data in which a finger is present from the series of image data obtained by imaging with the imaging device 60B (S243).
- the image data includes an image of the finger and the original screen M reflected in the user's eyes.
- the process proceeds to step S244.
- the processing after step S244 is the same as the processing according to the flowchart of FIG. 48 in the seventeenth embodiment.
- the terminal device of the eighteenth embodiment has the same actions and effects as those of the seventeenth embodiment.
- Further, the terminal device of the eighteenth embodiment includes an iris/pupil image data generation unit that, when the imaging device captures an image of the user's eyes before the original screen is displayed on the display device, generates image data of the iris and pupil based on the image data obtained by the imaging and stores it in a storage unit, and an image difference extraction unit that, when the imaging device images the original screen and the finger reflected in the user's eyes, generates image data from which the iris and pupil images are removed by performing processing for extracting the difference between the image data obtained by the imaging and the image data of the iris and pupil stored in the storage unit. The image data extraction unit performs image data extraction processing using the series of image data generated by the image difference extraction unit. Since the images of the user's iris and pupil, which are unnecessary for image recognition, are removed from the image data used by the image data extraction unit, image recognition of the original screen and the finger can be performed easily.
- When the user wears a contact lens, the image difference extraction unit may generate image data from which the contact lens image is removed in addition to the iris and pupil images. That is, the iris/pupil image data generation unit may, when an image of the eye of the user wearing the contact lens is captured by the imaging device before the original screen is displayed on the display device, generate image data of the contact lens, iris, and pupil based on the image data obtained by the imaging and store it in the storage unit, and the image difference extraction unit may, when the imaging device images the original screen and the finger reflected in the user's eyes, perform processing for extracting the difference between the image data obtained by the imaging and the image data of the contact lens, iris, and pupil stored in the storage unit, thereby generating image data from which the images of the contact lens, iris, and pupil are removed. As a result, the images of the contact lens, iris, and pupil are removed from the image data generated by the image difference extraction unit, so image recognition of the original screen and the finger can be performed easily.
- the display device and imaging device are connected wirelessly to the terminal.
- FIG. 53 is a schematic block diagram of a terminal device according to the nineteenth embodiment of the present invention.
- the same reference numerals are given to the parts having the same functions as those of the 17th embodiment, and the detailed explanation thereof will be omitted.
- The terminal device 9B of the nineteenth embodiment includes spectacles 10, a display device 20 provided on the spectacles 10, a terminal 30G that has a display unit 31 and is configured separately from the spectacles 10, a communication unit 40, an imaging device 60B for imaging the user's eyes, a microphone unit 70, a speaker unit 80, and a touch pad unit 90.
- The terminal 30G includes a display unit 31, a communication unit 32, a storage unit 33, and a control unit 34G. The control unit 34G includes a display control unit 341, an image conversion unit 353, an image data extraction unit 342, an operation determination unit 343, an input control unit 346E, and an operation position specifying unit 349E.
- the main difference between the terminal device 9B of the nineteenth embodiment and the terminal device 9 of the seventeenth embodiment is that the control section 34G of the terminal 30G includes an image conversion section 353.
- Other configurations of the terminal device 9B of the nineteenth embodiment are the same as those of the terminal device 9 of the seventeenth embodiment.
- The image conversion unit 353 performs image conversion on the image data obtained by imaging with the imaging device 60B so that the image of the original screen M and the finger reflected on the user's eyes, which originally has a spherical shape, becomes an image projected onto a plane. For example, this image conversion is performed using an image conversion formula created in advance from the curvature of the eye (eyeball surface) or the like. In the nineteenth embodiment, the image data extraction unit 342 performs the predetermined processing using the series of image data after image conversion by the image conversion unit 353.
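The embodiment only states that the conversion formula is prepared in advance from the curvature of the eye; as one hedged illustration, a simple radial arc-length correction with an assumed eyeball radius R could look like the following. The function name, the radius value, and the specific formula are assumptions for illustration, not the patent's formula.

```python
import math

# Illustrative sketch of the image conversion in the nineteenth
# embodiment: a point of the reflection on the (spherical) eye surface
# is remapped as if seen on a plane, using a simple radial model
# around the eye center with an assumed eyeball radius R.

def flatten_point(x, y, cx, cy, R=12.0):
    """Remap one pixel (x, y) about the eye center (cx, cy)."""
    dx, dy = x - cx, y - cy
    r = math.hypot(dx, dy)
    if r == 0 or r >= R:
        return (x, y)  # center point, or outside the model's valid range
    scale = R * math.asin(r / R) / r  # expands radii toward the edge
    return (cx + dx * scale, cy + dy * scale)

# A point near the edge of the reflection moves outward after correction.
print(flatten_point(10.0, 5.0, 5.0, 5.0))
```

Applying such a remapping to every pixel corrects the radial compression of the reflected image, which is what lets the later distance and position calculations treat the original screen M as flat.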
- FIG. 54 is a flowchart for explaining the character input processing procedure using the visible screen S in the terminal device 9B of the nineteenth embodiment.
- the same step numbers are assigned to the same processes as those of the flowchart of FIG. 47 in the seventeenth embodiment, and detailed description thereof will be omitted.
- the display device control program is being executed on the terminal 30G. That is, the display device 20, the communication unit 40, and the imaging device 60B are in a power-on state, and communication between the terminal 30G, the display device 20, and the imaging device 60B is valid.
- the user operates the terminal 30G to display the menu screen on the display unit 31. Then, the user taps the icon of the character input processing program on the menu screen to select the character input processing program.
- The control unit 34G of the terminal 30G then reads the character input processing program from the storage unit 33 and performs character input processing using the visible screen S according to the processing flow shown in FIG. 54. This character input processing may be executed automatically when the character input screen 200 as the original screen M is displayed on the display device 20.
- the control unit 34G displays the character input screen 200 as the original screen M on the display device 20, and controls the imaging device 60B to start the imaging operation of imaging the user's eyes.
- the user performs a predetermined operation, for example, a tap operation, with a finger on the keyboard image 210 of the character input screen 200 which is the visible screen S corresponding to the original screen M currently displayed on the display device 20 .
- the original screen M or the original screen M and the finger performing the operation are reflected in the user's eyes.
- the image of the user's eyes is captured by the imaging device 60B, and the image data obtained by the imaging device 60B is sent to the image processing section 62.
- the image processing unit 62 performs predetermined image processing on the image data, and the image data subjected to the image processing is sent from the imaging device 60B to the control unit 34G by wireless communication (S222).
- Next, the image conversion unit 353 performs image conversion on the image data obtained by imaging with the imaging device 60B so that the image of the original screen M and the finger reflected in the user's eyes, which is originally spherical, becomes an image projected onto a plane (S2204). The image data after this image conversion is sent to the image data extraction unit 342. The image data extraction unit 342 then uses a general image recognition technique to determine whether or not a finger is present in the sent image data, and extracts the image data in which a finger is present from the series of image data obtained by imaging with the imaging device 60B (S223). After that, the process proceeds to step S224. The processing after step S224 is the same as the processing according to the flowchart of FIG. 47 in the seventeenth embodiment.
- FIG. 55 is a flowchart for explaining the procedure of screen display processing using the visible screen S in the terminal device 9B of the nineteenth embodiment.
- the same step numbers are assigned to the same processes as those of the flowchart of FIG. 48 in the seventeenth embodiment, and detailed description thereof will be omitted.
- the display device control program is being executed on the terminal 30G. That is, the display device 20, the communication unit 40, and the imaging device 60B are in a power-on state, and communication between the terminal 30G, the display device 20, and the imaging device 60B is valid.
- the user operates the terminal 30G to display the menu screen on the display unit 31. Then, the user taps the icon of the screen display processing program on the menu screen to select the screen display processing program.
- The control unit 34G of the terminal 30G then reads the screen display processing program from the storage unit 33 and performs screen display processing using the visible screen S according to the processing flow shown in FIG. 55. This screen display processing may be executed automatically when the original screen M is displayed on the display device 20.
- the control unit 34G controls the imaging device 60B to start an imaging operation of imaging the user's eyes (S241).
- the user performs a desired operation with a finger on the visible screen S corresponding to the original screen M currently displayed on the display device 20 .
- the original screen M or the original screen M and the finger performing the operation are reflected in the user's eyes.
- the image of the user's eyes is captured by the imaging device 60B, and the image data obtained by the imaging device 60B is sent to the image processing section 62.
- the image processing unit 62 performs predetermined image processing on the image data, and the image data subjected to the image processing is sent from the imaging device 60B to the control unit 34G by wireless communication (S242).
- Next, the image conversion unit 353 performs image conversion on the image data obtained by imaging with the imaging device 60B so that the image of the original screen M and the finger reflected in the user's eyes, which is originally spherical, becomes an image projected onto a plane (S2404). The image data after this image conversion is sent to the image data extraction unit 342. The image data extraction unit 342 then uses a general image recognition technique to determine whether or not a finger is present in the sent image data, and extracts the image data in which a finger is present from the series of image data obtained by imaging with the imaging device 60B (S243). After that, the process proceeds to step S244.
- the processing after step S244 is the same as the processing according to the flowchart of FIG. 48 in the seventeenth embodiment.
- the terminal device of the nineteenth embodiment has the same functions and effects as those of the seventeenth embodiment.
- In the terminal device of the nineteenth embodiment, the image conversion unit performs image conversion on the image data obtained by imaging so that the image of the original screen and the finger reflected in the user's eyes, which originally has a spherical shape, becomes an image projected onto a plane, and the image data extraction unit performs image data extraction processing using the series of image data converted by the image conversion unit.
- Since distortion is corrected in the series of image data extracted by the image data extraction unit, there is an advantage that the operation position specifying unit can accurately specify the position on the original screen targeted by the finger operation.
- image conversion unit in the nineteenth embodiment may be provided in the terminal of the terminal device in the eighteenth embodiment.
- FIG. 56 is a schematic block diagram of a terminal device according to the twentieth embodiment of the present invention.
- the same reference numerals are given to the parts having the same functions as those of the 17th embodiment, and the detailed explanation thereof will be omitted.
- The terminal device 9C of the twentieth embodiment includes spectacles 10, a display device 20 provided on the spectacles 10, a terminal 30H that has a display unit 31 and is configured separately from the spectacles 10, a communication unit 40, an imaging device 60B for imaging the user's eyes, a microphone unit 70, a speaker unit 80, and a touch pad unit 90.
- the terminal 30H includes a display unit 31, a communication unit 32, a storage unit 33, and a control unit 34H.
- The control unit 34H includes a display control unit 341, an eye presence/absence determination unit 354, a notification control unit 355, an image data extraction unit 342, an operation determination unit 343, an input control unit 346E, and an operation position specifying unit 349E.
- the main difference between the terminal device 9C of the 20th embodiment and the terminal device 9 of the 17th embodiment is that the control unit 34H of the terminal 30H has an eye presence/absence determination unit 354 and a notification control unit 355.
- Other configurations of the terminal device 9C of the twentieth embodiment are the same as those of the terminal device 9 of the seventeenth embodiment.
- the eye presence/absence determination unit 354 uses a general image recognition technique to determine whether an image of the user's eyes exists in the image data captured by the imaging device 60B, and detects when image data containing no image of the user's eyes has been acquired continuously by the imaging device 60B for a certain period of time. When the eye presence/absence determination unit 354 detects this, the notification control unit 355 controls the speaker unit (informing device) 80 so that sound is generated from the speaker unit 80.
- the fact that the imaging device 60B has continuously acquired image data without the user's eyes for a certain period of time may mean that the user's eyes are closed, for example because the user is asleep. Therefore, when the user is driving an automobile, for example, the notification control unit 355 recognizes from the detection result of the eye presence/absence determination unit 354 that the user is dozing off and generates an alarm sound from the speaker unit 80, thereby preventing drowsy driving.
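The dozing-detection logic described above (raise a warning only when eye-absent frames persist for a certain period) can be sketched as follows. This is a minimal illustration, not part of the embodiment: the class name, the threshold value, and the per-frame boolean detection result are assumptions.

```python
import time

class EyePresenceMonitor:
    """Tracks how long eye-absent frames persist and decides when to alarm.

    `absence_limit_s` is an assumed threshold; the embodiment only says
    "a certain period of time".
    """

    def __init__(self, absence_limit_s=3.0):
        self.absence_limit_s = absence_limit_s
        self._absent_since = None  # time when eyes first went missing

    def update(self, eyes_present, now=None):
        """Feed one frame's detection result; return True if the alarm should sound."""
        now = time.monotonic() if now is None else now
        if eyes_present:
            self._absent_since = None  # reset: eyes are visible again
            return False
        if self._absent_since is None:
            self._absent_since = now   # first eye-absent frame
        return (now - self._absent_since) >= self.absence_limit_s

monitor = EyePresenceMonitor(absence_limit_s=3.0)
assert monitor.update(False, now=0.0) is False   # eyes just disappeared
assert monitor.update(False, now=1.5) is False   # not yet 3 s
assert monitor.update(False, now=3.1) is True    # alarm: dozing suspected
assert monitor.update(True, now=3.2) is False    # eyes back, timer resets
```

Resetting the timer on any eye-present frame matches the flow in which the warning fires only after uninterrupted absence.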
- FIG. 57 is a flowchart for explaining the character input processing procedure using the visible screen S in the terminal device 9C of the twentieth embodiment.
- the same step numbers are assigned to the same steps as those of the flowchart of FIG. 47 in the seventeenth embodiment, and detailed description thereof will be omitted.
- it is assumed that the display device control program is being executed on the terminal 30H; that is, the display device 20, the communication unit 40, and the imaging device 60B are powered on, and communication between the terminal 30H, the display device 20, and the imaging device 60B is enabled.
- the user operates the terminal 30H to display the menu screen on the display unit 31. Then, the user taps the icon of the character input processing program on the menu screen to select the character input processing program.
- the control unit 34H of the terminal 30H reads the character input processing program from the storage unit 33 and performs character input processing using the visible screen S according to the processing flow shown in FIG. 57. This character input processing may be executed automatically when the character input screen 200 serving as the original screen M is displayed on the display device 20.
- the control unit 34H displays the character input screen 200 as the original screen M on the display device 20, and controls the imaging device 60B to start the imaging operation of imaging the user's eyes.
- the user performs a predetermined operation, for example, a tap operation, with a finger on the keyboard image 210 of the character input screen 200 which is the visible screen S corresponding to the original screen M currently displayed on the display device 20 .
- the original screen M or the original screen M and the finger performing the operation are reflected in the user's eyes.
- the image of the user's eyes is captured by the imaging device 60B, and the image data obtained by the imaging device 60B is sent to the image processing section 62.
- the image processing unit 62 performs predetermined image processing on the image data, and the image data subjected to the image processing is sent from the imaging device 60B to the control unit 34H by wireless communication (S222).
- the eye presence/absence determination unit 354 determines whether an eye image exists in the sent image data using a general image recognition method (S2205). When the eye presence/absence determination unit 354 determines that an image of an eye exists in the image data, it sends the image data to the image data extraction unit 342, and the process proceeds to step S223. On the other hand, when the eye presence/absence determination unit 354 determines that no image of an eye exists in the image data, it does not send the image data to the image data extraction unit 342. In this case, the eye presence/absence determination unit 354 determines whether image data in which the image of the user's eyes does not exist has been acquired continuously by the imaging device 60B for a certain period of time (S2206).
- if image data without the image of the user's eyes has not been acquired continuously for the certain period of time by the imaging device 60B, the process returns to step S2205.
- when the eye presence/absence determination unit 354 determines that image data without an image of the user's eyes has been acquired continuously for the certain period of time by the imaging device 60B, it sends a signal to that effect to the notification control unit 355.
- the notification control unit 355 recognizes that the user is dozing off, and controls the speaker unit 80 to generate a predetermined warning sound from the speaker unit 80 (S2207). After that, the process proceeds to step S223.
- in step S223, the image data extraction unit 342 uses a general image recognition method to determine whether a finger is present in the image data sent from the eye presence/absence determination unit 354 (image data in which the image of the user's eyes exists), and extracts the image data in which the finger is present from the series of image data captured by the imaging device 60B. After that, the process proceeds to step S224.
- the processing after step S224 is the same as the processing according to the flowchart of FIG. 47 in the seventeenth embodiment.
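The branching of steps S2205 through S223 above can be summarized as a frame-routing sketch; the detector callbacks, the consecutive-frame counter, and the returned step labels are hypothetical stand-ins for the "general image recognition" the embodiment relies on.

```python
class AbsenceCounter:
    """Counts consecutive eye-absent frames; the limit is an assumed
    stand-in for the embodiment's 'certain period of time' (S2206)."""
    def __init__(self, limit=5):
        self.limit, self.count = limit, 0
    def note_absent(self):
        self.count += 1
        return self.count >= self.limit
    def note_present(self):
        self.count = 0

def route_frame(frame, eyes_in, finger_in, counter):
    """Route one captured frame through steps S2205 -> S2206/S2207 -> S223.

    eyes_in / finger_in are assumed detector callbacks returning bool;
    the returned labels name the step reached, for illustration only.
    """
    if not eyes_in(frame):                 # S2205: no eye image in frame
        if counter.note_absent():          # S2206: absent for whole period?
            return "S2207: sound warning"  # notification control unit acts
        return "S2205: keep checking"      # loop back and keep watching
    counter.note_present()                 # eyes visible: reset the count
    if finger_in(frame):                   # S223: does a finger appear?
        return "S224: frame extracted"     # extraction unit keeps this frame
    return "S223: frame discarded"
```

A frame with both eyes and a finger reaches step S224, while eye-absent frames trigger the warning only once the counter reaches its limit; the same routing applies to steps S2405 through S243 of the screen display processing.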
- FIG. 58 is a flowchart for explaining the procedure of screen display processing using the visible screen S in the terminal device 9C of the twentieth embodiment.
- the same step numbers are assigned to the same processes as those of the flowchart of FIG. 48 in the seventeenth embodiment, and detailed description thereof will be omitted.
- it is assumed that the display device control program is being executed on the terminal 30H; that is, the display device 20, the communication unit 40, and the imaging device 60B are powered on, and communication between the terminal 30H, the display device 20, and the imaging device 60B is enabled.
- the user operates the terminal 30H to display the menu screen on the display unit 31. Then, the user taps the icon of the screen display processing program on the menu screen to select the screen display processing program.
- the control unit 34H of the terminal 30H reads the screen display processing program from the storage unit 33 and performs screen display processing using the visible screen S according to the processing flow shown in FIG. 58. This screen display processing may be executed automatically when the original screen M is displayed on the display device 20.
- the control unit 34H controls the imaging device 60B to start an imaging operation of imaging the user's eyes (S241).
- the user performs a desired operation with a finger on the visible screen S corresponding to the original screen M currently displayed on the display device 20 .
- the original screen M or the original screen M and the finger performing the operation are reflected in the user's eyes.
- the image of the user's eyes is captured by the imaging device 60B, and the image data obtained by the imaging device 60B is sent to the image processing section 62.
- the image processing unit 62 performs predetermined image processing on the image data, and the image data subjected to the image processing is sent from the imaging device 60B to the control unit 34H by wireless communication (S242).
- the eye presence/absence determination unit 354 determines whether an eye image exists in the sent image data using a general image recognition method (S2405).
- when the eye presence/absence determination unit 354 determines that an image of an eye exists in the image data, it sends the image data to the image data extraction unit 342, and the process proceeds to step S243.
- on the other hand, when the eye presence/absence determination unit 354 determines that no image of an eye exists in the image data, it does not send the image data to the image data extraction unit 342. In this case, the eye presence/absence determination unit 354 determines whether image data in which the image of the user's eyes does not exist has been acquired continuously by the imaging device 60B for a certain period of time (S2406).
- when the eye presence/absence determination unit 354 determines that image data without an image of the user's eyes has been acquired continuously for the certain period of time by the imaging device 60B, it sends a signal to that effect to the notification control unit 355.
- the notification control unit 355 recognizes that the user is dozing off, and controls the speaker unit 80 to generate a predetermined warning sound from the speaker unit 80 (S2407). After that, the process proceeds to step S243.
- in step S243, the image data extraction unit 342 uses a general image recognition technique to determine whether a finger is present in the image data sent from the eye presence/absence determination unit 354 (image data in which the image of the user's eyes exists), and extracts the image data in which the finger is present from the series of image data captured by the imaging device 60B. After that, the process proceeds to step S244.
- the processing after step S244 is the same as the processing according to the flowchart of FIG. 48 in the seventeenth embodiment.
- the terminal device of the 20th embodiment has the same actions and effects as those of the 17th embodiment.
- in this way, the terminal device of the twentieth embodiment includes an eye presence/absence determination unit that determines whether an image of the user's eyes exists in the image data captured by the imaging device and detects when image data without the image of the user's eyes has been acquired continuously by the imaging device for a certain period of time, and a notification control unit that controls the speaker unit so that sound is generated from the speaker unit when such continuous acquisition is detected.
- therefore, for example, when the eye presence/absence determination unit detects that no image of the driver's eyes exists, the notification control unit generates an alarm from the speaker unit, so that drowsy driving can be prevented.
- the eye presence/absence determination unit and notification control unit in the 20th embodiment may be provided in the terminal of the terminal device in the 18th embodiment or the 19th embodiment. Further, the eye presence/absence determination section and notification control section in the 20th embodiment may be provided in the terminal of the terminal device in the 18th embodiment together with the image conversion section in the 19th embodiment.
- FIG. 59(a) is a schematic plan view of a terminal device according to the twenty-first embodiment of the present invention
- FIG. 59(b) is a schematic right side view of the terminal device
- FIG. 60 is a schematic perspective view of the terminal device shown in FIG. 59.
- components having the same functions as those in the above-described third embodiment are denoted by the same reference numerals, and detailed description thereof will be omitted.
- the terminal device 2C of the twenty-first embodiment includes spectacles 10, a display device 20A provided on the spectacles 10, a terminal 30A having a display unit 31 and configured separately from the spectacles 10, a communication unit 40, and an imaging device 60 for imaging the scene in front of the user.
- the terminal device 2C of the twenty-first embodiment differs from the terminal device 2A of the third embodiment only in the configuration of the display device.
- in other respects, the terminal device 2C is exactly the same as the terminal device 2A of the third embodiment.
- the display device 20A includes a small projector (not shown) having a liquid crystal panel (display device), an optical system (not shown), and a half mirror 24 that reflects part of the light (image).
- the half mirror 24 is arranged in front of the user's face and within the user's field of view. Various sizes and shapes can be used for the half mirror 24.
- the half mirror 24 is embedded in a prism arranged in front of the lens portion 11 of the spectacles 10 .
- the optical system is composed of this prism alone or a combination of this prism and a lens or the like.
- the display device 20A and the imaging device 60 may be provided detachably from the spectacles 10, respectively.
- the half mirror 24 may be integrated with the prism placed in front of the lens portion 11 of the spectacles 10 or may be attached to the prism placed in front of the lens portion 11 of the spectacles 10 . Also, the half mirror 24 may be attached to the lens portion 11 of the spectacles 10 , embedded in or integrated with the lens portion 11 of the spectacles 10 .
- the images and videos displayed on the liquid crystal panel of the small projector are projected onto the half mirror 24 via the optical system. As a result, a very small original screen is displayed on the half mirror 24 .
- the visible screen S can be seen as if it is floating in the air.
- although the image or video displayed on the liquid crystal panel of the small projector is projected onto the entire half mirror 24 here, it may instead be projected onto only a portion of the half mirror 24.
- the terminal device of the twenty-first embodiment has the same functions and effects as those of the third embodiment.
- the display device that projects an image onto a half mirror, described in the twenty-first embodiment, can be applied not only to the terminal device of the third embodiment but also to the terminal devices of the other embodiments described above.
- the imaging lens of the imaging device 60B may be installed near the half mirror 24 of the display device 20A.
- a case where such a configuration is applied to the terminal device of the seventeenth embodiment will be described below.
- FIG. 61(a) is a schematic plan view of the terminal device of the seventeenth embodiment in which a display device for projecting an image onto a half mirror and an imaging lens are attached near the half mirror
- FIG. 61(b) is a schematic right side view of the terminal device.
- FIG. 62 is a schematic perspective view of the terminal device shown in FIG. 61.
- as shown in FIG. 61, by mounting the imaging lens near the half mirror 24, the deviation between the direction of the optical axis of the imaging lens facing the user's eyes and the line-of-sight direction when the user views the original screen M can be minimized. Therefore, the operation position specifying unit can accurately specify the operation position within the original screen M.
- the display device 20A and the imaging device 60B may be detachably attached to the spectacles 10 .
- FIG. 63 is a schematic perspective view of a terminal device which is the twenty-second embodiment of the present invention.
- the same reference numerals are given to the parts having the same functions as those of the first embodiment described above, and the detailed explanation thereof will be omitted.
- the terminal device 1C of the twenty-second embodiment includes a face shield 1000 as an attachment worn on the user's head, a display device 20 provided on the face shield 1000, a terminal 30 having a display unit 31 and configured separately from the face shield 1000, and a communication unit 40.
- the terminal device 1C of the 22nd embodiment differs from the terminal device 1A of the first embodiment described above only in that the face shield 1000 is used instead of the glasses.
- the terminal device 1C of the twenty-second embodiment is exactly the same as the terminal device 1A of the first embodiment.
- the face shield 1000 has a transparent shield part 1001 configured to partially or entirely cover the surface of the user's face, and a frame 1002 that fixes the shield part 1001.
- the hologram sheet 23 of the display device 20 is, for example, attached to a predetermined portion of the shield portion 1001, or embedded in or integrated with the shield portion 1001.
- in FIG. 63, a small rectangular sheet is used as the hologram sheet 23, and the hologram sheet 23 is attached horizontally to the upper, slightly right-of-center portion of the shield part 1001.
- FIG. 64 is a diagram showing an example of a large-sized hologram sheet 23 attached to the terminal device of the twenty-second embodiment.
- a large rectangular hologram sheet 23 is used, and the hologram sheet 23 is attached to the lower portion of the shield section 1001 in a horizontally long state.
- in this case, it is desirable that the part of the shield part 1001 that includes the mounting position of the hologram sheet 23 be made of a translucent or non-transparent material, or that a translucent or non-transparent film be affixed to that part of the shield part 1001.
- FIG. 65 shows the terminal device of FIG. 64 configured such that a portion of shield portion 1001 of face shield 1000 is translucent or non-transparent. In FIG. 65, a translucent film 1003 is attached to the lower half of the shield portion 1001 of the face shield 1000.
- the terminal device of the twenty-second embodiment has the same functions and effects as those of the first embodiment.
- although the display device and the terminal are wirelessly connected in the twenty-second embodiment, the display device and the terminal may instead be connected by wire using a cable.
- FIG. 66 is a schematic perspective view of the terminal device of the second embodiment using the face shield 1000 as an attachment.
- FIG. 67 is a schematic perspective view of the terminal device of the twenty-first embodiment in which the face shield 1000 is used as an attachment and the display device 20A and imaging device 60 are wirelessly connected to the terminal 30A
- FIG. 68 is a schematic perspective view of the terminal device of the twenty-first embodiment in which a face shield 1000 is used as the attachment and the display device 20A and the imaging device 60 are connected to the terminal 30A by wire.
- in FIGS. 67 and 68, the display device 20A and the imaging device 60 can be configured to be detachable from the face shield 1000.
- the characteristic configuration of each of the above embodiments may be applied to other embodiments. That is, the present invention also includes a terminal device configured by freely combining the characteristic configurations of the above embodiments.
- although the case where the user operates the visible screen with a finger has been described in each of the above embodiments, the visible screen may instead be operated with an input pointing tool.
- in this case, when the imaging device captures an image of the original screen and the input pointing tool reflected in the user's eyes, the operation determination unit determines, based on the series of image data obtained by the imaging, what type of operation among various operations is being performed with the input pointing tool, and the operation position specifying unit specifies, based on the series of image data, which position in the original screen is the target of the operation with the input pointing tool.
- the user may attach a predetermined mark to the fingertip when performing a touch operation on the visible screen with the finger.
- Simple figures such as circles and squares, symbols, and the like can be used as the marks.
- as a method of marking the fingertip, not only drawing the mark directly on the fingertip but also pasting a sticker bearing the mark on the fingertip, or wearing a finger sack or ring bearing the mark, can be used.
- as shown in FIGS. 9A and 10A, when a touch operation is performed with a finger from the front side of the visible screen, the mark may be attached to the nail side of the finger.
- when a touch operation is performed with a finger not only from the front side of the visible screen but also from the back side, the mark may be attached to both the fingernail side and the fingertip or finger pad side (the part with the fingerprint).
- when a touch operation is performed with a finger only from the back side of the visible screen, the mark may be attached only to the fingertip or the finger pad side.
- the image data extraction unit extracts the image data in which the mark exists as the image data in which the finger exists.
- the operation determination unit determines what kind of operation the finger operation is based on the movement of the mark, and the operation position identification unit determines the position of the mark as the target position of the finger operation. Identify.
- it is easier and more accurate to recognize a mark such as a simple figure than to recognize a finger itself, so the accuracy of image recognition can be improved.
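The point that a simple mark is easier to recognize than a finger can be illustrated with a minimal sketch: locating a mark of a known color reduces to finding the centroid of matching pixels. The pixel-grid representation and the color predicate are assumptions for illustration; a real implementation would use a proper image recognition library.

```python
def find_mark(pixels, is_mark_color):
    """Locate the mark as the centroid of pixels matching its color.

    pixels: 2D grid of pixel values (rows of any comparable values)
    is_mark_color: assumed predicate for the mark's color
    Returns a (row, col) centroid, or None if no mark pixel is found.
    """
    hits = [(r, c)
            for r, row in enumerate(pixels)
            for c, v in enumerate(row)
            if is_mark_color(v)]
    if not hits:
        return None  # no mark: this frame would not be extracted
    rows = sum(r for r, _ in hits) / len(hits)
    cols = sum(c for _, c in hits) / len(hits)
    return (rows, cols)

# A toy 4x4 frame: 1 marks the sticker color, 0 is background.
frame = [[0, 0, 0, 0],
         [0, 1, 1, 0],
         [0, 1, 1, 0],
         [0, 0, 0, 0]]
assert find_mark(frame, lambda v: v == 1) == (1.5, 1.5)
```

A single color test per pixel is far simpler than recognizing a finger's varying shape, which is the advantage the passage above describes.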
- in each of the above embodiments, the display device includes a small projector (projection device) having a display device, an optical system, and a projection unit (a hologram sheet or a half mirror) onto which the original screen displayed on the display device of the projector is projected via the optical system; however, a translucent screen, a transmissive screen, or a hologram optical element, for example, may be used instead of the hologram sheet or half mirror.
- in each of the above embodiments, the display device includes a small projector (projection device) having a display device, an optical system such as a lens or prism, and a projection unit onto which the original screen displayed on the display device of the projector is projected via the optical system.
- in this case, the optical system may include a light guide plate and a waveguide. That is, the optical system may be composed of some or all of lenses, reflectors, prisms, light guide plates, waveguides, and the like. Other optical elements may also be used instead of the light guide plate and the waveguide.
- in each of the above embodiments, a display device having a small projector, an optical system, a hologram sheet, and the like is used; however, the display device may instead be composed only of a transmissive or transparent display device, without a small projector, an optical system, or a hologram sheet.
- This transmissive or transparent display device is placed in front of the user's face and within the user's field of vision.
- a transmissive or transparent display device for example, a transmissive or transparent liquid crystal panel, a transmissive or transparent organic EL panel, or a transmissive or transparent inorganic EL panel can be used.
- a cover may be attached to the outside of the display device. This makes it possible to make the screen of the display device easier to see and block other people's line of sight. Also, instead of the cover, an electronic curtain using an electrochromic material may be used.
- in each of the above embodiments, a display device having a small projector, an optical system, a hologram sheet, and the like is used; however, the display device may instead be composed only of a non-transmissive, non-transparent display device, without a small projector, an optical system, or a hologram sheet.
- the display device is positioned in front of the user's face and within the user's field of view.
- a display device that is not transparent or transmissive for example, a normal liquid crystal panel, an organic EL panel, or an inorganic EL panel can be used.
- the display device and the imaging device may be detachably attached to the attachment worn on the user's head.
- when the display device includes a small projector (projection device) having a display device, an optical system, and a projection unit onto which the original screen displayed on the display device of the projector is projected via the optical system, only the projection unit may be detachably attached to the attachment worn on the user's head.
- FIG. 81 is a diagram showing an example of a hologram sheet detachably attached to the lens portion 11 of the spectacles 10.
- FIG. 81 the user can reattach the hologram sheet to the desired position of the lens portion 11 of the spectacles 10 any number of times.
- FIG. 82 is a diagram showing an example of a hologram sheet detachably attached to the lens portion 11 of the spectacles 10.
- an attachment having substantially the same shape as the lens portion 11 of the spectacles 10 is used, and a hologram sheet is attached to a part of or the entire surface of this attachment.
- the user can detachably attach the attachment to the spectacles 10 . That is, the hologram sheet is detachably attached to the spectacles 10 by using an attachment.
- in FIG. 82, the attachment is attached to the portion for one eye of the spectacles 10, but it may be attached to the portions for both eyes.
- when the display device includes a small projector (projection device) having a display device, an optical system, and a projection unit onto which the original screen displayed on the display device of the projector is projected via the optical system, the small projector alone, or the small projector together with the optical system, may be detachably attached to the object worn on the user's head.
- the imaging device may also be detachably attached to the wearable object.
- the projection unit may be separately provided detachably from the wearable object.
- although one original screen is displayed on the display device in each of the above embodiments, the present invention is not limited to this, and two original screens may be displayed on the display device.
- the user sees the two visible screens as if they were floating in the air.
- the keyboard image on the character input screen is divided into two, and the keyboard image consists of a right keyboard image and a left keyboard image
- the user then sees the right keyboard image and the left keyboard image floating in the air and can input characters with the fingers of both hands.
- the terminal device of the present invention includes one display device, but the terminal device of the present invention may include two display devices.
- FIG. 69(a) is a schematic perspective view of a terminal device having two display devices of a type having a hologram sheet
- FIG. 69(b) is a schematic perspective view of a terminal device having two display devices of the type having a half mirror.
- in FIG. 69(a), the projector 21 and the optical system 22 of one display device 20 are attached to the right temple of the eyeglasses 10, and the hologram sheet 23 of that display device 20 is attached to the lens portion 11 for the right eye; the projector 21 and the optical system 22 of the other display device 20 are attached to the left temple, and its hologram sheet 23 is attached to the lens portion 11 for the left eye.
- here, small rectangular hologram sheets 23 are used, and the two hologram sheets 23, 23 are attached horizontally to the upper right and upper left portions of the respective lens portions 11. In FIG. 69(b), the half mirror of one display device 20a is embedded in a prism arranged in front of the right lens portion 11 of the glasses 10, and the half mirror of the other display device 20a is embedded in a prism arranged in front of the left lens portion 11 of the spectacles 10.
- in both cases of FIGS. 69(a) and 69(b), the user perceives the visible screen of one display device with the right eye and the visible screen of the other display device with the left eye. If an imaging device is provided for each display device, each imaging device captures an image of the eye on its own side.
- FIG. 70 is a diagram showing an example of a hologram sheet attached to the lenses of eyeglasses in a terminal device having two display devices.
- a small rectangular hologram sheet is used as the hologram sheet, and two hologram sheets are vertically elongated and attached to the upper right or upper left part of the lens portion.
- a large rectangular hologram sheet is used as the hologram sheet, and two hologram sheets are attached to the upper right or upper left part of the lens section in a horizontally long state.
- two hologram sheets are respectively attached to the entire surface of the right-eye lens portion or the left-eye lens portion.
- a transmissive solar cell may be attached to an item worn on the user's head, and power may be supplied from the solar cell to a display device or the like.
- Various methods are conceivable for attaching this solar cell to the wearable object.
- FIGS. 71 to 73 are diagrams for explaining methods of attaching a solar cell to a terminal device having eyeglasses as the attachment, and FIGS. 74 and 75 are diagrams for explaining methods of attaching a solar cell to a terminal device having a face shield as the attachment.
- the solar cell 301 is integrated with the lens portion 11 of the spectacles 10 .
- the solar cell 302 is formed in the same shape as the two lens portions 11, 11 of the spectacles 10, and attached to the lens portions 11, 11 of the spectacles 10 from the front side thereof.
- the solar cell 302 is detachable from the lens portions 11,11.
- the solar cell 303 is formed in the same shape as the two lens portions 11, 11 of the spectacles 10, and attached to the lens portions 11, 11 of the spectacles 10 from the front side thereof.
- This solar cell 303 can be flipped up as shown in FIG. 73(c).
- FIGS. 73(a) and (b) show a state in which the solar cell 303 is flipped up.
- the flip-up type solar cell 303 is fixedly attached to the spectacles 10, but may be detachable.
- the solar cell 304 is formed in the shape of the brim of the face shield 1000 and attached to the frame 1002 of the face shield 1000 .
- FIG. 74(a) shows a diagram of the terminal device viewed obliquely from above
- FIG. 74(b) shows a diagram of the terminal device viewed from the side.
- the solar cell 305 is attached to a part of the shield part 1001 of the face shield 1000 .
- the solar cell 305 is formed to have approximately half the size of the shield portion 1001 and is attached to the lower portion of the shield portion 1001 .
- when a solar cell is attached to the wearable object, it is necessary to provide a power supply unit that stores the electricity generated by the solar cell. When the terminal and the display device are wirelessly connected, the power supply unit is provided in the housing or in the attachment. When the terminal and the display device are connected by wire, the power supply unit is provided in both or one of the housing (or the wearable object) and the terminal; in particular, when the power supply unit is provided only in the terminal, the generated electricity is stored in the terminal, and power is supplied from the terminal to the display device by wire.
- FIG. 76 is a diagram for explaining the mounting location of the touch pad section 90 in a terminal device using the face shield 1000 as an attachment.
- the touch pad section 90 can be attached to the right side of the frame 1002 as shown in FIG. 76(a), or attached to the lower right side of the shield section 1001 as shown in FIG. 76(b).
- Although the touch pad section 90 is fixedly attached in these examples, it may be configured to be detachable.
- The touch pad section may also be divided into two parts, one having a simple keyboard function and the other having functions such as an operation panel, and these parts may be mounted on the left and right sides of the terminal device.
- Other mobile terminals: mobile phones, smart phones, smart watches, tablet terminals, digital audio players, and laptop computers.
- Operation screens for other information terminals: personal computers, etc.
- Remote control screens for home appliances: lighting, televisions, air conditioners, security systems, etc.
- Control panel screens for electrical equipment: car stereos, car navigation systems, in-vehicle AV equipment, air conditioners, etc.
- When the user operates the terminal to display an operation screen or the like as the original screen on the display device, and then operates the visible screen corresponding to that original screen with a finger, the control unit (remote control unit) of the terminal generates a command signal (command) indicating the content of the operation, and wirelessly transmits the generated command signal to the portable terminal or the like via the communication unit.
- The terminal device of the present invention can also be used as a remote controller for remotely controllable devices such as air conditioners.
- FIG. 77 is a diagram showing a situation when a user operates a visible screen corresponding to the remote controller screen using a screen (remote control screen) corresponding to the operation unit of the remote controller for the air conditioner as the original screen.
- When the user uses a finger to operate a button on the visible screen that instructs the set temperature to be lowered, the control unit of the terminal generates a command to lower the set temperature and transmits the command to the air conditioner by infrared communication via the communication unit, so that the user can easily lower the set temperature of the air conditioner.
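The command-generation step above can be sketched as follows. The operation names and command byte values are hypothetical placeholders chosen for illustration; the disclosure does not specify an encoding.

```python
# Sketch of the remote-control flow: a recognized button operation on the
# visible screen is mapped to a command signal for the air conditioner.
# All names and byte codes below are assumptions, not part of the disclosure.

AIRCON_COMMANDS = {
    "temp_down": b"\x10",   # lower the set temperature
    "temp_up":   b"\x11",   # raise the set temperature
    "power":     b"\x01",   # toggle power
}

def generate_command(operation: str) -> bytes:
    """Return the command signal for the recognized button operation."""
    if operation not in AIRCON_COMMANDS:
        raise ValueError(f"unknown operation: {operation}")
    return AIRCON_COMMANDS[operation]
```

In the described flow, the returned bytes would then be handed to the communication unit for infrared transmission.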
- FIG. 78 is a diagram showing an example of an original screen for an operation screen when making a call with a mobile phone.
- When the user inputs a desired telephone number on the visible screen corresponding to the original screen for inputting a telephone number as shown in FIG. 78(a) or (b), the control unit of the terminal generates a corresponding command and transmits the command to the mobile phone via the communication unit, thereby allowing the user to make a call without holding the mobile phone in hand.
- the terminal is not limited to a smartphone or a tablet terminal.
- An audio player, a personal computer, a car navigation system, an in-vehicle AV device, a dedicated terminal, or the like may be used.
- the terminal may be used as a touch pad section for instructing various operations (for example, pointer movement, character input, etc.) on the screen of the display device. That is, the terminal may have a touch pad function.
- When a smartphone or a tablet terminal is used as the terminal, an image of a touch pad can be displayed on the display section of the terminal, and the screen itself of the display section can be used as the touch pad section.
- FIG. 79 is a diagram showing how the screen of the terminal is used as a touch pad when a smartphone is used as the terminal.
- the user operates the screen (touch pad section) of the display section 31 using a finger or an input indicator.
- The terminal 30A is provided with a position detection unit that, when a contact operation is performed on the screen of the display unit 31, detects the touched position and outputs contact position information indicating the detected position to the input control unit 346.
- Data on the image of the touch pad displayed on the display unit 31 is stored in the storage unit 33 of the terminal 30A.
- The input control unit 346 displays the image of the touch pad on the display unit 31 and, when the user performs a contact operation on the image of the touch pad, recognizes the content of the contact operation based on the contact position information for that contact operation sent from the position detection unit and the data on the image of the touch pad stored in the storage unit 33, and controls the original screen M displayed on the display device 20 in accordance with the recognized content of the contact operation.
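The recognition step above, matching a detected contact position against the stored touch-pad image layout, can be sketched as follows. The region names and coordinates are illustrative assumptions; the disclosure only states that the layout data is held in the storage unit 33.

```python
# Minimal sketch of touch-pad contact recognition: the position detection
# unit reports (x, y), and the input control unit looks the point up in the
# stored touch-pad layout. Regions are (x0, y0, x1, y1) rectangles and are
# hypothetical, chosen for illustration only.

TOUCHPAD_LAYOUT = {
    "pointer_area": (0, 0, 300, 200),
    "left_button":  (0, 200, 150, 260),
    "right_button": (150, 200, 300, 260),
}

def recognize_contact(x: int, y: int) -> str:
    """Return the name of the touch-pad region the contact hit."""
    for name, (x0, y0, x1, y1) in TOUCHPAD_LAYOUT.items():
        if x0 <= x < x1 and y0 <= y < y1:
            return name
    return "outside"
```

The recognized region name would then drive control of the original screen M.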
- a touch pad or keyboard attached to the personal computer can be used as the touch pad section.
- As the touch pad section, one having a mouse function, one having a simple keyboard function, or one having both a mouse function and a simple keyboard function can be used.
- the touch pad portion is not limited to these, and may be one having a number panel, an operation panel, or the like.
- the terminal may be used as a mouse for moving a cursor displayed on the screen of the display device or selecting an object displayed on the screen. That is, the terminal may have a mouse function.
- The terminal is provided with a movement information output section that detects the movement direction of the terminal, measures the movement amount of the terminal, and outputs movement information indicating the detected movement direction and the measured movement amount to the input control unit.
- the storage unit of the terminal stores data representing the correspondence relationship between the movement information of the terminal and the operation related to the cursor displayed on the display device.
- The input control unit recognizes the operation related to the cursor based on the movement information for the movement sent from the movement information output unit and the data representing the correspondence relationship stored in the storage unit, and controls the original screen accordingly.
- The data representing the above-mentioned correspondence relationship may include the following: moving the terminal in the right (left) direction of the display screen corresponds to moving the cursor in the right (left) direction; moving it up (down) corresponds to moving the cursor up (down); moving the terminal once in the direction perpendicular to the display screen corresponds to performing a tap operation; moving the terminal twice in that direction corresponds to performing a double-tap operation; and a further movement pattern corresponds to performing a swipe operation.
- FIG. 80 is a diagram showing how the terminal is used as a mouse to instruct movement of the cursor when a smartphone is used as the terminal. If the terminal has not only the mouse function but also the above-mentioned touch pad function, the user can use the mouse function to instruct the terminal to move the cursor, and use the touch pad function to instruct the terminal to tap, double-tap, swipe, drag, and so on.
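The correspondence table between movement information and cursor operations described above can be sketched as a simple lookup. The direction names, stroke counts, and operation labels are assumptions made for illustration; the disclosure describes the correspondence only abstractly.

```python
# Sketch of the stored correspondence between terminal movement information
# and cursor-related operations. Keys are (direction, perpendicular_strokes);
# all names are hypothetical placeholders.

CURSOR_OPERATIONS = {
    ("right", 0): "move_cursor_right",
    ("left", 0): "move_cursor_left",
    ("up", 0): "move_cursor_up",
    ("down", 0): "move_cursor_down",
    ("perpendicular", 1): "tap",         # one stroke toward the screen
    ("perpendicular", 2): "double_tap",  # two strokes toward the screen
}

def interpret_movement(direction: str, strokes: int = 0) -> str:
    """Map detected movement information to a cursor-related operation."""
    return CURSOR_OPERATIONS.get((direction, strokes), "none")
```

The input control unit would then control the original screen according to the returned operation.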
- In the terminal device of each of the embodiments described above, glasses or a face shield is used as the wearable item worn on the user's head; however, head-mounted display (HMD) type wearable items and goggle type wearable items can also be used.
- In the terminal device of the present invention, the object worn on the head of the user and the terminal equipped with the display unit are configured separately, so an existing mobile terminal or the like can be used as the terminal.
- By using an existing portable terminal or the like as the terminal in this manner, the number of parts of the wearable object can be reduced, and the structure of the wearable object can be simplified.
- Moreover, by using a commercially available smartphone or the like as the terminal, the user can operate the device with a familiar smartphone or the like, so operability can be improved. Therefore, the present invention can be applied to a terminal device having an attachment worn on the user's head, such as eyeglasses or a face shield.
Landscapes
- Engineering & Computer Science (AREA)
- Theoretical Computer Science (AREA)
- General Engineering & Computer Science (AREA)
- Human Computer Interaction (AREA)
- Physics & Mathematics (AREA)
- General Physics & Mathematics (AREA)
- Computer Networks & Wireless Communication (AREA)
- Signal Processing (AREA)
- Business, Economics & Management (AREA)
- General Business, Economics & Management (AREA)
- Optics & Photonics (AREA)
- Position Input By Displaying (AREA)
- Controls And Circuits For Display Device (AREA)
- User Interface Of Digital Computer (AREA)
Abstract
Description
The wearable object is provided with an imaging device that, when the user performs an operation on the visible screen with a finger or a predetermined input instruction tool, images the finger or input instruction tool performing the operation, and outputs the image data obtained by the imaging to the terminal wirelessly or by wire.

The terminal may include:

a storage unit that stores various data including data on the original screen;

an operation determination unit that, when the imaging device images the finger or input instruction tool with which the user performed an operation on the visible screen, determines, based on the image data obtained by the imaging, which of various operations the operation by the finger or input instruction tool is;

a position data generation unit that, when the imaging device images the finger or input instruction tool with which the user performed an operation on the visible screen, generates, based on the image data obtained by the imaging, position data of the finger or input instruction tool within the imaging range, which is the range that the imaging device can capture;

a reference data generation unit that, when the user performs an operation with a finger or input instruction tool at one or more predetermined positions on the visible screen, generates data on the visible screen for specifying the position and size of the visible screen by using the position data of the finger or input instruction tool generated by the position data generation unit based on the image data for which the operation determination unit determined that the operation at each predetermined position is a predetermined operation, and stores the generated data in the storage unit as reference data; and

an input control unit that, when the user performs an operation on the visible screen with a finger or input instruction tool, specifies the range of the visible screen within the imaging range based on the data on the content of the operation by the finger or input instruction tool obtained by the operation determination unit, the position data of the finger or input instruction tool generated by the position data generation unit, the reference data on the visible screen stored in the storage unit, and the data on the original screen corresponding to the visible screen stored in the storage unit, and, by examining at which position within the specified range of the visible screen the operation by the finger or input instruction tool was performed, recognizes the content of the input instruction corresponding to the operation by the finger or input instruction tool, and, in accordance with the recognized content of the input instruction, controls the screen displayed on the display unit and the original screen displayed on the display device.
The wearable object is provided with an imaging device that captures images of the user's eye, thereby acquiring, when the user performs an operation on the visible screen with a finger or a predetermined input instruction tool, images of the original screen and of the finger or input instruction tool reflected in the user's eye, and outputs the acquired image data to the terminal wirelessly or by wire.

The terminal may include:

a storage unit that stores various data including data on the original screen;

an operation determination unit that, when the imaging device images the original screen and the finger or input instruction tool reflected in the user's eye, determines, based on the series of image data obtained by the imaging, which of various operations the operation by the finger or input instruction tool is;

an operation position specifying unit that, when the imaging device images the original screen and the finger or input instruction tool reflected in the user's eye, specifies, based on the series of image data obtained by the imaging, which position within the original screen was the target of the operation by the finger or input instruction tool; and

an input control unit that, when the user performs an operation on the visible screen with the finger or input instruction tool, recognizes the content of the input instruction corresponding to the operation performed on the visible screen based on the data on the content of the operation by the finger or input instruction tool obtained by the operation determination unit, the data representing the position within the original screen targeted by the operation obtained by the operation position specifying unit, and the data on the original screen stored in the storage unit, and, in accordance with the recognized content of the input instruction, controls the screen displayed on the display unit and the original screen displayed on the display device.
It is desirable that the operation determination unit determines, based on the series of image data extracted by the image data extraction unit, which of various operations the operation by the finger or input instruction tool is, and that the operation position specifying unit specifies, based on the series of image data extracted by the image data extraction unit, the position within the original screen targeted by the operation of the finger or input instruction tool. Since the series of image data extracted by the image data extraction unit then contains only image data in which the finger or instruction tool is present, the operation determination unit and the operation position specifying unit can each perform their processing efficiently.
First, a terminal device according to a first embodiment of the present invention will be described. FIG. 1 is a schematic perspective view of the terminal device according to the first embodiment of the present invention, FIG. 2(a) is a schematic plan view of that terminal device, and FIG. 2(b) is a schematic right side view of that terminal device. FIG. 3 is a schematic diagram for explaining how the original screen is projected onto the hologram sheet of the display device in the terminal device of the first embodiment, and FIG. 4 is a schematic block diagram of the terminal device of the first embodiment.

Next, a terminal device according to a second embodiment of the present invention will be described. FIG. 10 is a schematic perspective view of the terminal device according to the second embodiment of the present invention, and FIG. 11 is a schematic block diagram of the terminal device of the second embodiment. In the second embodiment, components having the same functions as those of the first embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a third embodiment of the present invention will be described. FIG. 13 is a schematic perspective view of the terminal device according to the third embodiment of the present invention, and FIG. 14 is a schematic block diagram of the terminal device of the third embodiment. In the third embodiment, components having the same functions as those of the first embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a fourth embodiment of the present invention will be described. FIG. 23 is a schematic block diagram of the terminal device according to the fourth embodiment of the present invention. In the fourth embodiment, components having the same functions as those of the third embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a fifth embodiment of the present invention will be described. FIG. 24 is a schematic block diagram of the terminal device according to the fifth embodiment of the present invention. A schematic perspective view of the terminal device of the fifth embodiment is substantially the same as that of the terminal device of the third embodiment shown in FIG. 13, except that some components are not illustrated. For this reason, FIG. 13 is also used here as the schematic perspective view of the terminal device of the fifth embodiment. In the fifth embodiment, components having the same functions as those of the third embodiment are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a sixth embodiment of the present invention will be described. FIG. 25 is a schematic block diagram of the terminal device according to the sixth embodiment of the present invention. In the sixth embodiment, components having the same functions as those of the fifth embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a seventh embodiment of the present invention will be described. FIG. 26 is a schematic block diagram of the terminal device according to the seventh embodiment of the present invention. A schematic perspective view of the terminal device of the seventh embodiment is substantially the same as that of the terminal device of the third embodiment shown in FIG. 13. For this reason, FIG. 13 is also used here as the schematic perspective view of the terminal device of the seventh embodiment. In the seventh embodiment, components having the same functions as those of the third embodiment are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to an eighth embodiment of the present invention will be described. FIG. 27 is a schematic block diagram of the terminal device according to the eighth embodiment of the present invention. In the eighth embodiment, components having the same functions as those of the seventh embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a ninth embodiment of the present invention will be described. FIG. 28 is a schematic block diagram of the terminal device according to the ninth embodiment of the present invention. A schematic perspective view of the terminal device of the ninth embodiment is substantially the same as that of the terminal device of the third embodiment shown in FIG. 13. For this reason, FIG. 13 is also used here as the schematic perspective view of the terminal device of the ninth embodiment. In the ninth embodiment, components having the same functions as those of the seventh embodiment described above are given the same reference numerals, and their detailed description is omitted.
Here,

dX = W × (Z − L)/L.

Focusing on triangles Cc-Pd-P and Cc-pc-p1, from {(X − Xc) + dX} : (x1 − xc) = Z : L,

X − Xc = (x1 − xc) × Z/L − dX
       = (x1 − xc) × Z/L − W × (Z − L)/L.

Further, focusing on triangles Ec-Pc-P and Ec-pc-p0, from (X − Xc) : (x0 − xc) = (Z + α) : (L + α),

x0 − xc = (X − Xc) × (L + α)/(Z + α)
        = {(x1 − xc) × Z/L − W × (Z − L)/L} × (L + α)/(Z + α).

Therefore,

x0 = (x0 − xc) + xc
   = {(x1 − xc) × Z/L − W × (Z − L)/L} × (L + α)/(Z + α) + xc   ... (1)

Similarly, considering FIG. 30 in the same way, the expression for y0 is

y0 = (y0 − yc) + yc
   = {(y1 − yc) × Z/L − H × (Z − L)/L} × (L + α)/(Z + α) + yc   ... (2)

Both equations (1) and (2) also hold when the user perceives the operation screen T to be located in front of the reference screen K.
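Equations (1) and (2) can be sketched as a small conversion function. The function name is an assumption; the parameter meanings (observed point (x1, y1), center point (xc, yc), and the quantities W, H, Z, L, α) follow the derivation above.

```python
def to_reference_screen(x1, y1, xc, yc, W, H, Z, L, alpha):
    """Apply equations (1) and (2): convert a point (x1, y1) observed on
    the operation screen T to the corresponding point (x0, y0) on the
    reference screen K, with W, H, Z, L, alpha as in the derivation."""
    scale = (L + alpha) / (Z + alpha)
    x0 = ((x1 - xc) * Z / L - W * (Z - L) / L) * scale + xc
    y0 = ((y1 - yc) * Z / L - H * (Z - L) / L) * scale + yc
    return x0, y0
```

As a sanity check, when Z = L (the operation screen coincides with the reference screen) the correction terms vanish and the function returns (x1, y1) unchanged.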
Next, a terminal device according to a tenth embodiment of the present invention will be described. FIG. 31 is a schematic block diagram of the terminal device according to the tenth embodiment of the present invention. In the tenth embodiment, components having the same functions as those of the ninth embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to an eleventh embodiment of the present invention will be described. FIG. 32 is a schematic block diagram of the terminal device according to the eleventh embodiment of the present invention. In the eleventh embodiment, components having the same functions as those of the ninth embodiment are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a twelfth embodiment of the present invention will be described. FIG. 33 is a schematic block diagram of the terminal device according to the twelfth embodiment of the present invention. In the twelfth embodiment, components having the same functions as those of the eleventh embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a thirteenth embodiment of the present invention will be described. FIG. 34 is a schematic block diagram of the terminal device according to the thirteenth embodiment of the present invention. A schematic perspective view of the terminal device of the thirteenth embodiment is substantially the same as that of the terminal device of the third embodiment shown in FIG. 13. For this reason, FIG. 13 is also used here as the schematic perspective view of the terminal device of the thirteenth embodiment. In the thirteenth embodiment, components having the same functions as those of the third embodiment are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a fourteenth embodiment of the present invention will be described. FIG. 38 is a schematic block diagram of the terminal device according to the fourteenth embodiment of the present invention. In the fourteenth embodiment, components having the same functions as those of the thirteenth embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a fifteenth embodiment of the present invention will be described. FIG. 39 is a schematic block diagram of the terminal device according to the fifteenth embodiment of the present invention. In the fifteenth embodiment, components having the same functions as those of the thirteenth embodiment are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a sixteenth embodiment of the present invention will be described. FIG. 40 is a schematic block diagram of the terminal device according to the sixteenth embodiment of the present invention. In the sixteenth embodiment, components having the same functions as those of the fifteenth embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a seventeenth embodiment of the present invention will be described. FIG. 41(a) is a schematic plan view of the terminal device according to the seventeenth embodiment of the present invention, and FIG. 41(b) is a schematic right side view of that terminal device. FIG. 42 is a schematic perspective view of the terminal device of the seventeenth embodiment, and FIG. 43 is a schematic block diagram of the terminal device of the seventeenth embodiment. In the seventeenth embodiment, components having the same functions as those of the third embodiment described above are given the same reference numerals, and their detailed description is omitted.
The display control unit 341 may add to the original screen M, together with or instead of the image showing the green lamp, an image showing characters or figures indicating that the operation by the finger has been recognized normally. Alternatively, the control unit 34E may emit a specific notification sound from the speaker unit 80 together with, or instead of, displaying an image indicating that the finger operation has been recognized normally.
Next, a terminal device according to an eighteenth embodiment of the present invention will be described. FIG. 50 is a schematic block diagram of the terminal device according to the eighteenth embodiment of the present invention. In the eighteenth embodiment, components having the same functions as those of the seventeenth embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a nineteenth embodiment of the present invention will be described. FIG. 53 is a schematic block diagram of the terminal device according to the nineteenth embodiment of the present invention. In the nineteenth embodiment, components having the same functions as those of the seventeenth embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a twentieth embodiment of the present invention will be described. FIG. 56 is a schematic block diagram of the terminal device according to the twentieth embodiment of the present invention. In the twentieth embodiment, components having the same functions as those of the seventeenth embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a twenty-first embodiment of the present invention will be described. FIG. 59(a) is a schematic plan view of the terminal device according to the twenty-first embodiment of the present invention, and FIG. 59(b) is a schematic right side view of that terminal device. FIG. 60 is a schematic perspective view of the terminal device shown in FIG. 59. In the twenty-first embodiment, components having the same functions as those of the third embodiment described above are given the same reference numerals, and their detailed description is omitted.

Next, a terminal device according to a twenty-second embodiment of the present invention will be described. FIG. 63 is a schematic perspective view of the terminal device according to the twenty-second embodiment of the present invention. In the twenty-second embodiment, components having the same functions as those of the first embodiment described above are given the same reference numerals, and their detailed description is omitted.

The present invention is not limited to the embodiments described above, and various modifications are possible within the scope of its gist.
10 Spectacles
11 Lens portion
1000 Face shield
1001 Shield portion
1002 Frame
1003 Translucent film
20, 20A Display device
21 Small projector
22 Optical system
23 Hologram sheet
24 Half mirror
30, 30A, 30B, 30C, 30D, 30E, 30F, 30G, 30H Terminal
31 Display unit
32 Communication unit
33 Storage unit
34, 34A, 34B, 34C, 34D, 34E, 34F, 34G, 30H Control unit
341 Display control unit
342, 342B Image data extraction unit
343 Operation determination unit
344, 344C Position data generation unit
345, 345D Reference data generation unit
346, 346D, 346E Input control unit
347C Deviation correction unit
348D Distance determination unit
349E Operation position specifying unit
351 Iris/pupil image data generation unit
352 Image difference extraction unit
353 Image conversion unit
354 Eye presence/absence determination unit
355 Notification control unit
40 Communication unit
50 Cable
60, 60A, 60B Imaging device
61 Camera unit
62 Image processing unit
63, 63A Camera control unit
631 Autofocus control unit
70 Microphone unit
80 Speaker unit
90 Touch pad unit
100 Housing
200 Character input screen
201 Character input screen (original screen for setting reference data)
210 Keyboard image
220 Display area
221 Search screen
2211 Keyword input section
2212 Search result display section
301, 302, 303, 304, 305 Solar cell
M Original screen
S Visible screen
K Reference screen
T Operation screen
Claims (29)
- A terminal device comprising: a wearable object worn on a user's head; a display device, provided on the wearable object, that displays an original screen corresponding to a visible screen that appears to the user to be floating in the air; and a terminal, configured separately from the wearable object, equipped with a display unit; wherein the terminal is connected to the display device wirelessly or by wire, and has a function of controlling the display device so that the screen displayed on the display unit is displayed on the display device as the original screen.
- The terminal device according to claim 1, wherein the display device comprises a projection device having a display device, and a hologram sheet or a half mirror; the hologram sheet or half mirror is disposed on the front side of the user's face and within the range of the user's visual field; and the original screen is displayed on the hologram sheet or half mirror by projecting an image from the projection device onto the hologram sheet or half mirror.
- The terminal device according to claim 1, wherein the display device comprises a projection device having a display device, an optical system, and a projection unit onto which the original screen displayed on the display device is projected via the optical system; and the projection unit is a translucent screen, a transmissive screen, a hologram sheet, a hologram film, a hologram optical element, or a half mirror.
- The terminal device according to claim 1, wherein the display device comprises a projection device having a display device, an optical system, and a projection unit onto which the original screen displayed on the display device is projected via the optical system; and the optical system is composed of some or all of lenses, reflecting mirrors, prisms, light guide plates, and waveguides.
- The terminal device according to claim 1, wherein the display device is a transmissive or transparent display device, and the display device is disposed on the front side of the user's face and within the range of the user's visual field.
- The terminal device according to claim 1, 2, 3, 4 or 5, further comprising a communication unit for the various devices provided on the wearable object to perform wireless communication with the outside, wherein the terminal has a function of performing wireless communication with the outside, and the display device performs wireless communication with the terminal via the communication unit.
- The terminal device according to claim 6, wherein, when performing wireless communication, the terminal and the display device each perform data communication after performing authentication based on identification information sent from the other party.
- The terminal device according to claim 1, 2, 3, 4, 5, 6 or 7, wherein, when displaying the screen displayed on the display unit on the display device as the original screen, the terminal displays on the display device, in accordance with settings concerning the screen display of the display device made by the user, a simplified version of the screen displayed on the display unit, a part of the screen displayed on the display unit, or a version of the screen displayed on the display unit in which characters and/or figures are enlarged.
- The terminal device according to claim 1, 2, 3, 4, 5, 6, 7 or 8, wherein, when displaying the screen displayed on the display unit on the display device as the original screen, the terminal maintains the display of that screen on the display unit as it is, or turns off the display unit, in accordance with settings concerning the screen of the display unit made by the user.
- The terminal device according to claim 1, 2, 3, 4, 5, 6, 7, 8 or 9, wherein, when displaying the screen displayed on the display unit on the display device as the original screen, the terminal displays, when a screen to be displayed on the display device is designated by the user, the designated screen on the display device separately from the screen currently displayed on the display unit.
- The terminal device according to any one of claims 1 to 10, wherein the terminal is a portable terminal and has a function of acquiring position information on its own position, and a function of creating a screen for guiding the user from the current position to a destination set by the user based on map information stored in the storage unit and the position information, and displaying the screen on the display unit.
- The terminal device according to any one of claims 1 to 11, wherein the terminal is a portable terminal and has a function of acquiring position information on its own position, and a function of searching for stores in the vicinity of the current position based on map information stored in the storage unit and the position information, and displaying information on the stores obtained by the search on the display unit.
- The terminal device according to any one of claims 1 to 12, wherein the display device is detachably provided on the wearable object.
- The terminal device according to claim 1, 2, 3, 4 or 5, wherein the wearable object is provided with an imaging device that, when the user performs an operation on the visible screen with a finger or a predetermined input instruction tool, images the finger or input instruction tool performing the operation, and outputs the image data obtained by the imaging to the terminal wirelessly or by wire, and the terminal comprises: a storage unit that stores various data including data on the original screen; an operation determination unit that, when the imaging device images the finger or input instruction tool with which the user performed an operation on the visible screen, determines, based on the image data obtained by the imaging, which of various operations the operation by the finger or input instruction tool is; a position data generation unit that, when the imaging device images the finger or input instruction tool with which the user performed an operation on the visible screen, generates, based on the image data obtained by the imaging, position data of the finger or input instruction tool within an imaging range, which is the range that the imaging device can capture; a reference data generation unit that, when the user performs an operation with a finger or input instruction tool at one or more predetermined positions on the visible screen, generates data on the visible screen for specifying the position and size of the visible screen by using the position data of the finger or input instruction tool generated by the position data generation unit based on the image data for which the operation determination unit determined that the operation at each predetermined position is a predetermined operation, and stores the generated data in the storage unit as reference data; and an input control unit that, when the user performs an operation on the visible screen with a finger or input instruction tool, specifies the range of the visible screen within the imaging range based on the data on the content of the operation by the finger or input instruction tool obtained by the operation determination unit, the position data of the finger or input instruction tool generated by the position data generation unit, the reference data on the visible screen stored in the storage unit, and the data on the original screen corresponding to the visible screen stored in the storage unit, and, by examining at which position within the specified range of the visible screen the operation by the finger or input instruction tool was performed, recognizes the content of the input instruction corresponding to the operation by the finger or input instruction tool, and, in accordance with the recognized content of the input instruction, controls the screen displayed on the display unit and the original screen displayed on the display device.
- The terminal device according to claim 14, wherein the terminal has a function of controlling the imaging device to adjust the imaging range of the imaging device, and a function of controlling the imaging device to adjust a depth of field, which is a range in the depth direction in which a subject is in focus.
- The terminal device according to claim 1, 2, 3, 4 or 5, wherein the wearable object is provided with an imaging device that captures images of the user's eye, thereby acquiring, when the user performs an operation on the visible screen with a finger or a predetermined input instruction tool, images of the original screen and of the finger or input instruction tool reflected in the user's eye, and outputs the acquired image data to the terminal wirelessly or by wire, and the terminal comprises: a storage unit that stores various data including data on the original screen; an operation determination unit that, when the imaging device images the original screen and the finger or input instruction tool reflected in the user's eye, determines, based on the series of image data obtained by the imaging, which of various operations the operation by the finger or input instruction tool is; an operation position specifying unit that, when the imaging device images the original screen and the finger or input instruction tool reflected in the user's eye, specifies, based on the series of image data obtained by the imaging, which position within the original screen was the target of the operation by the finger or input instruction tool; and an input control unit that, when the user performs an operation on the visible screen with the finger or input instruction tool, recognizes the content of the input instruction corresponding to the operation performed on the visible screen based on the data on the content of the operation obtained by the operation determination unit, the data representing the position within the original screen targeted by the operation obtained by the operation position specifying unit, and the data on the original screen stored in the storage unit, and, in accordance with the recognized content of the input instruction, controls the screen displayed on the display unit and the original screen displayed on the display device.
- The terminal device according to claim 16, wherein the operation position specifying unit, when the imaging device images the original screen and the finger or input instruction tool reflected in the user's eye, obtains, based on the series of image data obtained by the imaging, the range of the original screen within an imaging range, which is the range that the imaging device can capture, and the position of the finger or input instruction tool within the imaging range, and specifies the position within the original screen targeted by the operation of the finger or input instruction tool based on the obtained range of the original screen within the imaging range and the obtained position of the finger or input instruction tool within the imaging range.
- The terminal device according to claim 16 or 17, wherein the terminal further comprises an image data extraction unit that extracts, from the series of image data obtained by the imaging device, image data in which the finger or input instruction tool is present; and the operation determination unit determines, based on the series of image data extracted by the image data extraction unit, which of various operations the operation by the finger or input instruction tool is, and the operation position specifying unit specifies, based on the series of image data extracted by the image data extraction unit, the position within the original screen targeted by the operation of the finger or input instruction tool.
- The terminal device according to claim 18, wherein the terminal further comprises: an iris/pupil image data generation unit that, when the user's eye is imaged by the imaging device before the original screen is displayed on the display device, generates image data of the iris and pupil based on the image data obtained by the imaging and stores it in the storage unit; and an image difference extraction unit that, when the imaging device images the original screen and the finger or input instruction tool reflected in the user's eye, performs a process of extracting the difference between the image data obtained by the imaging and the image data of the iris and pupil stored in the storage unit, thereby generating image data from which the images of the iris and pupil have been removed; and the image data extraction unit performs the image data extraction process using the series of image data generated by the image difference extraction unit.
- The terminal device according to claim 18, wherein the terminal further comprises: an iris/pupil image data generation unit that, when the eye of the user wearing a contact lens is imaged by the imaging device before the original screen is displayed on the display device, generates image data of the contact lens, iris and pupil based on the image data obtained by the imaging and stores it in the storage unit; and an image difference extraction unit that, when the imaging device images the original screen and the finger or input instruction tool reflected in the user's eye, performs a process of extracting the difference between the image data obtained by the imaging and the image data of the contact lens, iris and pupil stored in the storage unit, thereby generating image data from which the images of the contact lens, iris and pupil have been removed; and the image data extraction unit performs the image data extraction process using the series of image data generated by the image difference extraction unit.
- The terminal device according to any one of claims 16 to 20, wherein the terminal further comprises: an eye presence/absence determination unit that determines whether an image of the user's eye is present in the image data obtained by the imaging device, and detects that image data in which no image of the user's eye is present has been continuously acquired by the imaging device for a certain period of time; and a notification control unit that, when the eye presence/absence determination unit detects that image data in which no image of the user's eye is present has been continuously acquired by the imaging device for the certain period of time, controls a notification device so as to emit sound or vibration from the notification device.
- The terminal device according to any one of claims 14 to 21, wherein the terminal is provided with a position detection unit that, when a contact operation is performed on the screen of the display unit, detects the contacted position and outputs contact position information indicating the detected position to the input control unit; data on an image of a touch pad to be displayed on the display unit is stored in the storage unit; and the input control unit, when the original screen is displayed on the display device, the image of the touch pad is displayed on the display unit, and the user performs a contact operation on the image of the touch pad, recognizes the content of the contact operation based on the contact position information for the contact operation sent from the position detection unit and the data on the image of the touch pad stored in the storage unit, and controls the original screen displayed on the display device in accordance with the recognized content of the contact operation.
- The terminal device according to any one of claims 14 to 22, wherein the terminal is provided with a movement information output unit that detects the movement direction of the terminal, measures the movement amount of the terminal, and outputs movement information indicating the detected movement direction and the measured movement amount to the input control unit; data representing a correspondence relationship between the movement information of the terminal and operations related to a cursor displayed on the display device is stored in the storage unit; and the input control unit, when the original screen is displayed on the display device and the user moves the terminal, recognizes the content of the operation related to the cursor based on the movement information for the movement sent from the movement information output unit and the data representing the correspondence relationship stored in the storage unit, and controls the original screen displayed on the display device in accordance with the recognized content of the operation.
- The terminal device according to any one of claims 14 to 23, wherein the original screen displayed on the display device includes a screen corresponding to an operation unit of a remote controller for a remotely controllable device; and the terminal further comprises a remote control unit that causes the display device to display the screen corresponding to the operation unit of the remote controller as the original screen, and, when an operation is performed on the visible screen corresponding to that original screen, generates a command signal indicating the content of the operation and wirelessly transmits the generated command signal to the remotely controllable device.
- The terminal device according to any one of claims 14 to 24, wherein the display device and the imaging device are detachably provided on the wearable object.
- The terminal device according to claim 2, 3 or 4, wherein a part of the display device is detachably provided on the wearable object.
- The terminal device according to any one of claims 1 to 26, further comprising a touch pad unit serving as an input device for the terminal, wherein the touch pad unit is detachably provided on the wearable object.
- The terminal device according to any one of claims 1 to 27, further comprising a sound output device that converts an electrical signal output from the terminal into sound and transmits the sound to the user via the ear or by bone conduction, wherein the sound output device is provided on the wearable object.
- The terminal device according to any one of claims 1 to 28, further comprising a sound input device that converts the user's voice into an electrical signal and outputs the electrical signal to the terminal, wherein the sound input device is provided on the wearable object.
Priority Applications (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US18/280,888 US20240089362A1 (en) | 2021-03-08 | 2021-07-30 | Terminal Device |
EP21929440.2A EP4307289A1 (en) | 2021-03-08 | 2021-07-30 | Terminal device |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2021-036577 | 2021-03-08 | ||
JP2021036577 | 2021-03-08 | ||
JP2021106022A JP7080448B1 (ja) | 2021-03-08 | 2021-06-25 | 端末装置 |
JP2021-106022 | 2021-06-25 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022190406A1 true WO2022190406A1 (ja) | 2022-09-15 |
Family
ID=81892212
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/028304 WO2022190406A1 (ja) | 2021-03-08 | 2021-07-30 | 端末装置 |
Country Status (4)
Country | Link |
---|---|
US (1) | US20240089362A1 (ja) |
EP (1) | EP4307289A1 (ja) |
JP (1) | JP7080448B1 (ja) |
WO (1) | WO2022190406A1 (ja) |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1198227A (ja) | 1997-09-22 | 1999-04-09 | Gakuichi Yanagi | 通話システムの携帯用遠隔起動端末 |
JP2012078224A (ja) * | 2010-10-01 | 2012-04-19 | Olympus Corp | 画像生成システム、プログラム及び情報記憶媒体 |
JP2014048775A (ja) * | 2012-08-30 | 2014-03-17 | Kotaro Unno | 注視位置特定装置、および注視位置特定プログラム |
US20150128251A1 (en) * | 2013-11-05 | 2015-05-07 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
JP2015536514A (ja) * | 2012-11-30 | 2015-12-21 | マイクロソフト テクノロジー ライセンシング,エルエルシー | Imuを用いた直接ホログラム操作 |
JP2019082891A (ja) * | 2017-10-31 | 2019-05-30 | セイコーエプソン株式会社 | 頭部装着型表示装置、表示制御方法、およびコンピュータープログラム |
JP2021005157A (ja) * | 2019-06-25 | 2021-01-14 | 株式会社ソニー・インタラクティブエンタテインメント | 画像処理装置および画像処理方法 |
Family Cites Families (8)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07110735A (ja) * | 1993-10-14 | 1995-04-25 | Nippon Telegr & Teleph Corp <Ntt> | 装着型ペン入力装置 |
JP5293154B2 (ja) * | 2008-12-19 | 2013-09-18 | ブラザー工業株式会社 | ヘッドマウントディスプレイ |
JP2014056462A (ja) | 2012-09-13 | 2014-03-27 | Toshiba Alpine Automotive Technology Corp | 操作装置 |
JP6057396B2 (ja) | 2013-03-11 | 2017-01-11 | Necソリューションイノベータ株式会社 | 3次元ユーザインタフェース装置及び3次元操作処理方法 |
JP6206099B2 (ja) | 2013-11-05 | 2017-10-04 | セイコーエプソン株式会社 | 画像表示システム、画像表示システムを制御する方法、および、頭部装着型表示装置 |
US10663740B2 (en) * | 2014-06-09 | 2020-05-26 | Mentor Acquisition One, Llc | Content presentation in head worn computing |
JP6041016B2 (ja) | 2014-07-25 | 2016-12-07 | 裕行 池田 | 眼鏡型端末 |
JP6724931B2 (ja) | 2016-02-12 | 2020-07-15 | 富士通株式会社 | 画面解像度調整システム、表示制御システム、画面解像度調整方法、及び表示制御方法 |
-
2021
- 2021-06-25 JP JP2021106022A patent/JP7080448B1/ja active Active
- 2021-07-30 US US18/280,888 patent/US20240089362A1/en active Pending
- 2021-07-30 EP EP21929440.2A patent/EP4307289A1/en active Pending
- 2021-07-30 WO PCT/JP2021/028304 patent/WO2022190406A1/ja active Application Filing
Patent Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH1198227A (ja) | 1997-09-22 | 1999-04-09 | Gakuichi Yanagi | 通話システムの携帯用遠隔起動端末 |
JP2012078224A (ja) * | 2010-10-01 | 2012-04-19 | Olympus Corp | 画像生成システム、プログラム及び情報記憶媒体 |
JP2014048775A (ja) * | 2012-08-30 | 2014-03-17 | Kotaro Unno | 注視位置特定装置、および注視位置特定プログラム |
JP2015536514A (ja) * | 2012-11-30 | 2015-12-21 | マイクロソフト テクノロジー ライセンシング,エルエルシー | Imuを用いた直接ホログラム操作 |
US20150128251A1 (en) * | 2013-11-05 | 2015-05-07 | Lg Electronics Inc. | Mobile terminal and method for controlling the same |
JP2019082891A (ja) * | 2017-10-31 | 2019-05-30 | セイコーエプソン株式会社 | 頭部装着型表示装置、表示制御方法、およびコンピュータープログラム |
JP2021005157A (ja) * | 2019-06-25 | 2021-01-14 | 株式会社ソニー・インタラクティブエンタテインメント | 画像処理装置および画像処理方法 |
Also Published As
Publication number | Publication date |
---|---|
EP4307289A1 (en) | 2024-01-17 |
JP7080448B1 (ja) | 2022-06-06 |
JP2022136951A (ja) | 2022-09-21 |
US20240089362A1 (en) | 2024-03-14 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10983593B2 (en) | Wearable glasses and method of displaying image via the wearable glasses | |
JP6041016B2 (ja) | 眼鏡型端末 | |
US10635182B2 (en) | Head mounted display device and control method for head mounted display device | |
US10133407B2 (en) | Display apparatus, display system, method for controlling display apparatus, and program | |
JP6264871B2 (ja) | 情報処理装置および情報処理装置の制御方法 | |
US9367128B2 (en) | Glass-type device and control method thereof | |
CN106067833B (zh) | 移动终端及其控制方法 | |
JP2016004340A (ja) | 情報配信システム、頭部装着型表示装置、頭部装着型表示装置の制御方法、および、コンピュータープログラム | |
JP6303274B2 (ja) | 頭部装着型表示装置および頭部装着型表示装置の制御方法 | |
US11429200B2 (en) | Glasses-type terminal | |
KR20180004112A (ko) | 안경형 단말기 및 이의 제어방법 | |
KR20180002208A (ko) | 단말기 및 그 제어 방법 | |
US10884498B2 (en) | Display device and method for controlling display device | |
JP6638392B2 (ja) | 表示装置、表示システム、表示装置の制御方法、及び、プログラム | |
JP6740613B2 (ja) | 表示装置、表示装置の制御方法、及び、プログラム | |
WO2022190406A1 (ja) | 端末装置 | |
JP7031112B1 (ja) | 眼鏡型端末 | |
JP2017182460A (ja) | 頭部装着型表示装置、頭部装着型表示装置の制御方法、コンピュータープログラム | |
KR20170046947A (ko) | 이동 단말기 및 제어 방법 | |
JP2016142966A (ja) | 頭部装着型表示装置、情報処理装置、画像表示装置、画像表示システム、頭部装着型表示装置の表示を共有する方法、コンピュータープログラム | |
KR20160027813A (ko) | 글래스형 단말기 | |
JP2018107823A (ja) | 頭部装着型表示装置および頭部装着型表示装置の制御方法 | |
KR20160097785A (ko) | 글래스 타입 단말기 및 그 제어 방법 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 21929440 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 18280888 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2021929440 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: 2021929440 Country of ref document: EP Effective date: 20231009 |