CN101141567A - Image capturing and displaying apparatus and image capturing and displaying method - Google Patents

Info

Publication number: CN101141567A
Authority: CN (China)
Prior art keywords: image, image acquisition, user, display, information
Legal status: Granted (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Application number: CN 200710153620
Other languages: Chinese (zh)
Other versions: CN101141567B (en)
Inventors: 佐古曜一郎, 鹤田雅明, 伊藤大二, 飞鸟井正道, 海老泽观
Original assignee: Sony Corp; Sony Computer Entertainment Inc
Priority claimed from JP2006261975A (JP2008083289A)
Application filed by Sony Corp and Sony Computer Entertainment Inc
Publication of CN101141567A
Application granted
Publication of CN101141567B
Current status: Expired - Fee Related
Anticipated expiration

Abstract

An image capturing and displaying apparatus is disclosed. The image capturing and displaying apparatus includes an image capturing section, a display section, a user's information obtaining section, and a control section. The image capturing section captures an image such that a direction in which a user sees a subject is a direction of the subject. The display section is disposed in front of eyes of the user and displays the image captured by the image capturing section. The user's information obtaining section obtains information about a motion and a physical situation of the user. The control section controls an operation of the image capturing section or an operation of the display section corresponding to information obtained by the user's information obtaining section.

Description

Image capturing and displaying apparatus and image capturing and displaying method
Cross Reference to Related Applications
The present invention contains subject matter related to Japanese Patent Application JP 2006-244685, filed in the Japanese Patent Office on September 8, 2006, and Japanese Patent Application JP 2006-261975, filed in the Japanese Patent Office on September 27, 2006, the entire contents of which are incorporated herein by reference.
Technical Field
The present invention relates to an image capturing and displaying apparatus and an image capturing and displaying method that are configured to capture an image of a subject with the direction in which the user sees the subject as the direction of the subject and, when the user wears the apparatus, for example as a glasses-type mounting unit or a helmet-type mounting unit, to display the captured image in front of his or her eyes.
Background Art
Many apparatuses that use a glasses-type mounting unit or a helmet-type mounting unit and have a display section that is disposed in front of the user's eyes and displays an image have been proposed, for example in Japanese Patent Application Publications HEI 8-126031, HEI 9-27970 and HEI 9-185009.
Summary of the Invention
However, apparatuses of the related art do not control their image capturing and display operations from the viewpoint of assisting the user's vision and extending his or her visual ability.
In view of the foregoing, it is desirable to assist the user's vision and extend his or her visual ability. It is also desirable to appropriately control the display operation and the image capturing operation so that the user's vision is assisted and his or her visual ability is extended in accordance with the user's situation (for example, his or her wishes, visual state, physical situation and so forth). It is further desirable to appropriately control the display operation and the image capturing operation so that these operations are performed in accordance with external situations (for example, the surrounding environment, the subject, the date and time, the location and so forth).
According to an embodiment of the present invention, there is provided an image capturing and displaying apparatus. The image capturing and displaying apparatus includes an image capturing section, a display section, a user's information obtaining section and a control section. The image capturing section captures an image such that the direction in which the user sees a subject is the direction of the subject. The display section is disposed in front of the eyes of the user and displays the image captured by the image capturing section. The user's information obtaining section obtains information about a motion and a physical situation of the user. The control section controls an operation of the image capturing section or an operation of the display section corresponding to the information obtained by the user's information obtaining section.
According to an embodiment of the present invention, there is provided an image capturing and displaying method for an image capturing and displaying apparatus that includes an image capturing section, which captures an image such that the direction in which the user sees a subject is the direction of the subject, and a display section, which is disposed in front of the eyes of the user and displays the image captured by the image capturing section. Information about a motion or a physical situation of the user is obtained. An operation of the image capturing section or an operation of the display section is controlled corresponding to the obtained information.
According to an embodiment of the present invention, there is provided an image capturing and displaying apparatus. The image capturing and displaying apparatus includes an image capturing section, a display section, an external information obtaining section and a control section. The image capturing section captures an image such that the direction in which the user sees a subject is the direction of the subject. The display section is disposed in front of the eyes of the user and displays the image captured by the image capturing section. The external information obtaining section obtains external information. The control section controls an operation of the image capturing section or an operation of the display section corresponding to the information obtained by the external information obtaining section.
According to an embodiment of the present invention, there is provided an image capturing and displaying method for an image capturing and displaying apparatus that includes an image capturing section, which captures an image such that the direction in which the user sees a subject is the direction of the subject, and a display section, which is disposed in front of the eyes of the user and displays the image captured by the image capturing section. External information is obtained. An operation of the image capturing section or an operation of the display section is controlled corresponding to the obtained information.
According to embodiments of the present invention, when the user wears the image capturing and displaying apparatus, shaped for example as a glasses-type mounting unit or a helmet-type mounting unit, he or she sees the display section disposed in front of his or her eyes. When the display section displays the image captured by the image capturing section, the user can see on the display section an image of the scene captured in his or her normal visual direction.
In this case, although the user sees the scene in his or her normal visual direction through the image capturing and displaying apparatus according to the embodiments of the present invention, he or she sees the image displayed on the display section as the scene of his or her normal field of vision. When the display mode of the image displayed on the display section is changed in accordance with the user's situation, for example his or her wishes, field of vision or physical situation, his or her visual ability can be assisted or extended.
When, for example, a telescopic image is displayed, the user can see a distant scene that he or she usually cannot see. When the user reads a book or a newspaper with weak eyesight, he or she can clearly see the characters if the display section magnifies them.
In other words, when the operation of the image capturing section and the display mode of the captured image on the display section are controlled in accordance with the user's situation, a visual situation in which the user feels comfortable can be provided.
According to embodiments of the present invention, when the user wears the image capturing and displaying apparatus, shaped for example as a glasses-type mounting unit or a helmet-type mounting unit, he or she sees the display section disposed in front of his or her eyes. When the display section displays the image captured by the image capturing section, the user can see on the display section an image of the scene captured in his or her normal visual direction.
In this case, although the user sees the scene in his or her normal visual direction through the image capturing and displaying apparatus according to the embodiments of the present invention, he or she sees the image displayed on the display section as the scene of his or her normal field of vision. When the display mode of the image displayed on the display section is changed in accordance with external situations, for example the surrounding situation, the situation of the subject and so forth, his or her visual ability can be assisted or extended.
When, for example, a telescopic image is displayed, the user can see a distant scene that he or she usually cannot see. When the user reads a book or a newspaper with weak eyesight, he or she can clearly see the characters if the display section magnifies them and adjusts their brightness and contrast.
In other words, when the operation of the image capturing section and the display mode of the captured image on the display section are controlled in accordance with external information, a visual situation in which the user feels comfortable or interested can be provided.
According to embodiments of the present invention, the image captured by the image capturing section, namely the image captured with the user's visual direction as the direction of the subject, is displayed on the display section disposed in front of the user. When the operation of the image capturing section or the operation of the display section is controlled in accordance with information about the user's motion or physical situation, his or her visual ability can be substantially assisted and extended.
Since the display mode is changed by controlling the image capturing section or the display section in accordance with the user's wishes or situation, which are determined from information representing his or her motion or physical situation, no operational burden is imposed on him or her. In addition, since the image capturing section and the display section are appropriately controlled, the image capturing and displaying apparatus is highly user-friendly.
In addition, since the display section can become transparent or semi-transparent, namely enter the through state, rather than display the image captured by the image capturing section, the user can live without any problems while wearing the image capturing and displaying apparatus. Thus, in the user's normal life, the benefits of the embodiments of the present invention can be effectively obtained.
According to embodiments of the present invention, the image captured by the image capturing section, namely the image captured with the user's visual direction as the direction of the subject, is displayed on the display section disposed in front of the user. When the operation of the image capturing section or the operation of the display section is controlled in accordance with external information, the user's visual ability can be substantially assisted and extended.
Since the display mode is changed by controlling the image capturing section or the display section in accordance with the surrounding environment, the type of the subject, its situation and so forth, which are determined from the external information, no operational burden is imposed on the user. In addition, since the image capturing section and the display section are appropriately controlled, the image capturing and displaying apparatus is highly user-friendly.
In addition, since the display section can become transparent or semi-transparent, namely enter the through state, rather than display the image captured by the image capturing section, the user can live without any problems while wearing the image capturing and displaying apparatus. Thus, in the user's normal life, the benefits of the embodiments of the present invention can be effectively obtained.
These and other objects, features and advantages of the present invention will become more apparent in light of the following detailed description of best mode embodiments thereof, as illustrated in the accompanying drawings.
Brief Description of the Drawings
Fig. 1 is a schematic diagram showing an exemplary appearance of an image capturing and displaying apparatus according to an embodiment of the present invention;
Fig. 2 is a block diagram showing an image capturing and displaying apparatus according to a first embodiment of the present invention;
Fig. 3A, Fig. 3B and Fig. 3C are schematic diagrams showing a through state, a normally captured image display state and a telescopic image display state, respectively, according to an embodiment of the present invention;
Fig. 4A and Fig. 4B are schematic diagrams showing a through state and a wide-angle zoom image display state, respectively, according to an embodiment of the present invention;
Fig. 5A and Fig. 5B are schematic diagrams showing a normally captured image display state / through state and a magnified image display state, respectively, according to an embodiment of the present invention;
Fig. 6A and Fig. 6B are schematic diagrams showing a normally captured image display state / through state and an adjusted image display state, respectively, according to an embodiment of the present invention;
Fig. 7A and Fig. 7B are schematic diagrams showing a normally captured image display state / through state and a captured image display state with increased infrared sensitivity, respectively, according to an embodiment of the present invention;
Fig. 8A and Fig. 8B are schematic diagrams showing a normally captured image display state / through state and a captured image display state with increased ultraviolet sensitivity, respectively, according to an embodiment of the present invention;
Fig. 9A, Fig. 9B and Fig. 9C are schematic diagrams showing a through state, a two-area separated image display state and a four-area separated image display state, respectively, according to an embodiment of the present invention;
Fig. 10 is a flow chart showing a control process according to the first embodiment of the present invention;
Fig. 11A and Fig. 11B are flow charts showing monitor display start trigger determination processes according to the first embodiment of the present invention;
Fig. 12A and Fig. 12B are flow charts showing image control trigger determination processes according to the first embodiment of the present invention;
Fig. 13A and Fig. 13B are flow charts showing image control trigger determination processes according to the first embodiment of the present invention;
Fig. 14 is a flow chart showing an image control trigger determination process according to the first embodiment of the present invention;
Fig. 15 is a flow chart showing an image control trigger determination process according to the first embodiment of the present invention;
Fig. 16 is a flow chart showing an image control trigger determination process according to the first embodiment of the present invention;
Fig. 17A and Fig. 17B are flow charts showing image control trigger determination processes according to the first embodiment of the present invention;
Fig. 18A and Fig. 18B are flow charts showing monitor display end trigger determination processes according to the first embodiment of the present invention;
Fig. 19 is a flow chart showing a monitor display end trigger determination process according to the first embodiment of the present invention;
Fig. 20 is a block diagram showing an image capturing and displaying apparatus according to a second embodiment of the present invention;
Fig. 21A and Fig. 21B are schematic diagrams showing a non-adjusted image display state and an adjusted image display state, respectively, according to the second embodiment of the present invention;
Fig. 22A and Fig. 22B are schematic diagrams showing a highlighted image display state according to the second embodiment of the present invention;
Fig. 23 is a flow chart showing an image control trigger determination process according to the second embodiment of the present invention;
Fig. 24A and Fig. 24B are flow charts showing image control trigger determination processes according to the second embodiment of the present invention;
Fig. 25 is a flow chart showing an image control trigger determination process according to the second embodiment of the present invention;
Fig. 26 is a flow chart showing an image control trigger determination process according to the second embodiment of the present invention;
Fig. 27 is a flow chart showing an image control trigger determination process according to the second embodiment of the present invention;
Fig. 28A and Fig. 28B are flow charts showing image control trigger determination processes according to the second embodiment of the present invention;
Fig. 29A and Fig. 29B are flow charts showing image control trigger determination processes according to the second embodiment of the present invention; and
Fig. 30A and Fig. 30B are flow charts showing image control trigger determination processes according to the second embodiment of the present invention.
Embodiments
Next, an image capturing and displaying apparatus and an image capturing and displaying method according to a first embodiment of the present invention will be described in the following order.
[1. Exemplary appearance of the image capturing and displaying apparatus]
[2. Exemplary structure of the image capturing and displaying apparatus]
[3. Exemplary display images]
[4. Determination of the user's situation]
[5. Exemplary operations]
[6. Effects, modifications and extensions of the first embodiment]
[1. Exemplary appearance of the image capturing and displaying apparatus]
Fig. 1 shows an exemplary appearance of an image capturing and displaying apparatus 1, shaped as a glasses-type display camera, according to the first embodiment of the present invention. The image capturing and displaying apparatus 1 has a semi-circular mounting unit that extends around the user's head from both temporal regions to the occipital region. As shown in Fig. 1, the user wears the image capturing and displaying apparatus 1 by hanging predetermined portions of it on both of his or her auricles.
In the worn state shown in Fig. 1, a pair of display sections 2 for the left and right eyes are disposed immediately in front of the user's eyes, namely at the positions of the lenses of ordinary glasses. The display sections 2 are composed of, for example, liquid crystal panels. By controlling their transmissivity, the display sections 2 can be put into the through state shown in Fig. 1, namely a transparent or semi-transparent state. When the display sections 2 are in the through state, the apparatus does not affect the user's normal life even if he or she wears it continuously like glasses.
In the state in which the user wears the image capturing and displaying apparatus 1, an image capturing lens 3a is disposed facing forward so that it captures an image of a subject with the user's visual direction as the direction of the subject.
In addition, a light emitting section 4a that illuminates the image capturing direction of the image capturing lens 3a is disposed. The light emitting section 4a is composed of, for example, an LED (Light Emitting Diode).
In addition, a pair of earphone speakers 5a, which are inserted into the user's left and right ear holes in the worn state of the image capturing and displaying apparatus 1, are disposed (only the left earphone speaker 5a is shown in Fig. 1).
In addition, microphones 6a and 6b that collect external sound are disposed on the right of the display section 2 for the right eye and on the left of the display section 2 for the left eye.
Fig. 1 is merely exemplary, and many structures in which the user wears the image capturing and displaying apparatus 1 are conceivable. As long as the image capturing and displaying apparatus 1 is a glasses-type mounting unit or a helmet-type mounting unit, and as long as, as in this embodiment, at least the display sections 2 are disposed immediately in front of the user's eyes and the image capturing direction of the image capturing lens 3a is the user's visual direction, namely the front of the user, the structure of the image capturing and displaying apparatus 1 is not limited to that shown in Fig. 1. In addition, although the structure shown has two display sections 2 corresponding to the user's two eyes, one display section 2 may be disposed corresponding to one of the user's eyes.
Likewise, the earphone speakers 5a need not be left and right stereo speakers; instead, one earphone speaker may be disposed corresponding to one of the user's ears. Likewise, only one of the microphones 6a and 6b may be disposed. Moreover, the image capturing and displaying apparatus 1 may have no microphone and no earphone speaker.
In addition, the image capturing and displaying apparatus 1 may be structured without the light emitting section 4a.
[2. Exemplary structure of the image capturing and displaying apparatus]
Fig. 2 shows an exemplary internal structure of the image capturing and displaying apparatus 1.
A system controller 10 is composed of a microcomputer that includes, for example, a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a nonvolatile memory section and an interface section. The system controller 10 is a control section that controls all the sections of the image capturing and displaying apparatus 1.
The system controller 10 controls each section of the image capturing and displaying apparatus 1 in accordance with the user's situation. In other words, the system controller 10 operates according to an operation program that detects and determines the user's situation and controls each section in accordance with the determined result. Thus, as shown in Fig. 2, the system controller 10 functionally has a user's situation determination function 10a that determines the user's situation and an operation control function 10b that controls and commands each section in accordance with the determined result.
In the image capturing and displaying apparatus 1, an image capturing section 3, an image capturing control section 11 and a captured image signal processing section 15 are disposed as a structure that captures an image ahead of the user.
The image capturing section 3 includes a lens system that has the image capturing lens 3a (shown in Fig. 1), an aperture, a zoom lens, a focus lens and so forth; a driving system that causes the lens system to perform focusing operations and zoom operations; and a solid-state image sensor array that detects the light of the captured image obtained by the lens system, converts the light into electricity and generates a captured image signal corresponding to the electricity. The solid-state image sensor array is composed of, for example, a CCD (Charge Coupled Device) sensor array or a CMOS (Complementary Metal Oxide Semiconductor) sensor array.
The captured image signal processing section 15 includes a sample-hold/AGC (Automatic Gain Control) circuit, which adjusts the gain of the signal obtained by the solid-state image sensor array of the image capturing section 3 and trims its waveform, and a video A/D converter, so that it obtains a captured image signal as digital data. The captured image signal processing section 15 also performs white balance processing, brightness processing, color signal processing, vibration correction processing and so forth on the captured image signal.
The image capturing control section 11 controls the operations of the image capturing section 3 and the captured image signal processing section 15 in accordance with commands received from the system controller 10. For example, the image capturing control section 11 turns on and off the operations of the image capturing section 3 and the captured image signal processing section 15. In addition, the image capturing control section 11 controls (through motors) the image capturing section 3 to perform auto focus operations, automatic exposure adjustment operations, aperture adjustment operations, zoom operations and so forth.
In addition, the image capturing control section 11 includes a timing generator. With a timing signal generated by the timing generator, the image capturing control section 11 controls the solid-state image sensor array and the sample-hold/AGC circuit and video A/D converter of the captured image signal processing section 15. In addition, the image capturing control section 11 can change the frame rate of the captured image with the timing signal.
In addition, the image capturing control section 11 controls the image capturing sensitivity and the signal processing of the solid-state image sensor array and the captured image signal processing section 15. To control the image capturing sensitivity, the image capturing control section 11 controls, for example, the gain of the signal read from the solid-state image sensor array, the black level setting, various coefficients of the digital processing of the captured image signal, the correction amount of the vibration correction processing and so forth. With respect to image capturing sensitivity adjustments, the image capturing control section 11 can perform an overall sensitivity adjustment that is independent of the wavelength band, and sensitivity adjustments for specific wavelength bands such as the infrared region and the ultraviolet region. A wavelength-specific sensitivity adjustment can be performed by inserting a wavelength filter into the image capturing lens system or by performing a wavelength filtering calculation on the captured image signal. In these cases, the image capturing control section 11 can control the sensitivity by inserting a wavelength filter and/or by specifying filter calculation coefficients.
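As a rough illustration of the two kinds of sensitivity control described above (an overall adjustment of gain and black level, and a band-specific adjustment through a wavelength filter and filter calculation coefficients), a minimal Python sketch follows. It is hypothetical: the CaptureSensitivityController class and every method on the sensor object are illustrative names, not interfaces defined by the patent.

```python
# Hypothetical sketch of the sensitivity control described above.
# All class and method names are illustrative, not from the patent.

class CaptureSensitivityController:
    """Overall and band-specific (infrared/ultraviolet) sensitivity control."""

    def __init__(self, sensor):
        self.sensor = sensor  # stands in for the solid-state image sensor array

    def set_overall_sensitivity(self, gain_db, black_level=0):
        # Overall adjustment, independent of the wavelength band.
        self.sensor.set_gain(gain_db)
        self.sensor.set_black_level(black_level)

    def set_band_sensitivity(self, band, filter_coefficients):
        # Band-specific adjustment: insert a wavelength filter into the lens
        # system and/or apply filter calculation coefficients to the signal.
        if band in ("infrared", "ultraviolet"):
            self.sensor.insert_wavelength_filter(band)
        self.sensor.set_filter_coefficients(filter_coefficients)
```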
As a structure that presents display data to the user, the image capturing and displaying apparatus 1 includes the display sections 2, a display image processing section 12, a display driving section 13 and a display control section 14.
A signal of an image that has been captured by the image capturing section 3 and processed by the captured image signal processing section 15 is supplied to the display image processing section 12. The display image processing section 12 is, for example, a so-called video processor, and performs various kinds of display processing on the supplied captured image signal. For example, the display image processing section 12 can perform brightness level adjustment, color correction, contrast adjustment, sharpness (edge enhancement) adjustment and so forth on the captured image signal. In addition, the display image processing section 12 can generate a magnified image, in which a part of the captured image signal is magnified, or a reduced image; separate images for separated display; combine images; generate character images and graphic images; and superimpose a generated image on the captured image. In other words, the display image processing section 12 can perform various kinds of processing on the digital image signal as the captured image signal.
The display driving section 13 is composed of a pixel driving circuit that displays the image signal supplied from the display image processing section 12 on the display sections 2, which are, for example, liquid crystal displays. In other words, the display driving section 13 applies driving signals based on the image signal to the pixels formed in a matrix shape in the display sections 2 at predetermined horizontal/vertical drive timings, causing the display sections 2 to display the image. In addition, the display driving section 13 controls the transmissivity of each pixel to cause the display sections 2 to enter the through state.
The display control section 14 controls the processing and operation of the display image processing section 12 and the operation of the display driving section 13 in accordance with commands received from the system controller 10. In other words, the display control section 14 causes the display image processing section 12 to perform the various kinds of processing described above. In addition, the display control section 14 controls the display driving section 13 to cause the display sections 2 to switch between the through state and the image display state.
In the following description, the state in which the display sections 2 become transparent or semi-transparent is referred to as the "through state", and the operation (and state) in which the display sections 2 display an image is referred to as the "monitor display state".
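The two display states defined here suggest a simple model of the switching performed by the display control section 14 and the display driving section 13. The following Python sketch assumes a driver object with the listed methods, which the patent does not define:

```python
from enum import Enum, auto

class DisplayState(Enum):
    THROUGH = auto()          # display sections transparent or semi-transparent
    MONITOR_DISPLAY = auto()  # display sections show the captured image

class DisplayStateSwitch:
    """Illustrative switch between the through state and the monitor display state."""

    def __init__(self, driver):
        self.driver = driver  # stands in for the display driving section 13

    def set_state(self, state):
        if state is DisplayState.THROUGH:
            # Raise the transmissivity of every pixel; the user sees the scene directly.
            self.driver.set_transmissivity(1.0)
        else:
            # Drive the pixels with the captured image signal instead.
            self.driver.set_transmissivity(0.0)
            self.driver.show_captured_image()
```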
In addition, the image capturing and displaying apparatus 1 includes a sound input section 6, a sound signal processing section 16 and a sound output section 5.
The sound input section 6 includes the microphones 6a and 6b shown in Fig. 1 and a microphone amplifier section that processes the sound signals obtained by the microphones 6a and 6b.
The sound signal processing section 16 is composed of, for example, an A/D converter, a digital signal processor, a D/A converter and so forth. The sound signal processing section 16 converts the sound signal supplied from the sound input section 6 into digital data and performs processing such as volume adjustment, sound quality adjustment and acoustic effects under the control of the system controller 10. The sound signal processing section 16 then converts the resultant sound signal into an analog signal and supplies it to the sound output section 5. The sound signal processing section 16 is not limited to a structure that performs digital signal processing; instead, it may perform signal processing with an analog amplifier and an analog filter.
The sound output section 5 includes the pair of earphone speakers 5a shown in Fig. 1 and amplifier circuits for the earphone speakers 5a.
The sound input section 6, the sound signal processing section 16 and the sound output section 5 allow the user to hear external sound through the image capturing and displaying apparatus 1.
The sound output section 5 may be structured as so-called bone conduction speakers.
In addition, the image capturing and displaying apparatus 1 includes a lighting section 4 and a lighting control section 18. The lighting section 4 is composed of the light emitting section 4a shown in Fig. 1 (for example, a light emitting diode) and a lighting circuit that causes the light emitting section 4a to emit light. The lighting control section 18 causes the lighting section 4 to perform light emitting operations in accordance with commands supplied from the system controller 10.
Since the light emitting section 4a of the lighting section 4 is disposed so that it illuminates the forward direction, the lighting section 4 performs lighting operations in the user's visual direction.
As a structure that obtains user's information, the image capturing and displaying apparatus 1 includes a visual sensor 19, an acceleration sensor 20, a gyro 21, a biological sensor 22 and an input section 17.
The visual sensor 19 detects information about the user's vision, such as the visual line direction, the focal distance, the dilation of the pupils, the fundus pattern, the opening/closing of the eyelids and so forth.
The acceleration sensor 20 and the gyro 21 output signals corresponding to the user's motion. The acceleration sensor 20 and the gyro 21 are sensors that detect motions of the user's head, neck, whole body, arms, legs and so forth.
The biological sensor 22 detects the user's biological information, for example the user's heart rate information, pulse information, perspiration information, brain wave information, galvanic skin response (GSR), body temperature, blood pressure, respiratory activity information and so forth. The detection signals of the biological sensor 22 become information with which a tense state, an excited state, a calm state, a drowsy state, a comfortable state, an uncomfortable state and so forth can be determined.
The input section 17 is a section with which the user manually inputs information. The input section 17 is formed with switches with which the user can input information about his or her vision.
With the visual sensor 19, the acceleration sensor 20, the gyro 21, the biological sensor 22 and the input section 17, information about the motion or physical situation of the user who wears the image capturing and displaying apparatus 1 is obtained as user's information and supplied to the system controller 10.
In the processing of the user's situation determination function 10a, the system controller 10 determines the user's wishes or situation corresponding to the obtained user's information. In the processing of the operation control function 10b, the system controller 10 controls the image capturing operation and the display operation in accordance with the determined user's wishes or situation. In other words, the system controller 10 commands the image capturing control section 11 to control the operations of the image capturing section 3 and the captured image signal processing section 15, and commands the display control section 14 to control the operations of the display image processing section 12 and the display driving section 13.
The visual sensor 19, the acceleration sensor 20, the gyro 21, the biological sensor 22 and the input section 17 have been exemplified as the structure that obtains user's information in the image capturing and displaying apparatus 1; however, the image capturing and displaying apparatus 1 need not include all of them. In addition, the image capturing and displaying apparatus 1 may include other sensors, such as a sensor that detects the user's voice and a sensor that detects the motion of the lips.
[3. Exemplary display images]
The system controller 10 controls the image capturing operation and the display operation in accordance with the user's wishes or situation, so that the user perceives various display modes on the display sections 2. Fig. 3A to Fig. 3C through Fig. 9A to Fig. 9C exemplify various display modes.
Fig. 3A shows the state in which the display sections 2 are in the through state. In other words, in this state, the display sections 2 are simple transparent plate members, and the user can see the scene within his or her field of vision through the transparent display sections 2.
Fig. 3B shows the state in which the image captured by the image capturing section 3 is displayed on the display sections 2 operating in the monitor display state. The image capturing section 3, the captured image signal processing section 15, the display image processing section 12 and the display driving section 13 operate in the state shown in Fig. 3A so that they normally display the captured image on the display sections 2. In this case, the captured image (normally captured image) displayed on the display sections 2 is almost the same as the scene that would appear through the display sections 2 operating in the through state. In other words, in this state, the user sees his or her normal field of vision as a captured image.
Fig. 3C shows the state in which the system controller 10 causes the image capturing section 3, through the image capturing control section 11, to capture a telescopic image, and the telescopic image is displayed on the display sections 2.
In contrast, when the system controller 10 causes the image capturing section 3, through the image capturing control section 11, to capture a wide-angle image, a wide-angle image of a nearby scene is displayed on the display sections 2 (not shown). Although the image capturing section 3 performs the telescopic and wide-angle control by driving its zoom lens, the captured image signal processing section 15 may perform these kinds of control by processing the signal.
Fig. 4 A shows wherein, and display part 2 is in the state that is reading newspaper by state, for example user.
Fig. 4 B shows so-called wide-angle zoom state.In other words, Fig. 4 B shows such state, under this state, obtains nearly focal length zoom image and it is presented on the display part 2, so that for example amplify character in the newspaper.
Fig. 5 A shows such state, and wherein display part 2 shows that the image or the display part 2 that normally obtain are in by under the state.
At this moment, when display control section 14 carries out image processing and amplifying are passed through in system controller 10 order display image processing sections 12, on display part 2, show the enlarged image shown in Fig. 5 B.
Fig. 6 A shows such state, and wherein display part 2 shows that the image or the display part 2 that normally obtain are in by under the state.Especially, Fig. 6 A shows the state that user is wherein reading newspaper or books.In this case, suppose because around be dim, so the user can not or pass through to see under the state character in the newspaper etc. in display part 2 with the image that normally obtains.
In this case, system controller 10 order image acquisition control sections 11 (image acquisition section 3 and obtain image signal processing section 15) increase image acquisition sensitivity, and/or make display control section 14 (display image processing section 12 and display driving part 13) increase brightness and adjustment contrast and acutance, so that on display part 2, show than more sharpening, shown in Fig. 6 B the image of the image shown in Fig. 6 A.On the contrary, when system controller 10 makes illumination section 4 carry out the illumination operation, can be on display part 2 display image sharply.
Fig. 7 A shows such state, and wherein display part 2 shows that the image or the display part 2 that normally obtain are in by under the state.In this case, the user is in the dark bedroom that wherein child is sleeping.Because the user is in the dark room, thus he or she can not utilize the image that normally obtains or display part 2 pass through clearly see child under the state.
At this moment, when system controller 10 order image acquisition control sections 11 (image acquisition section 3 and obtain image signal processing section 15) increase infrared image and obtain sensitivity, on display part 2, show the infrared image that obtains shown in Fig. 7 B, so that make the user can see children's sleeping face etc.
Fig. 8 A shows such state, and wherein display part 2 shows that the image or the display part 2 that normally obtain are in by under the state.
When system controller 10 order image acquisition control sections 11 (image acquisition section 3 and obtain image signal processing section 15) increase ultraviolet image and obtain sensitivity, on display part 2, show shown in Fig. 8 B, have ultraviolet component obtain image.
Fig. 9 A shows wherein, and display part 2 is in by the state under the state.
When system controller 10 order display control sections 14 (display image processing section 12 and display driving part 13) display images or display image and partly during enlarged image respectively respectively, can be on display part 2 image shown in the displayed map 9B.In other words, the screen of display part 2 is separated into regional AR1 and AR2, and wherein regional AR1 is in by state or is in the normal picture show state, and regional AR2 is in the enlarged image show state.
Fig. 9 C shows another exemplary separation and shows.In this case, the screen of display part 2 is separated into regional AR1, AR2, AR3 and AR4, and these zones show the frame with the image of predetermined amount of time interval acquiring.System controller 10 makes display image processing section 12 extract a frame with 0.5 second interval from the picture signal of being obtained, and shows the frame that is extracted with the order of regional AR1, AR2, AR3, AR4, AR1, AR2 etc.In this case, the image that on display part 2, shows so-called flicker display mode (strobedisplay mode) discretely.
Various kinds of display images have been exemplified above. In this embodiment, various display modes can be accomplished by controlling the processing and operations of the image capturing section 3, the captured image signal processing section 15, the display image processing section 12 and the display driving section 13.
For example, many types of display modes are expected, such as a telescopic display mode; a wide-angle display mode; reduction and magnification display modes that change the range from telescopic to wide-angle; an image magnification display mode; an image reduction display mode; a variable frame rate display mode (for example, capturing images at a high frame rate); a high brightness display mode; a low brightness display mode; a variable contrast display mode; a variable sharpness display mode; a captured image display mode with increased sensitivity; a captured image display mode with increased infrared sensitivity; a captured image display mode with increased ultraviolet sensitivity; image effect display modes (such as a mosaic image, a brightness-reversed image, a soft-focus image, partial highlighting of the screen, an image with a variable color atmosphere and so forth); a slow display mode; a frame-by-frame display mode; separated display modes that combine these display modes; a separated display mode that combines the through state and a captured image; a strobe display mode; a still image display mode that holds one frame of the captured image; and so forth.
[4. Determination of the user's situation]
As described above, the image capturing and displaying apparatus 1 according to this embodiment includes the visual sensor 19, the acceleration sensor 20, the gyro 21, the biological sensor 22 and the input section 17 as the structure that obtains user's information.
The visual sensor 19 detects information about the user's vision. The visual sensor 19 can be composed of, for example, an image capturing section that is disposed near one of the display sections 2 and captures an image of the user's eyes. The system controller 10 receives the image of the user's eyes captured by this image capturing section, and the user's situation determination function 10a analyzes the image and detects the visual line direction, the focal distance, the dilation of the pupils, the fundus pattern, the opening/closing of the eyelids and so forth corresponding to the analysis result. Thus the user's situation determination function 10a can determine the user's situation and wishes corresponding to the detected result.
Alternatively, the visual sensor 19 can be composed of a light emitting section that is disposed near one of the display sections 2 and emits light toward the user's eyes, and a light receiving section that receives the light reflected from the eyes. By detecting, for example, the thickness of the crystalline lenses of the user's eyes with a signal corresponding to the received light, the focal distance of the user's eyes can be detected.
By detecting the visual line direction of the user's eyes, the system controller 10 can determine the part of the image displayed on the display sections 2 on which the user is focusing.
In addition, the system controller 10 can recognize the visual line direction of the user's eyes as an operation input. For example, when the user moves his or her visual line to the left and to the right, the system controller 10 can recognize these movements as predetermined operation inputs to the image capturing and displaying apparatus 1.
By detecting the focal distance of the user's eyes, the system controller 10 can determine whether the scene on which the user is focusing is far away or nearby, and can perform zoom control, magnification control, reduction control and so forth corresponding to the determined result. For example, when the user looks at a distant scene, the system controller 10 can perform the telescopic display operation.
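As an illustration, the mapping from the detected focal distance to telescopic or wide-angle control could look like the sketch below. The 5-meter boundary is an assumed value for the sketch; the patent does not specify one.

```python
FAR_THRESHOLD_M = 5.0  # assumed boundary between "nearby" and "far away"

def control_zoom_from_focus(focal_distance_m, capture_control):
    if focal_distance_m > FAR_THRESHOLD_M:
        # The user is focusing on a distant scene: telescopic display.
        capture_control.zoom_telescopic()
    else:
        # The user is focusing on a nearby object: wide-angle or magnified display.
        capture_control.zoom_wide()
```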
When the dilation of the user's pupils is detected in the through state, the brightness of the surroundings can be determined; when it is detected in the monitor display state, it can be determined that the user feels, for example, that the displayed image is glaring. The brightness, the image capturing sensitivity and so forth can be adjusted corresponding to the determined result.
When the fundus pattern of the user's eyes is detected, the user can be authenticated corresponding to the detected result. Since the fundus pattern is unique to each user, the user who wears the image capturing and displaying apparatus 1 can be identified, and the apparatus can be controlled corresponding to the identified user. Alternatively, the system controller 10 may perform the monitor display control only for a predetermined user.
When the opening/closing of the user's eyelids is detected, glare felt by the user and fatigue of his or her eyes can be determined. In addition, the opening/closing of the eyelids can be recognized as a deliberate operation input of the user; for example, when the user has blinked three times, these actions can be determined as a predetermined operation input.
The acceleration sensor 20 and the gyro 21 output signals corresponding to the user's motion. For example, the acceleration sensor 20 detects motion in linear directions, and the gyro 21 is suited to detecting the motion and vibration of a rotational system.
Depending on where the acceleration sensor 20 and the gyro 21 are disposed in the image capturing and displaying apparatus 1, they can detect the motion of the user's whole body or the motion of each part of his or her body.
When the acceleration sensor 20 and the gyro 21 are disposed inside the glasses-type image capturing and displaying apparatus 1 shown in Fig. 1, that is, when they detect the motion of the user's head, the information of the acceleration sensor 20 becomes acceleration information about the motion of the user's head or whole body, and the information of the gyro 21 becomes angular velocity and vibration information about the motion of the user's head or whole body.
Thus, actions in which the user moves his or her head from the neck can be detected. For example, the state in which the user looks up and the state in which he or she looks down can be determined. When the user looks down, it can be determined that he or she is looking at a nearby object, for example reading a book. In contrast, when the user looks up, it can be determined that he or she is looking at a distant object.
When the system controller 10 has detected an action in which the user moves his or her head from the neck, it can recognize the action as a deliberate action of the user. For example, if the user shakes his or her neck twice to the left, the system controller 10 can determine that action as a predetermined operation input.
With the acceleration sensor 20 and the gyro 21, it can also be determined whether the user is in a stopped state (non-walking state), a walking state or a running state. In addition, a change from a standing state to a sitting state, or from a sitting state to a standing state, can be detected.
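A rough sketch of such a movement-state classification from acceleration data follows; the energy thresholds are assumptions for illustration, not values from the patent.

```python
def classify_motion(acceleration_samples):
    """Classify stopped / walking / running from acceleration deviations."""
    if not acceleration_samples:
        return "stopped"
    # Mean squared deviation of the acceleration signal as an activity measure.
    energy = sum(a * a for a in acceleration_samples) / len(acceleration_samples)
    if energy < 0.05:
        return "stopped"   # non-walking state
    if energy < 1.0:
        return "walking"
    return "running"
```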
When the acceleration sensor 20 and the gyro 21 are separated from the mounting unit worn on the head and disposed on an arm or a leg of the user, they can detect the motion of only that arm or leg.
The biological sensor 22 detects, for example, heart rate information (heart rate), pulse information (pulse rate), perspiration information, brain wave information (for example, information about alpha waves, beta waves, theta waves and delta waves), galvanic skin response, body temperature, blood pressure, respiratory activity (for example, the speed and depth of breathing and the lung capacity) and so forth as the user's biological information. With these pieces of information, the system controller 10 can determine whether the user is in a tense state, an excited state, an emotionally calm state, a comfortable state or an uncomfortable state.
In addition, whether the user wears the image capturing and displaying apparatus 1 can be determined from the detected biological information. For example, while the user does not wear the image capturing and displaying apparatus 1, the system controller 10 may control the apparatus so that it operates in a standby state in which only biological information is detected. When the system controller 10 detects from the detected biological information that the user has put on the image capturing and displaying apparatus 1, the system controller 10 can turn on the power of the apparatus. In contrast, when the user has taken off the image capturing and displaying apparatus 1, the system controller 10 can restore the apparatus to the standby state.
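This wear-detection power control amounts to a two-state machine. The sketch below assumes a boolean "living body detected" signal derived from the biological sensor; the class and method names are illustrative.

```python
class WearPowerManager:
    """Turn the apparatus on when worn; return it to standby when taken off."""

    def __init__(self, device):
        self.device = device
        self.worn = False

    def update(self, biological_signal_present):
        if biological_signal_present and not self.worn:
            self.worn = True
            self.device.power_on()       # the user has put the apparatus on
        elif not biological_signal_present and self.worn:
            self.worn = False
            self.device.enter_standby()  # the user has taken the apparatus off
```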
In addition, the information detected by the biological sensor 22 can be used to authenticate the user (to identify the user who wears the apparatus).
The biological sensor 22 may be disposed inside the mounting frame of the glasses-type image capturing and displaying apparatus 1 so that it detects the foregoing information at a temporal region or an occipital region of the user. Alternatively, the biological sensor 22 may be separated from the mounting frame of the image capturing and displaying apparatus 1 and disposed at a predetermined position of the user's body.
The input section 17 is a section with which the user can input information about his or her vision. When the user inputs information about his or her vision, for example his or her eyesight and information about nearsightedness, farsightedness, astigmatism, presbyopia and so forth, the system controller 10 can control the display of images in accordance with the user's vision.
[5. Exemplary operations]
In the image capturing and displaying apparatus 1 according to this embodiment of the present invention, the system controller 10 controls the image capturing operation and the display operation in accordance with the user's information detected by the visual sensor 19, the acceleration sensor 20, the gyro 21, the biological sensor 22 and the input section 17. Thus the display sections 2 perform display operations corresponding to the user's wishes and situation, so that the user's vision is assisted and extended.
Next, various exemplary operations performed under the control of the system controller 10 will be described.
Fig. 10 shows a control process of the operation control function 10b of the system controller 10.
In step F101, the system controller 10 controls the display control section 14 to cause the display sections 2 to enter the through state. When the image capturing and displaying apparatus 1 is initially turned on, the flow advances to step F101, in which the system controller 10 controls the display sections 2 to enter the through state.
While the display sections 2 are in the through state, the flow advances to step F102, in which the system controller 10 determines whether a monitor display start trigger has occurred. The monitor display start trigger occurs when the system controller 10 determines that the monitor display state has to be started in accordance with the user's wishes or situation determined by the user's situation determination function 10a. The system controller 10 determines whether the monitor display start trigger has occurred in accordance with the user's operation, the user's deliberate motion (a motion recognized as an operation), or the user's non-deliberate motion or situation (including recognition of the user). Concrete examples will be described later.
When the determined result denotes that the monitor display start trigger has occurred, the flow advances to step F103, in which the system controller 10 performs monitor display start control. In other words, the system controller 10 commands the image capturing control section 11 to cause the image capturing section 3 and the captured image signal processing section 15 to perform the normal image capturing operation, and commands the display control section 14 to cause the display image processing section 12 and the display driving section 13 to display the captured image signal on the display sections 2 as a normally captured image.
In this process, the through state shown in Fig. 3A is switched to the monitor display state for the normally captured image shown in Fig. 3B.
While the display sections 2 display the normally captured image (which is the same as the scene the user would see in the through state), the flow advances to step F104, in which the system controller 10 monitors whether an image control trigger has occurred, and to step F105, in which the system controller 10 monitors whether a monitor display end trigger has occurred.
The image control trigger occurs when the system controller 10 determines that the display image mode in the monitor display state has to be changed in accordance with the user's wishes or situation determined by the user's situation determination function 10a. The monitor display end trigger occurs when the system controller 10 determines that the monitor display state has to be ended and switched to the through state in accordance with the user's wishes or situation determined by the user's situation determination function 10a. The system controller 10 determines whether these triggers have occurred in accordance with the user's operation, the user's deliberate motion (a motion recognized as an operation), or the user's non-deliberate motion or situation (the user's physical situation, recognition of the user and so forth). Concrete examples will be described later.
When the determined result denotes that the image control trigger has occurred, the flow advances from step F104 to step F106, in which the system controller 10 controls the display operation for the captured image. In other words, the system controller 10 commands the image capturing control section 11 and the display control section 14 to cause the display sections 2 to display the image in the display mode corresponding to the user's wishes or situation at that point.
After the system controller 10 has controlled the display mode in step F106, the flow returns to step F104 or F105, in which the system controller 10 monitors whether a trigger has occurred.
When the determined result denotes that the monitor display end trigger has occurred, the flow returns from step F105 to step F101, in which the system controller 10 commands the image capturing control section 11 to end the image capturing operation and commands the display control section 14 to cause the display sections 2 to enter the through state.
While the user wears the image capturing and displaying apparatus 1 and its power is on, the operation control function 10b of the system controller 10 performs the control process shown in Fig. 10.
In this process, monitor display start control is performed in accordance with the determined result of whether the monitor display start trigger has occurred; the display mode is controlled in accordance with the determined result of whether the image control trigger has occurred; and monitor display end control and through-state control are performed in accordance with the determined result of whether the monitor display end trigger has occurred. Concrete examples of the trigger determinations and controls will be described later with reference to Fig. 11A and Fig. 11B through Fig. 19.
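The loop of Fig. 10 can be summarized in Python-like pseudocode. The trigger-checking methods below stand in for the determination processes of the user's situation determination function 10a; none of these names appear in the patent.

```python
def control_loop(controller):
    controller.set_through_state()                        # F101
    while True:
        if controller.monitor_display_start_trigger():    # F102
            controller.start_monitor_display()            # F103
            while True:
                if controller.image_control_trigger():    # F104
                    controller.change_display_mode()      # F106
                if controller.monitor_display_end_trigger():  # F105
                    controller.set_through_state()        # back to F101
                    break
```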
The user situation that Figure 11 A and Figure 11 B show system controller 10 to Figure 19 A and Figure 19 B is determined the exemplary processes of function 10a.Suppose that these are handled with the processing of operation controlled function 10b shown in Figure 10 and carry out concurrently.Carry out these parallel processings, so that for example when system controller 10 is being carried out processing shown in Figure 10, execution graph 11A and Figure 11 B handle to the detection shown in Figure 19 A and Figure 19 B termly as Interrupt Process.Figure 11 A and Figure 11 B can be built in the program of carrying out processing shown in Figure 10 to the program of the processing shown in Figure 19 A and Figure 19 B.As an alternative, these programs can be other programs of regularly calling.In other words, the structure of these programs is not limited to specific structure.
With reference to Fig. 11A and Fig. 11B, exemplary processes of determining whether a monitor display state start trigger, which causes the through state to be switched to the monitor display state, has occurred will be described.
Fig. 11A shows an exemplary process of detecting a user's motion as a monitor display start operation.
In step F200 shown in Fig. 11A, the system controller 10 monitors the information (an acceleration signal or an angular velocity signal) detected by the acceleration sensor 20 or the gyroscope 21.
It is assumed that predetermined motions, such as swinging the neck up and down twice, shaking it sideways once, or rotating it, are defined as operations that represent the user's command for causing the image capturing and displaying apparatus 1 to operate in the monitor display state. When the system controller 10 has determined, corresponding to the information detected by the acceleration sensor 20 and/or the gyroscope 21, that the user has performed a motion that represents his or her command for starting the monitor display state, the flow advances from step F201 to step F202. In step F202, the system controller 10 determines that a monitor display state start trigger for the captured image signal has occurred.
When the determined result in step F202 denotes that a monitor display state start trigger has occurred, the flow shown in Fig. 10 advances from step F102 to step F103. In step F103, the system controller 10 controls the display section 2 to start the display operation for the captured image.
Other examples of the user's predetermined motions that are detected corresponding to the information of the acceleration sensor 20 and/or the gyroscope 21 and that command the image capturing and displaying apparatus 1 to operate in the monitor display state include jumping, waving a hand, swinging an arm, swinging a leg, and so forth.
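As an illustration only, the gesture check of steps F200 to F202 might be decoded as in the sketch below; the sampling interface, axis convention, and threshold values are assumptions, not part of the patent.

```python
import time

def detect_double_nod(read_pitch_rate, threshold=1.5, window=1.0):
    """Hypothetical decoding of the 'swing the neck up and down twice'
    gesture (steps F200-F201): count distinct pitch angular-velocity
    peaks within `window` seconds."""
    nods, above = 0, False
    deadline = time.monotonic() + window
    while time.monotonic() < deadline:
        rate = read_pitch_rate()      # gyroscope 21 sample (rad/s, assumed)
        if rate > threshold and not above:
            nods += 1                 # rising edge = one distinct nod
        above = rate > threshold
    return nods >= 2                  # True -> start trigger occurred (F202)
```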
Fig. 11B shows an exemplary process of determining whether a monitor display state start trigger has occurred corresponding to the information of the visual sensor 19.
In step F210, the system controller 10 analyzes the information detected by the visual sensor 19. When an image capturing section that captures an image of the user's eyes is provided as the visual sensor 19, the system controller 10 analyzes the image captured by this image capturing section.
Assuming that a specific action of the user blinking three times in succession is defined as an operation by which he or she commands the image capturing and displaying apparatus 1 to operate in the monitor display state, the system controller 10 monitors this action by analyzing the captured image.
When the system controller 10 has detected that the user has blinked three times in succession, the flow advances from step F211 to step F212. In step F212, the system controller 10 determines that a monitor display state start trigger for the captured image signal has occurred.
When the determined result in step F212 denotes that a monitor display state start trigger has occurred, the flow shown in Fig. 10 advances from step F102 to step F103. In step F103, the system controller 10 controls the display section 2 to start the display operation for the captured image in the monitor display state.
Other examples of the user's eye actions that are detected corresponding to the information of the visual sensor 19 and that command the image capturing and displaying apparatus 1 to operate in the monitor display state include rotating the eyeballs, moving them sideways twice, moving them up and down twice, and so forth.
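Blink counting from the visual sensor 19 (steps F210 to F212) can be pictured in the same way. The sketch below assumes a normalized eye-openness signal, which the patent does not specify; it is one plausible way to detect a closed-then-open transition.

```python
def count_blinks(eye_openness_samples, closed_below=0.2):
    """Hypothetical blink counter for the visual sensor 19 analysis
    (steps F210-F211): one blink is one closed-to-open transition of a
    normalized eye-openness signal (0.0 = closed, 1.0 = open)."""
    blinks, closed = 0, False
    for openness in eye_openness_samples:
        if openness < closed_below:
            closed = True
        elif closed:
            blinks += 1               # eye reopened -> one blink counted
            closed = False
    return blinks

# Three successive blinks -> monitor display state start trigger (F212).
assert count_blinks([1.0, 0.1, 1.0, 0.1, 1.0, 0.1, 1.0]) == 3
```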
Besides these exemplary processes in which the image capturing and displaying apparatus 1 is switched from the through state to the monitor display state corresponding to the user's intentional action, there may be other manners.
For example, a switch may be provided for switching the through state to the monitor display state, and the display state may be switched corresponding to an operation of the switch.
When the user has input his or her eyesight information from the input section 17, the system controller 10 may determine that a monitor display state start trigger has occurred.
As an alternative, when the user puts on the image capturing and displaying apparatus 1, the system controller 10 may determine that a monitor display state start trigger has occurred. Since the system controller 10 can determine whether the user has put on the image capturing and displaying apparatus 1 corresponding to the information detected by the biological sensor 22, the system controller 10 may determine that a monitor display state start trigger has occurred when the biological sensor 22 has detected, for example, a pulse, brain waves, a galvanic skin response, or the like. In this case, as soon as the user puts on the image capturing and displaying apparatus 1, it operates in the monitor display state.
As an alternative, the image capturing and displaying apparatus 1 may start operating in the monitor display state only when a particular user has put it on. The user can be identified corresponding to the fundus pattern detected by the visual sensor 19 or the signals detected by the biological sensor 22. When the fundus pattern and the biological information of the user who uses the image capturing and displaying apparatus 1 have been registered, the system controller 10 can determine whether the particular user has put on the image capturing and displaying apparatus 1.
Thus, when the particular user puts on the image capturing and displaying apparatus 1, the system controller 10 authenticates him or her. When the image capturing and displaying apparatus 1 has identified the particular user, the system controller 10 determines that a monitor display state start trigger has occurred and controls the image capturing and displaying apparatus 1 to operate in the monitor display state.
When the functions of the image capturing and displaying apparatus 1 are permitted only for a particular user, such personal authentication may be added to the conditions for determining whether a monitor display state start trigger has occurred.
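Such an authentication condition amounts to gating the start trigger on a registry lookup. The sketch below is one possible shape of that gate; the registry, the byte-string patterns, and the equality comparison are all placeholders, since the patent does not describe how fundus patterns are matched.

```python
# Hypothetical personal-authentication gate for the start trigger.
REGISTERED_FUNDUS_PATTERNS = {"user_a": b"registered-pattern-bytes"}

def patterns_match(detected, registered):
    return detected == registered     # placeholder comparison

def start_trigger_allowed(detected_pattern):
    """The start trigger is honored only for a registered user."""
    return any(patterns_match(detected_pattern, p)
               for p in REGISTERED_FUNDUS_PATTERNS.values())
```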
When the captured image is displayed on the display section 2 corresponding to the foregoing monitor display start trigger, the area AR1 on the screen of the display section 2 may remain in the through state while the captured image is displayed in the area AR2, which is a part of the screen, as shown in Fig. 9B.
Next, with reference to Fig. 12A and Fig. 12B through Fig. 17A and Fig. 17B, exemplary processes of determining whether an image control trigger has occurred, performed as step F104 shown in Fig. 10, will be described.
Fig. 12A shows an exemplary process of controlling a zoom operation corresponding to the movement of the user's line of sight.
In step F300 shown in Fig. 12A, the system controller 10 analyzes the information detected by the visual sensor 19. For example, when an image capturing section that captures an image of the user's eyes is provided as the visual sensor 19, the system controller 10 analyzes the captured image.
When the system controller 10 has detected that the user's visual direction has moved downward, the flow advances from step F301 to step F302. In step F302, the system controller 10 determines that a zoom-in (wide-angle zoom) display switching trigger has occurred.
When the determined result in step F302 denotes that a zoom-in display switching trigger has occurred, the flow shown in Fig. 10 advances from step F104 to step F106. In step F106, the system controller 10 commands the image capturing control section 11 to perform the zoom-in operation. Thus, the display section 2 displays an image as shown in Fig. 4B.
When the user's line of sight moves downward, he or she is likely reading a newspaper or a book, or looking at a position very close to the eyes. Thus, when the image is enlarged, it is appropriately displayed for a near-sighted or presbyopic user.
Fig. 12B shows an exemplary process of controlling a zoom operation corresponding to the motion of the user's neck (head) and the focal distance of his or her eyes.
In step F310 shown in Fig. 12B, the system controller 10 analyzes the information detected by the visual sensor 19 and detects the focal distance and the visual direction of the user's eyes corresponding to the analyzed result. In step F311, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21 and determines the motion of the user's neck corresponding to the detected information.
Thereafter, in steps F312 and F313, the system controller 10 determines whether the user is looking at a nearby position or a distant position corresponding to the detected results of the focal distance of the user's eyes and the orientation of his or her neck.
When the system controller 10 has determined that the user is looking at a nearby position (in particular, at his or her hands), the flow advances from step F312 to step F314. In step F314, the system controller 10 determines that a zoom-in (wide-angle zoom) display switching trigger has occurred. In step F316, the system controller 10 calculates an appropriate zoom magnification corresponding to the focal distance of the user's eyes and the orientation of his or her neck (head).
When the system controller 10 has determined that the user is looking at a distant position, the flow advances from step F313 to step F315. In step F315, the system controller 10 determines that a telescopic zoom display switching trigger has occurred. In step F316, the system controller 10 calculates an appropriate zoom magnification corresponding to the focal distance of the user's eyes and the orientation of his or her neck (head).
When the processes of steps F314 and F316 or those of steps F315 and F316 have been performed, the flow shown in Fig. 10 advances from step F104 to step F106. In step F106, the system controller 10 commands the image capturing control section 11 to perform a zoom operation with the calculated magnification.
Thus, the display section 2 displays a zoomed-in image as shown in Fig. 4B or a telescopic image as shown in Fig. 3C, corresponding to the scene the user is looking at.
Such an operation serves as a function that assists a near-sighted or far-sighted user.
In Fig. 12A and Fig. 12B, the processes of changing the display image by a zoom operation of the image capturing section 3 have been exemplarily described. As an alternative, the system controller 10 may cause the display image processing section 12 to perform an image enlargement process, an image reduction process, or the like corresponding to the user's visual direction, the focal distance of his or her eyes, the orientation of his or her neck, and so forth.
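Under stated assumptions, the near/far classification of steps F312 to F316 might look like the following sketch; the distance and pitch thresholds, and the magnification formulas, are invented for illustration.

```python
def zoom_trigger(eye_focal_distance_m, neck_pitch_deg):
    """Hypothetical version of steps F312-F316: classify near/far gaze
    from the eyes' focal distance and the neck (head) orientation, then
    derive a zoom magnification."""
    if eye_focal_distance_m < 0.5 and neck_pitch_deg < -20:
        # Looking at the hands -> zoom-in (wide-angle zoom), F314.
        return "zoom_in", max(1.5, 0.5 / eye_focal_distance_m)
    if eye_focal_distance_m > 5.0 and neck_pitch_deg > -5:
        # Looking at a distant position -> telescopic zoom, F315.
        return "telescopic", min(10.0, eye_focal_distance_m / 5.0)
    return None, 1.0                  # no display switching trigger
```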
Fig. 13A shows an exemplary process corresponding to the user's eyesight information input from the input section 17. When the user has input information about his or her eyesight from the input section 17, the flow advances from step F400 shown in Fig. 13A to step F401. In step F401, the system controller 10 determines that an eyesight-based display switching trigger has occurred. In step F402, the system controller 10 calculates a magnification corresponding to the value of the eyesight.
When the processes of steps F401 and F402 have been performed, the flow shown in Fig. 10 advances from step F104 to step F106. In step F106, the system controller 10 commands the display image processing section 12 to perform an enlargement display operation with the calculated magnification. In this process, the display section 2 displays an image enlarged corresponding to the user's eyesight.
As an alternative, the system controller 10 may store the user's fundus pattern and eyesight information in, for example, an internal memory in advance in an associated manner. By detecting the user's fundus pattern, the system controller 10 identifies him or her. The system controller 10 can then command the display section 2 to display an image enlarged corresponding to the user's eyesight.
Fig. 13B shows an exemplary process that deals with the user's presbyopia or astigmatism, and with his or her decreased sensitivity to brightness or blurred vision in a dark environment.
In step F410, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21 and determines the user's motion corresponding to the detected result. For example, the system controller 10 determines whether the user is in a resting state in which he or she is not walking. In step F411, the system controller 10 analyzes the information detected by the visual sensor 19 and detects the focal distance of the user's eyes, the dilation of his or her pupils, and the state of his or her eyelids (whether they are screwed up) corresponding to the analyzed result.
When the determined result denotes that the user is in the resting state and is looking at a nearby position or screwing up his or her eyelids, the flow advances from step F412 to step F413. In step F413, the system controller 10 determines that a presbyopia-coping display trigger or the like has occurred.
When the determined result in step F413 denotes that the trigger has occurred, the flow shown in Fig. 10 advances from step F104 to step F106. In this case, the system controller 10 commands an increase of the image capturing sensitivity and commands the captured image signal processing section 15 or the display image processing section 12 to perform processes that increase the brightness and the contrast and enhance the edges (sharpness). Thus, in this process, the display section 2 displays a sharp image as shown in Fig. 6B. This process therefore visually assists a presbyopic user, or a user in a dark place, when he or she reads, for example, a newspaper.
In this case, the system controller 10 may also cause the display section 2 to perform an image enlargement operation as shown in Fig. 4B.
When the system controller 10 has determined, corresponding to the dilation of the user's pupils, that he or she is in a dark place, the system controller 10 may control the lighting section 4 to provide illumination.
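One way to read steps F410 to F413 together with the resulting control of step F106 is the condition-and-adjustment sketch below; the thresholds and the returned parameter names are assumptions.

```python
def presbyopia_trigger(is_resting, focal_distance_m, eyelids_screwed_up,
                       pupil_dilation):
    """Hypothetical check for steps F412-F413 and the adjustments that
    step F106 would then apply."""
    if is_resting and (focal_distance_m < 0.5 or eyelids_screwed_up):
        adjustments = {"capture_sensitivity": "+", "brightness": "+",
                       "contrast": "+", "sharpness": "+"}
        if pupil_dilation > 0.7:      # dark place -> also turn lighting on
            adjustments["lighting_section"] = "on"
        return adjustments            # presbyopia-coping trigger occurred
    return None                       # no trigger
```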
Fig. 14 shows an exemplary process that deals with the user's vision depending on whether he or she feels comfortable or uncomfortable.
In step F420 shown in Fig. 14, the system controller 10 analyzes the information detected by the visual sensor 19 and detects the dilation of the pupils of the user's eyes and the blinking of his or her eyes (the number of times per unit time) corresponding to the analyzed result.
In step F421, the system controller 10 checks the information about the brain waves, heart rate, amount of perspiration, blood pressure, and so forth detected by the biological sensor 22.
The system controller 10 determines whether the user feels comfortable or uncomfortable with the image displayed on the display section 2 corresponding to the information detected by the visual sensor 19 and the biological sensor 22.
When the determined result denotes that the user does not feel comfortable with the image, the flow advances from step F422 to step F423. In step F423, the system controller 10 determines that an image adjustment control trigger has occurred. In this case, the flow advances to step F424. In step F424, the system controller 10 calculates adjustment values (for example, of brightness, contrast, sharpness, image capturing sensitivity, illumination brightness, and so forth) that are supposed to be comfortable corresponding to the user's situation.
When the processes of steps F423 and F424 have been performed, the flow shown in Fig. 10 advances from step F104 to step F106. In this case, in step F106, the system controller 10 commands the image capturing section 3 to adjust the image capturing sensitivity and commands the captured image signal processing section 15 or the display image processing section 12 to perform processes that adjust the brightness, contrast, sharpness, and so forth. In this process, the quality of the image displayed on the display section 2 is adjusted so that the user feels comfortable with it. When the determined result denotes that the user is in a dark place, the system controller 10 may control the lighting section 4 to provide illumination.
When the user feels uncomfortable with the image displayed on the display section 2 because of, for example, his or her eyesight, the surrounding brightness, or the fatigue of his or her eyes, this process provides him or her with a comfortable visual situation. For example, when the user is in a dark place and cannot clearly see the image, this process provides him or her with a sharp image. When the user's eyes are tired, this process provides him or her with a soft image.
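A comfort check of this kind combines visual and biological readings into one decision plus a set of adjustment values. The sketch below is illustrative only; every threshold and the particular combination rule are invented, since the patent leaves them open.

```python
def comfort_adjustment(pupil_dilation, blinks_per_minute, heart_rate,
                       perspiration):
    """Hypothetical version of steps F420-F424: decide whether the
    displayed image is uncomfortable and, if so, return adjustments."""
    uncomfortable = (blinks_per_minute > 30 or pupil_dilation > 0.8
                     or heart_rate > 100 or perspiration > 0.6)
    if not uncomfortable:
        return None                   # no image adjustment control trigger
    return {                          # adjustment values as in step F424
        "brightness": 1.2 if pupil_dilation > 0.8 else 0.9,  # dark -> brighter
        "contrast": 1.1,
        "sharpness": 0.9 if blinks_per_minute > 30 else 1.1,  # tired -> softer
    }
```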
The processes shown in Fig. 12A and Fig. 12B, Fig. 13A and Fig. 13B, and Fig. 14 are performed in such a manner that the system controller 10 determines the user's situation and controls the display image mode corresponding to his or her situation, without the user performing an intentional operation. In contrast, the processes shown in Fig. 15, Fig. 16, and Fig. 17A and Fig. 17B are performed in such a manner that the user's intentional actions are used as image control triggers (or as part of the trigger conditions). Next, these processes will be described with reference to Fig. 15, Fig. 16, and Fig. 17A and Fig. 17B.
Fig. 15 shows a process in which the motion of the user's neck (head) is used as an operation.
In step F500, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21. In step F501, the system controller 10 determines the user's head motion corresponding to the detected information. For example, the system controller 10 determines whether the user has tilted his or her head backward twice, tilted it forward twice, or shaken his or her neck to the left twice.
When the system controller 10 has detected that the user has tilted his or her head backward twice, the flow advances from step F502 to step F505. In step F505, the system controller 10 determines that an image switching trigger for a 2x telescopic magnification has occurred.
In this case, the flow shown in Fig. 10 advances from step F104 to step F106. In step F106, the system controller 10 commands the image capturing control section 11 to perform a zoom operation with the 2x telescopic magnification. Thus, the display section 2 displays an image with the 2x telescopic magnification as shown in Fig. 3C.
When the system controller 10 has detected that the user has tilted his or her head forward twice, the flow advances from step F503 to step F506. In step F506, the system controller 10 determines that an image switching trigger for a 1/2x telescopic magnification has occurred. In this case, the flow shown in Fig. 10 advances from step F104 to step F106. In step F106, the system controller 10 commands the image capturing control section 11 to perform a zoom operation with the 1/2x telescopic magnification. Thus, the display section 2 displays an image with the 1/2x telescopic magnification.
When the system controller 10 has detected that the user has shaken his or her neck to the left twice, the flow advances from step F504 to step F507. In step F507, the system controller 10 determines that an image switching trigger that resets the telescopic magnification has occurred. In this case, the flow shown in Fig. 10 advances from step F104 to step F106. In step F106, the system controller 10 commands the image capturing control section 11 to perform a zoom operation with the standard magnification. Thus, the display section 2 displays an image with the standard magnification.
Since the user's intentional motions are defined as triggers and the display image mode is switched corresponding thereto, he or she is provided with the visual field that he or she desires.
Besides the motions of the user's neck, motions of his or her whole body such as jumping, and motions of his or her hands, arms, and legs, may be defined as predetermined operations.
Besides the zoom operation, an image enlargement operation as shown in Fig. 5B, an image reduction operation, an image capturing sensitivity operation, a capture frame rate selection operation, an increased infrared capturing sensitivity display operation as shown in Fig. 7B, an increased ultraviolet capturing sensitivity display operation as shown in Fig. 8B, a split display operation as shown in Fig. 9B, a strobe display operation as shown in Fig. 9C, and so forth may be performed corresponding to the user's actions or motions.
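The gesture commands of Fig. 15 reduce to a small lookup table. The following sketch shows that mapping; the gesture labels are hypothetical names for the motions detected in steps F502 to F504.

```python
# Sketch of the Fig. 15 gesture-to-zoom mapping (steps F502-F507).
GESTURE_TO_ZOOM = {
    "head_back_twice":    2.0,   # F505: 2x telescopic magnification
    "head_forward_twice": 0.5,   # F506: 1/2x telescopic magnification
    "neck_left_twice":    1.0,   # F507: reset to the standard magnification
}

def zoom_for_gesture(gesture):
    # None means no image switching trigger occurred.
    return GESTURE_TO_ZOOM.get(gesture)
```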
Fig. 16 shows a process in which the user's action is used as a predetermined control trigger in combination with his or her situation and the surrounding environment.
In step F600 shown in Fig. 16, the system controller 10 analyzes the information detected by the visual sensor 19 and detects the dilation of the pupils of the user's eyes and his or her blinking corresponding to the analyzed result. In step F601, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21 and determines whether the user is not moving his or her neck and whole body (a non-walking state).
In step F602, the system controller 10 determines whether the user is in the resting state with his or her neck tilted downward, whether the surroundings are dark, and whether he or she has performed a specific action such as blinking three times.
In other words, the system controller 10 determines whether the user is in the resting state with his or her neck tilted downward corresponding to the information detected by the acceleration sensor 20 and the gyroscope 21. In addition, the system controller 10 determines whether the user is in a dark environment corresponding to the dilation of his or her pupils. When these conditions have been satisfied, the system controller 10 determines whether the user has blinked three times in succession.
When the user is in the resting state with his or her neck tilted downward and is in a dark environment, he or she is likely reading something in a dark room. When the user intentionally blinks three times in succession in this situation, the system controller 10 determines that he or she wants a bright, sharp image. Thus, when the system controller 10 has detected under these circumstances that the user has blinked three times in succession, the flow advances to step F603. In step F603, the system controller 10 determines that an image adjustment control trigger has occurred.
When the process of step F603 has been performed, the flow shown in Fig. 10 advances from step F104 to step F106. In this case, in step F106, the system controller 10 commands an increase of the image capturing sensitivity and commands the captured image signal processing section 15 or the display image processing section 12 to perform processes that increase the brightness, enhance the contrast and sharpness, and so forth. As an alternative, the system controller 10 may command the lighting section 4 to provide illumination.
Thus, the user can see the image in a comfortable situation.
In this example, the user's blinking operation is treated as valid only when his or her situation and the ambient environmental conditions are satisfied. This process is effective because the image is not changed even if the user unintentionally performs the relevant action.
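The essential point of Fig. 16 is the AND-combination of conditions before the intentional action is accepted. A minimal sketch, assuming a normalized pupil-dilation value as the darkness cue:

```python
def dark_reading_trigger(is_still, neck_tilted_down, pupil_dilation,
                         blinked_three_times):
    """Hypothetical combination of steps F600-F603: the blink command is
    valid only while the user is still, looking down, and in the dark."""
    in_dark = pupil_dilation > 0.7    # threshold is an assumption
    conditions_met = is_still and neck_tilted_down and in_dark
    return conditions_met and blinked_three_times   # -> trigger (F603)
```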
Fig. 17A shows an exemplary process for capturing an image with increased infrared sensitivity, as described with reference to Fig. 7B. In this process, the operation corresponding to the user's action is permitted or prohibited corresponding to his or her physical situation.
In step F700 shown in Fig. 17A, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21 and determines the motion of the user's neck, the motion of his or her whole body, and so forth corresponding to the detected result.
In step F701, the system controller 10 checks the brain waves, heart rate, amount of perspiration, blood pressure, and so forth detected by the biological sensor 22. The system controller 10 determines whether the user is tense or excited corresponding to the information detected by the biological sensor 22.
When the system controller 10 has detected a predetermined action (for example, shaking the neck) that commands the image capturing and displaying apparatus 1 to perform the infrared image capturing operation, the flow advances from step F702 to step F703. In step F703, the system controller 10 determines whether the user is tense or excited.
When the system controller 10 has determined that the user is neither tense nor excited, the system controller 10 treats the user's action as a valid operation. Thereafter, the flow advances to step F704. In step F704, the system controller 10 determines that an image capturing operation trigger for increased infrared sensitivity has occurred.
When the process of step F704 has been performed, the flow shown in Fig. 10 advances from step F104 to step F106. In this case, in step F106, the system controller 10 commands the image capturing section 3 to increase the infrared image capturing sensitivity. Thus, the display section 2 displays an image as shown in Fig. 7B.
In contrast, when the determined result in step F703 denotes that the user is tense or excited, the system controller 10 does not determine that an image capturing operation trigger for increased infrared sensitivity has occurred. In other words, the system controller 10 invalidates the operation corresponding to the user's action.
Thus, the validity of an operation corresponding to the user's action can be determined together with the condition of his or her physical situation. In this case, it can be effectively prevented that a special image capturing function, such as the increased infrared sensitivity image capturing operation, is used improperly.
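Gating an operation on the user's physical situation is again a simple predicate. A sketch under stated assumptions (the tension/excitement test and its thresholds are invented; the patent only says the readings of the biological sensor 22 are used):

```python
def infrared_trigger(gesture_detected, heart_rate, perspiration):
    """Hypothetical gate for steps F702-F704: the infrared command is
    honored only when the user appears neither tense nor excited."""
    tense_or_excited = heart_rate > 100 or perspiration > 0.6
    if gesture_detected and not tense_or_excited:
        return True                   # trigger occurred (F704)
    return False                      # operation invalidated (F703 branch)
```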
Fig. 17B shows an exemplary process for capturing an image with increased ultraviolet sensitivity, as described with reference to Fig. 8B.
In step F710 shown in Fig. 17B, the system controller 10 monitors the information (an acceleration signal and an angular velocity signal) detected by the acceleration sensor 20 and the gyroscope 21 and determines the motion of the user's neck, the motion of his or her whole body, and so forth corresponding to the detected result.
When the system controller 10 has detected that the user has performed a predetermined action that commands the image capturing and displaying apparatus 1 to perform the ultraviolet image capturing operation, the flow advances from step F711 to step F712. In step F712, the system controller 10 determines that an image capturing operation trigger for increased ultraviolet sensitivity has occurred.
When the process of step F712 has been performed, the flow shown in Fig. 10 advances from step F104 to step F106. In this case, in step F106, the system controller 10 commands the image capturing section 3 to increase the ultraviolet image capturing sensitivity. Thus, the display section 2 displays an image as shown in Fig. 8B.
In the foregoing, display mode switching triggers and display modes for the captured image have been exemplarily described. It should be noted that there are other examples.
When the display mode of the display section 2 is switched corresponding to an image control trigger, as shown in Fig. 9B, the area AR1 of the screen of the display section 2 may remain in the through state or display the normally captured image, while an image in another display mode may be displayed in the area AR2, which is a part of the screen. As an alternative, the image corresponding to the image control trigger may be displayed in the wide area AR1. As an alternative, the screen may be divided equally, and the normally captured image and the image corresponding to the image control trigger may be displayed side by side.
Next, with reference to Fig. 18A, Fig. 18B, Fig. 19A, and Fig. 19B, the triggers detected in step F105 shown in Fig. 10, namely the triggers that cause the monitor display state for the captured image to be switched to the through state, will be described in detail.
Fig. 18A shows an exemplary process of completing the monitor display state corresponding to the user's intentional action.
In step F800 shown in Fig. 18A, the system controller 10 monitors the information detected by the acceleration sensor 20 and the gyroscope 21 and determines the motion of the user's neck, the motion of his or her whole body, and so forth corresponding to the detected information.
When the system controller 10 has detected that the user has performed a predetermined action that commands the image capturing and displaying apparatus 1 to complete the monitor display state, the flow advances from step F801 to step F802. In step F802, the system controller 10 determines that a monitor display state completion trigger for the captured image has occurred.
When the process of step F802 has been performed, the flow shown in Fig. 10 advances from step F105 to step F101. In this case, in step F101, the system controller 10 commands the display control section 14 to switch the display state to the through state. Thus, the display section 2 is restored to the through state as shown in Fig. 3A.
Fig. 18B also shows an exemplary process of completing the monitor display state corresponding to the user's intentional action.
In step F810 shown in Fig. 18B, the system controller 10 analyzes the information detected by the visual sensor 19. Assuming that a predetermined action of the user blinking three times in succession is defined as an operation that commands the image capturing and displaying apparatus 1 to complete the monitor display state, the system controller 10 monitors this action by analyzing the image.
When the system controller 10 has detected that the user has blinked three times in succession, the flow advances from step F811 to step F812. In step F812, the system controller 10 determines that a monitor display state completion trigger for the captured image signal has occurred.
When the process of step F812 has been performed, the flow shown in Fig. 10 advances from step F105 to step F101. In step F101, the system controller 10 commands the display control section 14 to switch the display state to the through state. Thus, the display section 2 is restored to the through state as shown in Fig. 3A.
In the processes shown in Fig. 18A and Fig. 18B, when the user wants the image capturing and displaying apparatus 1 to operate in the through state, the display section 2 becomes the through state corresponding to his or her wish.
Of course, there may be other types of user actions that cause the display state to be restored to the through state.
Fig. 19A shows an exemplary process of automatically restoring the display state to the through state corresponding to the user's motion (a motion that is not regarded as an intentional operation).
In step F900 shown in Fig. 19A, the system controller 10 monitors the information detected by the acceleration sensor 20 and the gyroscope 21 and determines the motion of the user's whole body. In particular, the system controller 10 detects whether the user is at rest, walking, or running.
When the system controller 10 has determined that the user has started walking or running, the flow advances from step F901 to step F902. In step F902, the system controller 10 determines that a monitor display state completion trigger for the captured image signal has occurred.
When the process of step F902 has been performed, the flow shown in Fig. 10 returns from step F105 to step F101. In this case, in step F101, the system controller 10 commands the display control section 14 to switch the display state to the through state. Thus, the display section 2 is restored to the through state as shown in Fig. 3A.
From the viewpoint of safety, it is preferable to restore the display state to the through state while the user is walking or running.
Instead of restoring the display state to the through state, the system controller 10 may command the display control section 14 to switch the display state to the monitor display state for the normally captured image while the user is walking or running, since the normally captured image is similar to the scene the user sees in the through state, as shown in Fig. 3B.
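The motion classification behind this automatic completion trigger could be as simple as thresholding accelerometer activity; the variance thresholds below are assumptions for illustration.

```python
def end_trigger_from_motion(accel_variance):
    """Hypothetical version of steps F900-F902: classify rest, walking,
    or running from accelerometer activity and end the monitor display
    when the user starts moving."""
    if accel_variance < 0.05:
        return None                   # at rest: keep the current state
    state = "walking" if accel_variance < 1.0 else "running"
    # Either return to the through state (F101) or, as noted above,
    # fall back to the normally captured image.
    return ("monitor_display_completion_trigger", state)
```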
Fig. 19B shows an exemplary process of automatically restoring the display state to the through state corresponding to the user's physical situation so as to prevent the infrared image capturing operation from being used improperly.
In step F910 shown in Fig. 19B, the system controller 10 checks the information about, for example, the brain waves, heart rate, amount of perspiration, blood pressure, and so forth detected by the biological sensor 22. The system controller 10 determines whether the user is tense or excited corresponding to the information detected by the biological sensor 22.
While the image capturing operation with increased infrared sensitivity is being performed, the flow advances from step F911 to step F912. In step F912, the system controller 10 determines whether the user is tense or excited.
When the determined result denotes that the user is neither tense nor excited, the system controller 10 permits the image capturing and displaying apparatus 1 to continue the image capturing operation with increased infrared sensitivity. In contrast, when the determined result denotes that the user is tense or excited, the flow advances to step F913. In step F913, the system controller 10 determines that a monitor display state completion trigger for the captured image has occurred.
When the process of step F913 has been performed, the flow shown in Fig. 10 returns from step F105 to step F101. In this case, in step F101, the system controller 10 commands the display control section 14 to switch the display state to the through state. In other words, the system controller 10 commands the display control section 14 to complete the monitor display state of the image capturing operation with increased infrared sensitivity. Thus, the display section 2 is restored to the through state.
It is preferable to complete the image capturing operation with increased infrared sensitivity and restore the display state to the through state corresponding to the user's physical situation in this manner, so as to prevent him or her from using the increased infrared sensitivity image capturing operation improperly.
Instead of restoring the display state to the through state, the image capturing operation with increased infrared sensitivity may be completed while the normally captured image is displayed.
[6. Effects of the first embodiment, modifications, and extensions]
According to this embodiment, an image captured by the image capturing section 3 disposed in the glasses type mounting unit or the helmet type mounting unit, namely a captured image whose subject direction is the direction in which the user sees the subject, is displayed on the display section 2 in front of his or her eyes. In this case, the image capturing operation or the display operation is controlled corresponding to the information about his or her motion or physical situation. Thus, a situation in which the user's visual ability is substantially assisted or extended can be created.
Since the image capturing operation of the image capturing section 3 and the display modes accomplished by the signal processes of the captured image signal processing section 15 and the display image processing section 12 are changed corresponding to the user's wish or situation determined from the information about his or her motion or physical situation, no operational burden is imposed on the user. In addition, since the image capturing and displaying apparatus 1 is properly controlled, the user can use it easily.
In addition, since the display section 2 can become the transparent or translucent through state by controlling its transmissivity, the mounting unit does not disturb the user's normal life while he or she wears it. Thus, in the user's normal life, the benefits of the image capturing and displaying apparatus 1 according to this embodiment can be used effectively.
In this embodiment, the image capturing operation of the image capturing section 3 and the display modes accomplished by the signal processes of the captured image signal processing section 15 and the display image processing section 12 have mainly been described. For example, the switching among power-on, power-off, and standby states, and the volume and sound quality of the sound output from the sound output section 5, may also be controlled corresponding to the user's action and/or physical situation. For example, the volume may be adjusted in consideration of the user's comfort corresponding to the information detected by, for example, the biological sensor 22.
The outer appearance and the structure of the image capturing and displaying apparatus 1 are not limited to those shown in Fig. 1 and Fig. 2. Instead, various modifications may be made.
For example, a storage section that stores the image signal captured by the image capturing section 3 and a transmission section that transmits the image signal to other devices may be provided in the image capturing and displaying apparatus 1.
Besides the image capturing section 3, an input section and a receiving section that input an image from an external device may be provided in the image capturing and displaying apparatus 1 as sources of the image displayed on the display section 2.
In addition, a character recognition section that recognizes characters contained in an image and a speech synthesis section that performs a speech synthesis process may be provided in the image capturing and displaying apparatus 1. When the captured image contains characters, the speech synthesis section may generate a read-aloud speech signal, and the sound output section 5 may output the speech corresponding to this signal.
In this embodiment, the example in which the image capturing and displaying apparatus 1 is a glasses type mounting unit or a helmet type mounting unit has been described. However, as long as the image capturing and displaying apparatus captures an image in the direction of the user's eyes and displays the image in front of his or her eyes, it may be of any type that the user can wear, such as a headphone type, a neckband type, an ear-hook type, and so forth. As an alternative, the image capturing and displaying apparatus 1 may be a unit attached to eyeglasses, a visor, headphones, or the like with a mounting member such as a clip.
(Second Embodiment)
Next, an image capturing and displaying apparatus and an image capturing and displaying method according to a second embodiment of the present invention will be described in the following order.
[1. Exemplary outer appearance of the image capturing and displaying apparatus]
[2. Exemplary structure of the image capturing and displaying apparatus]
[3. Exemplary display images]
[4. Outside situation determination]
[5. Exemplary operations]
[6. Effects of the second embodiment, modifications, and extensions]
[1. Exemplary outer appearance of the image capturing and displaying apparatus]
The exemplary outer appearance of the image capturing and displaying apparatus according to the second embodiment is the same as that according to the first embodiment.
[2. Exemplary structure of the image capturing and displaying apparatus]
Fig. 20 shows an exemplary internal structure of an image capturing and displaying apparatus 101 according to the second embodiment of the present invention.
The system controller 110 comprises a microcomputer that includes, for example, a CPU (central processing unit), a ROM (read-only memory), a RAM (random access memory), a nonvolatile memory section, and an interface section. The system controller 110 is a control section that controls all the sections of the image capturing and displaying apparatus 101.
The system controller 110 controls each section of the image capturing and displaying apparatus 101 corresponding to the outside situation. In other words, the system controller 110 operates corresponding to an operation program that detects and determines the outside situation and controls each section corresponding to the detected and determined situation. Thus, as shown in Fig. 20, the system controller 110 functionally includes an outside situation determination function 110a that determines the outside situation, and an operation control function 110b that controls and commands each section corresponding to the determined result of the outside situation determination function 110a.
The image capturing and displaying apparatus 101 includes an image capturing section 103, an image capturing control section 111, and a captured image signal processing section 115 as a structure for capturing an image in front of the user.
The image capturing section 103 includes a lens system having the image capturing lens 103a (shown in Fig. 1), an aperture, a zoom lens, a focus lens, and so forth; a drive system that causes the lens system to perform a focus operation and a zoom operation; and a solid state image sensor array that detects the light of the captured image obtained by the lens system, converts the light into electricity, and generates a captured image signal corresponding thereto. The solid state image sensor array comprises, for example, a CCD (charge coupled device) sensor array or a CMOS (complementary metal oxide semiconductor) sensor array.
The captured image signal processing section 115 includes a sample hold/AGC (automatic gain control) circuit that adjusts the gain of the signal obtained by the solid state image sensor array of the image capturing section 103 and trims the waveform of the signal, and a video A/D converter. The captured image signal processing section 115 obtains a captured image signal as digital data. The captured image signal processing section 115 performs a white balance process, a brightness process, a color signal process, a vibration correction process, and so forth for the captured image signal.
The image capturing control section 111 controls the operations of the image capturing section 103 and the captured image signal processing section 115 corresponding to the commands received from the system controller 110. For example, the image capturing control section 111 starts and stops the operations of the image capturing section 103 and the captured image signal processing section 115. In addition, the image capturing control section 111 controls (through motors) the image capturing section 103 to perform an auto focus operation, an automatic exposure adjustment operation, an aperture adjustment operation, a zoom operation, and so forth.
In addition, the image capturing control section 111 includes a timing generator. The image capturing control section 111 controls the solid state image sensor array, the sample hold/AGC circuit, and the video A/D converter with timing signals generated by the timing generator. In addition, the image capturing control section 111 can change the frame rate of the captured image with the timing signals.
In addition, the image capturing control section 111 controls the image capturing sensitivity and the signal processes of the solid state image sensor array and the captured image signal processing section 115. To control the image capturing sensitivity, the image capturing control section 111 controls, for example, the gain of the signal read from the solid state image sensor array, the black level setting, various coefficients of the digital processes for the captured image signal, the correction amount of the vibration correction process, and so forth. With respect to the image capturing sensitivity, the image capturing control section 111 can perform an overall sensitivity adjustment regardless of the wavelength band, and a sensitivity adjustment for a specific wavelength band such as the infrared region or the ultraviolet region (for example, an image can be captured such that a predetermined wavelength band is cut). The wavelength-specific sensitivity adjustment can be performed by inserting a wavelength filter into the image capturing lens system or by performing a wavelength filter calculation process for the captured image signal. In these cases, the image capturing control section 111 can control the sensitivity by inserting a wavelength filter, designating filter calculation coefficients, and so forth.
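The filter-calculation variant of this band-specific adjustment can be pictured as a per-band gain applied to the captured signal. The sketch below is illustrative only: the band names, the sample representation, and the gain values are assumptions, not the patent's actual processing.

```python
def apply_band_gain(samples, gains):
    """Hypothetical filter-coefficient approach to band-specific
    sensitivity: scale or cut each wavelength band of the signal."""
    return {band: value * gains.get(band, 1.0)
            for band, value in samples.items()}

# Example: boost the infrared band and cut the ultraviolet band entirely.
raw = {"ultraviolet": 0.2, "visible": 1.0, "infrared": 0.4}
print(apply_band_gain(raw, {"infrared": 4.0, "ultraviolet": 0.0}))
```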
As a structure for displaying data to the user, the image capturing and displaying apparatus 101 includes a display section 102, a display image processing section 112, a display driving section 113, and a display control section 114.
The captured image signal of an image captured by the image capturing section 103 and then processed by the captured image signal processing section 115 is supplied to the display image processing section 112. The display image processing section 112 is, for example, a so-called video processor. The display image processing section 112 performs various types of display processes for the supplied captured image signal. For example, the display image processing section 112 can perform a brightness level adjustment, a color correction, a contrast adjustment, a sharpness (edge enhancement) adjustment, and so forth for the captured image signal. In addition, the display image processing section 112 can generate an enlarged image of which a part of the captured image signal is enlarged or a reduced image thereof, highlight a part of the image, split the image for a split display, combine images, generate a character image or a graphic image, and superimpose a generated image on the captured image. In other words, the display image processing section 112 can perform various types of processes for the digital image signal as the captured image signal.
The display driving section 113 comprises a pixel drive circuit that displays the image signal supplied from the display image processing section 112 on the display section 102, which is, for example, a liquid crystal display. In other words, the display driving section 113 applies drive signals based on the image signal to the pixels formed in a matrix shape in the display section 102 at predetermined horizontal/vertical drive timings, causing the display section 102 to display the image. In addition, the display driving section 113 controls the transmissivity of each pixel to cause the display section 102 to become the through state.
The display control section 114 controls the processes and operations of the display image processing section 112 and the operations of the display driving section 113 corresponding to the commands received from the system controller 110. In other words, the display control section 114 causes the display image processing section 112 to perform the foregoing various types of processes. In addition, the display control section 114 controls the display driving section 113 to cause the display section 102 to switch the display state between the through state and the image display state.
In the following description, the state in which the display section 102 becomes transparent or translucent is referred to as the "through state," whereas the operation (and the state) in which the display section 102 displays an image is referred to as the "monitor display state."
In addition, the image capturing and displaying apparatus 101 includes a sound input section 106, a sound signal processing section 116, and a sound output section 105.
The sound input section 106 includes the microphones 106a and 106b shown in Fig. 1, and a microphone amplifier section that processes the sound signals obtained by the microphones 106a and 106b.
The sound signal processing section 116 includes, for example, an A/D converter, a digital signal processor, a D/A converter, and so forth. The sound signal processing section 116 converts the sound signal supplied from the sound input section 106 into digital data and performs processes such as a sound volume adjustment, a sound quality adjustment, and acoustic effects under the control of the system controller 110. The sound signal processing section 116 converts the resultant sound signal into an analog signal and supplies it to the sound output section 105. The sound signal processing section 116 is not limited to a structure that performs digital signal processing. Instead, the sound signal processing section 116 may perform signal processes with an analog amplifier and an analog filter.
The sound output section 105 includes the pair of earphone speakers 105a shown in Fig. 1 and an amplifier circuit for the earphone speakers 105a.
The sound input section 106, the sound signal processing section 116, and the sound output section 105 allow the user to hear external sounds through the image capturing and displaying apparatus 101.
The sound output section 105 may be structured as a so-called bone conduction speaker.
In addition, the image capturing and displaying apparatus 101 includes a speech synthesis section 127. The speech synthesis section 127 synthesizes speech corresponding to a command issued from the system controller 110 and outputs the synthesized speech signal.
The speech synthesis section 127 outputs the synthesized speech signal to the sound signal processing section 116. The sound signal processing section 116 processes the synthesized speech signal and supplies the resultant signal to the sound output section 105. The sound output section 105 outputs the speech to the user.
The speech synthesis section 127 generates the speech signal of a read-aloud speech that will be described later.
In addition, the image capturing and displaying apparatus 101 includes a lighting section 104 and a lighting control section 118. The lighting section 104 includes the light emission section 104a shown in Fig. 1 (for example, a light emitting diode) and a lighting circuit that causes the light emission section 104a to emit light. The lighting control section 118 causes the lighting section 104 to perform a light emission operation corresponding to a command supplied from the system controller 110.
Since the light emission section 104a of the lighting section 104 is disposed such that it illuminates forward, the lighting section 104 performs the lighting operation in the user's visual direction.
As a structure that obtains outside information, the image capturing and displaying apparatus 101 includes an ambient environment sensor 119, a capturing target sensor 120, a GPS receiving section 121, a date and time counting section 122, an image analysis section 128, and a communication section 126.
Specifically, the ambient environment sensor 119 is a brightness sensor, a temperature sensor, a humidity sensor, an atmospheric pressure sensor, or the like that detects information about the surrounding brightness, temperature, humidity, weather, and so forth as the ambient environment of the image capturing and displaying apparatus 101.
The capturing target sensor 120 is a sensor that detects information about the capturing target of the image capturing operation of the image capturing section 103. The capturing target sensor 120 may be a distance measurement sensor that detects information about, for example, the distance from the image capturing and displaying apparatus 101 to the capturing target.
The capturing target sensor 120 may be a sensor such as an infrared sensor, namely a pyroelectric sensor, that detects information about a predetermined wavelength and the energy of the infrared rays emitted by the capturing target. In this case, the capturing target sensor 120 can detect whether the capturing target is a living body such as a person or an animal.
As an alternative, the capturing target sensor 120 may be a sensor such as one of various types of UV (ultraviolet) sensors that detects information about a predetermined wavelength and the energy of the ultraviolet rays emitted by the capturing target. In this case, the capturing target sensor 120 can detect whether the capturing target is a fluorescent material or a phosphor, and can detect the amount of external ultraviolet radiation, which is needed for protection against sunburn.
The GPS receiving section 121 receives radio waves from GPS (global positioning system) satellites, obtains the current position of the image capturing and displaying apparatus 101, and outputs the latitude and longitude information of the current position.
The date and time counting section 122 is a so-called clock section that counts the date and time (year, month, day, hour, minute, and second) and outputs information about the current date and time.
The image analysis section 128 analyzes the image captured by the image capturing section 103 and processed by the captured image signal processing section 115. In other words, the image analysis section 128 analyzes the image of the subject and obtains information about the subject contained in the captured image.
The communication section 126 communicates data with external devices. Examples of the external devices include any kinds of devices that have information processing and communication functions, such as computer devices, PDAs (personal digital assistants), mobile phones, video devices, audio devices, and tuner devices.
In addition, examples of the external devices may include terminal devices and server devices connected to a network such as the Internet.
In addition, examples of the external devices may include a contactless communication IC card having a built-in IC chip, a two-dimensional bar code such as a QR code, and a hologram memory, from which the communication section 126 obtains information.
In addition, examples of the external devices may include another image capturing and displaying apparatus 101.
The communication section 126 may communicate with a nearby access point corresponding to, for example, a wireless LAN system, the Bluetooth system, or the like. As an alternative, the communication section 126 may directly communicate with an external device having a corresponding communication function.
The ambient environment sensor 119, the capturing target sensor 120, the GPS receiving section 121, the date and time counting section 122, the image analysis section 128, and the communication section 126 obtain outside information about the image capturing and displaying apparatus 101 and supply the obtained information to the system controller 110.
The system controller 110 performs the processes of the operation control function 110b so as to control the image capturing operation and the display operation corresponding to the outside information obtained by the outside situation determination function 110a. In other words, the system controller 110 commands the image capturing control section 111 to control the operations of the image capturing section 103 and the captured image signal processing section 115. In addition, the system controller 110 commands the display control section 114 to control the operations of the display image processing section 112 and the display driving section 113.
In this embodiment, the structure that obtains outside information includes the ambient environment sensor 119, the capturing target sensor 120, the GPS receiving section 121, the date and time counting section 122, the image analysis section 128, and the communication section 126. However, some of them may be omitted. In addition, other sensors, such as a sound analysis section that detects and analyzes surrounding sounds, may be provided in the image capturing and displaying apparatus 101.
[3. Exemplary display images]
The system controller 110 controls the image capturing operation and the display operation corresponding to the obtained outside information. Thus, the user recognizes various display modes of the display section 102. Fig. 3A to Fig. 3C through Fig. 9A to Fig. 9C, Fig. 21A and Fig. 21B, and Fig. 22A and Fig. 22B exemplarily show various display modes.
Fig. 3A shows the state in which the display section 102 is in the through state. In other words, in this state, the display section 102 is a simple transparent flat member, and the user can see the scene in his or her visual field through the transparent display section 102.
Fig. 3B shows the state in which the image captured by the image capturing section 103 is displayed on the display section 102 operating in the monitor display state. The image capturing section 103, the captured image signal processing section 115, the display image processing section 112, and the display driving section 113 operate in the state shown in Fig. 3A such that they normally display the captured image on the display section 102. In this case, the captured image (the normally captured image) displayed on the display section 102 is almost the same as the scene that appears on the display section 102 operating in the through state. In other words, in this state, the user sees the normal visual field as a captured image.
Fig. 3C shows the state in which the system controller 110 causes the image capturing section 103 to capture a telescopic image through the image capturing control section 111.
In contrast, when the system controller 110 causes the image capturing section 103 to capture a wide-angle image through the image capturing control section 111, a wide-angle image of a nearby scene is displayed on the display section 102 (not shown). Although the image capturing section 103 performs the telescopic and wide-angle controls by driving the zoom lens of the image capturing section 103, the captured image signal processing section 115 may perform these controls by signal processes.
Fig. 4 A show display part 102 wherein be in by under the state, for example user just reading the state of newspaper.
Fig. 4 B shows so-called wide-angle zoom state.In other words, Fig. 4 B shows such state, under this state, obtains nearly focal length zoom image, and it is presented at display part 102, so that for example amplify the character in the newspaper.
Fig. 5 A shows such state, and wherein display part 102 shows that the image or the display part 102 that normally obtain are in by under the state.
At this moment, when 114 carries out image processing and amplifying are controlled by showing in system controller 110 order display image processing sections 112, on display part 102, show the enlarged image shown in Fig. 5 B.
Fig. 6 A shows such state, and wherein display part 102 shows that the image or the display part 102 that normally obtain are in by under the state.Especially, Fig. 6 A shows the state that user is wherein reading newspaper or books.In this case, suppose because around be dim, so the user can not or pass through to see under the state character in the newspaper etc. in display part 102 with the image that normally obtains.
In this case, system controller 110 order image acquisition control sections 111 (image acquisition section 103 and obtain image signal processing section 115) increase image acquisition sensitivity, and/or make display control section 114 (display image processing section 112 and display driving part 113) increase brightness and adjustment contrast and acutance, so that on display part 102, show than more sharpening, shown in Fig. 6 B the image of the image shown in Fig. 6 A.On the contrary, when system controller 110 makes illumination section 104 carry out the illumination operation, can be on display part 102 display image sharply.
Fig. 7 A shows such state, and wherein display part 102 shows that the image or the display part 102 that normally obtain are in by under the state.In this case, the user is in the dark bedroom that wherein child is sleeping.Because the user is in the dark room, thus he or she can not utilize the image that normally obtains or display part 102 pass through clearly see child under the state.
At this moment, when system controller 110 order image acquisition control sections 111 (image acquisition section 103 and obtain image signal processing section 115) increase infrared image and obtain sensitivity, on display part 102, show the infrared image that obtains shown in Fig. 7 B, so that make the user can see child's sleeping face etc.
Fig. 8 A shows such state, and wherein display part 102 shows that the image or the display part 102 that normally obtain are in by under the state.
When system controller 110 order image acquisition control sections 111 (image acquisition section 103 and obtain image signal processing section 115) increase ultraviolet image and obtain sensitivity, on display part 102, show shown in Fig. 8 B, have ultraviolet component obtain image.
Fig. 9 A shows wherein, and display part 102 is in by the state under the state.
When system controller 110 order display control sections 114 (display image processing section 112 and display driving part 113) display images or display image and partly during enlarged image respectively respectively, can be on display part 102 image shown in the displayed map 9B.In other words, the screen of display part 102 is separated into regional AR1 and AR102, and wherein regional AR1 is in by state or is under the normal picture show state, and regional AR102 is under the enlarged image show state.
Fig. 9 C shows another exemplary separation and shows.In this case, the screen of display part 102 is separated into regional AR1, AR102, AR3 and AR4, and these zones show the frame with the image of predetermined amount of time interval acquiring.System controller 110 makes display image processing section 112 extract a frame with 0.5 second interval from the picture signal of being obtained, and shows the frame that is extracted with the order of regional AR1, AR2, AR3, AR4, AR1, AR2 etc.In this case, the image that on display part 102, shows so-called flicker display mode discretely.
Fig. 21A shows a state in which the display section 102 displays a normally captured image or is in the through state. Because the image shown in Fig. 21A is a scene of a soccer stadium with a boundary between sunlight and shade, the image is difficult to see in places.
The system controller 110 increases the image capturing sensitivity or the display brightness for the pixels of the CCD sensor or CMOS sensor that correspond to the shaded area. In contrast, the system controller 110 decreases the image capturing sensitivity or the display brightness for the pixels that correspond to the sunlit area. As a result, an image in which the influence of sunlight and shade has been reduced is displayed, as shown in Fig. 21B.
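A simplified sketch of this compensation follows; the luminance thresholds and gain factors are illustrative assumptions (the description only states that sensitivity or display brightness is raised for shaded pixels and lowered for sunlit ones), and the per-pixel gain here stands in for the sensor-level control.

```python
import numpy as np

def compensate_sun_and_shade(image, shade_thresh=60, sun_thresh=200,
                             shade_gain=1.6, sun_gain=0.7):
    img = image.astype(np.float32)
    luminance = img.mean(axis=2)                     # rough per-pixel brightness
    out = img.copy()
    out[luminance < shade_thresh] *= shade_gain      # lift the shaded region
    out[luminance > sun_thresh] *= sun_gain          # tame the sunlit region
    return np.clip(out, 0, 255).astype(np.uint8)

stadium = np.random.randint(0, 256, (120, 160, 3), dtype=np.uint8)
balanced = compensate_sun_and_shade(stadium)
```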
Figs. 22A and 22B show a state in which an image containing, for example, a bird is displayed so that the bird is highlighted.
When a bird has been detected in the image, highlighting the bird can prevent the user from losing sight of the bird as the subject.
As the process of highlighting a part of an image, the brightness of the portion of interest can be increased. Alternatively, the brightness of the portions other than the portion of interest can be decreased. The portion of interest can be displayed in color while the portions other than the portion of interest are displayed in monochrome. Alternatively, the portion of interest can be highlighted with a character image such as a frame, a cursor, an indicator mark, or the like.
The display images described above are merely exemplary. In other words, various display modes can be achieved by controlling the processes and operations of the image capturing section 103, the captured image signal processing section 115, the display image processing section 112, and the display driving section 113.
For example, many types of display modes are envisaged, such as a telephoto display mode; a wide-angle display mode; zoom-in and zoom-out display modes that vary the range from telephoto to wide angle; an image enlargement display mode; an image reduction display mode; a variable frame rate display mode (capturing images at a high frame rate or at a low frame rate); a high-brightness display mode; a low-brightness display mode; a variable-contrast display mode; a variable-sharpness display mode; a captured image display mode with increased sensitivity; a captured image display mode with increased infrared sensitivity; a captured image display mode with increased ultraviolet sensitivity; an image display mode in which a predetermined wavelength band is cut off; image effect display modes (such as a mosaic image, a brightness-inverted image, a soft-focus image, partial highlighting of the screen, an image with a varied color atmosphere, and so forth); a slow display mode; a frame-by-frame display mode; split display modes that combine these display modes; a split display mode that combines the through state and a captured image; a flicker display mode; a still image display mode that holds one frame of the captured image; and so forth.
[4. Detection of external information]
As described above, as the structure that obtains external information, the image capturing and displaying apparatus 101 comprises the ambient environment sensor 119, the captured target sensor 120, the GPS receiving section 121, the date and time counting section 122, the image analyzing section 128, and the communication section 126.
Examples of the ambient environment sensor 119 may include a luminance sensor, a temperature sensor, a humidity sensor, a barometric pressure sensor, and so forth.
The luminance sensor detects information about the brightness around the image capturing and displaying apparatus 101.
The temperature sensor, the humidity sensor, and the barometric pressure sensor obtain information with which the temperature, humidity, atmospheric pressure, and weather can be determined.
Because the ambient environment sensor 119 can determine the surrounding brightness, the outdoor weather situation, and so forth, the system controller 110 can control the image capturing operation and the display operation in accordance with the surrounding brightness and the weather situation determined as external information.
The captured target sensor 120 detects information about the captured target. The captured target sensor 120 may be, for example, a distance measuring sensor or a pyroelectric sensor. The captured target sensor 120 can obtain the distance to the captured target, or information with which the captured target can be determined.
When the captured target sensor 120 has detected the distance to the captured target, the system controller 110 can control the image capturing operation and the display operation in accordance with the detected distance. When the captured target sensor 120 has detected that the captured target is a living body such as a person, the system controller 110 can control the image capturing operation and the display operation in accordance with the captured target.
The GPS receiving section 121 obtains latitude and longitude information of the current position. When the latitude and longitude of the current position have been detected, information about the current position (information about the vicinity of the current position) can be obtained by referring to a map database or the like. When the image capturing and displaying apparatus 101 includes a recording medium with a relatively large recording capacity that the system controller 110 can access, such as an HDD (hard disk drive) or a flash memory (not shown in Fig. 20), and the recording medium stores a map database, information about the current position can be obtained from it.
Even if the image capturing and displaying apparatus 101 does not have a built-in map database, the image capturing and displaying apparatus 101 can cause the communication section 126 to access, for example, a network server or a device with a built-in map database, transmit the latitude and longitude information of the current position to the network server or device, request the network server or device to transmit information about the current position to the communication section 126, and receive the information.
Examples of the information about the current position include place names, building names, facility names, shop names, station names, and so forth.
In addition, examples of the information about the current position include information that represents the type of a building, such as a parking lot, a theme park, a concert hall, a theater, a movie theater, a sports facility, and so forth.
In addition, examples of the information about the current position include the types and names of natural objects, such as a seashore, a river, a mountain, a mountain top, a forest, a lake, a plain, and so forth.
Examples of more detailed position information include an area in a theme park, the stands of a baseball stadium or a soccer stadium, a seat area in a concert hall, and so forth.
When information about the current position has been obtained, the system controller 110 can control the image capturing operation and the display operation in accordance with the current position, the geographic conditions near the current position, the facilities there, and so forth.
The date and time counting section 122 counts, for example, the years, months, days, hours, minutes, and seconds. The system controller 110 can recognize the current time, day or night, the month, the season, and so forth in accordance with the values counted by the date and time counting section 122. As a result, the system controller 110 can control the image capturing operation and the display operation in accordance with day or night (the time), and control these operations in accordance with the current season.
The image analyzing section 128 can detect various types of information about the captured target from the captured image.
First of all, the image analyzing section 128 can identify a person, an animal, a natural object, a building, a machine, and so forth as the type of the captured target in the captured image. As for animals, the image analyzing section 128 can recognize a situation in which a bird has been captured as the subject, a situation in which a cat has been captured as the subject, and so forth. As for natural objects, the image analyzing section 128 can recognize a sea, a mountain, a tree, a river, a lake, the sky, the sun, the moon, and so forth in the captured image. As for buildings, the image analyzing section 128 can identify a house, a building, a stadium, and so forth in the captured image. As for devices, the image analyzing section 128 can identify a personal computer, an AV (audio visual) device, a mobile phone, a PDA, an IC card, a two-dimensional bar code, and so forth as the captured target in the captured image.
When the shape features of various types of captured targets have been registered in the image analyzing section 128 in advance, the subject contained in the captured image can be determined in accordance with the registered features.
In addition, the image analyzing section 128 can detect a motion of the subject, for example a quick motion of the subject, from the captured image by, for example, detecting the differences between adjacent frames of the image. For example, the image analyzing section 128 can detect a situation in which a quickly moving subject, such as a player in a sports match or a moving car, is being captured.
In addition, the image analyzing section 128 can determine the surrounding situation by analyzing the image. For example, the image analyzing section 128 can determine the brightness, which varies with day, night, and the weather. In addition, the image analyzing section 128 can recognize the intensity of rain and so forth.
In addition, the image analyzing section 128 can determine, by analyzing the image, a situation in which, for example, a book or a newspaper is being captured. For example, the image analyzing section 128 can determine such a situation by recognizing characters in the image or by recognizing the shape of a book or a newspaper.
When the image analyzing section 128 has recognized characters, the system controller 110 can supply the recognized characters as text data to the speech synthesis section 127.
In addition, when a person is the subject, the image analyzing section 128 can identify the person from his or her face by analyzing the image. As is well known, a person's face can be registered as personal feature data, which are relative position information of the structural elements of the face. For example, the ratio (Ed/EN) of the distance Ed between the centers of the eyes to the distance EN from the center of the eyes to the nose, and the ratio (Ed/EM) of the distance Ed between the centers of the eyes to the distance EM from the center of the eyes to the mouth, are unique to each person. In addition, they are not affected by wearing objects such as a hair style, glasses, and so forth. Moreover, it is well known that these ratios do not change with age.
Therefore, when the captured image contains a person's face, the image analyzing section 128 can detect the personal feature data described above by analyzing the image.
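Computing the feature data from assumed 2-D landmark points might look like the following sketch; the ratios Ed/EN and Ed/EM come from the description above, while the landmark coordinates (here plain tuples) are placeholders for the output of a real face detector.

```python
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def personal_feature_data(left_eye, right_eye, nose, mouth):
    ed = distance(left_eye, right_eye)               # distance between the eyes
    eye_center = ((left_eye[0] + right_eye[0]) / 2,
                  (left_eye[1] + right_eye[1]) / 2)
    en = distance(eye_center, nose)                  # eye center to nose
    em = distance(eye_center, mouth)                 # eye center to mouth
    return ed / en, ed / em                          # (Ed/EN, Ed/EM)

ratios = personal_feature_data((100, 120), (160, 120), (130, 160), (130, 200))
```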
When the image analyzing section 128 has detected personal feature data from the captured image, and the system controller 110 can access a recording medium on which a people database has been recorded, for example an HDD (hard disk drive) or a flash memory, the image analyzing section 128 can obtain personal information about the subject from the people database. Even if the image capturing and displaying apparatus 101 does not have a built-in people database, the system controller 110 can cause the communication section 126 to access, for example, a network server or a device with a built-in people database, request the server or device to transmit the information to the communication section 126, and receive the particular personal information from it.
When the user has registered in the people database personal information, such as the name and title of each person the user has met, together with the personal feature data, then when the user meets a specific person (whose image has been captured), the image capturing and displaying apparatus 101 can retrieve information about this person from the people database.
When a people database in which information about famous people has been registered together with their personal feature data has been prepared, then when the user meets a certain famous person, the image capturing and displaying apparatus 101 can retrieve information about this person from the people database.
The communication section 126 can obtain various types of external information.
For example, as described above, the communication section 126 can obtain information retrieved by an external device in accordance with the latitude and longitude information, the personal feature data, and so forth transmitted from the image capturing and displaying apparatus 101.
In addition, the communication section 126 can obtain meteorological information, such as weather information, temperature information, and humidity information, from an external device.
In addition, the communication section 126 can obtain facility use information, photography prohibition/permission information, facility guide information, and so forth from an external device.
In addition, the communication section 126 can obtain identification information of an external device. Examples of the identification information include the device type and the device ID of a device that is identified as a network device under a predetermined communication protocol.
In addition, the communication section 126 can obtain image data stored in an external device, image data being reproduced or displayed by an external device, and image data being received by an external device.
In the foregoing, the information that each of the ambient environment sensor 119, the captured target sensor 120, the GPS receiving section 121, the date and time counting section 122, the image analyzing section 128, and the communication section 126 can obtain has been exemplified. Alternatively, several of these sections may detect and determine particular external information together.
For example, by combining the humidity information and so forth obtained by the ambient environment sensor 119 with the weather information received by the communication section 126, the current weather can be determined more accurately.
In addition, the current position and the situation of the captured target can be determined more accurately than with the structures described above, in accordance with both the information about the current position obtained by the GPS receiving section 121 and the communication section 126 and the information obtained by the image analyzing section 128.
[5. Exemplary operations]
In the image capturing and displaying apparatus 101 of this embodiment, the system controller 110 determines the surrounding situation, the situation of the captured target, and so forth in accordance with the external information obtained by the ambient environment sensor 119, the captured target sensor 120, the GPS receiving section 121, the date and time counting section 122, the image analyzing section 128, and the communication section 126, and controls the image capturing operation and the display operation in accordance with the determined results, so as to assist and extend the user's vision.
Next, various exemplary operations performed under the control of the system controller 110 will be described.
Fig. 10 shows a control process of the operation control function 110b of the system controller 110.
In step F101, the system controller 110 controls the display control section 114 to cause the display section 102 to become the through state. When the image capturing and displaying apparatus 101 is initially turned on, the flow advances to step F101, in which the system controller 110 controls the display section 102 to become the through state.
While the display section 102 is in the through state, in step F102, the system controller 110 determines whether a monitor display state start trigger has occurred. A monitor display start switch (not shown) may be arranged in the image capturing and displaying apparatus 101. When the user has operated this switch, the system controller 110 can determine that a monitor display state start trigger has occurred. Alternatively, as will be described later, the system controller 110 can determine that a monitor display state start trigger has occurred in accordance with the external information.
When the determined result represents that a monitor display state start trigger has occurred, the flow advances to step F103. In step F103, the system controller 110 performs the monitor display start control. In other words, the system controller 110 commands the image capturing control section 111 to cause the image capturing section 103 and the captured image signal processing section 115 to perform the normal image capturing operation. In addition, the system controller 110 commands the display control section 114 to cause the display image processing section 112 and the display driving section 113 to make the display section 102 display the captured image signal as a normally captured image.
In this process, the through state shown in Fig. 3A is switched to the monitor display state of the normally captured image shown in Fig. 3B.
While the display section 102 is displaying the normally captured image (the same scene the user would see in the through state), the flow advances to step F104. In step F104, the system controller 110 monitors whether an image control trigger has occurred. In step F105, the system controller 110 monitors whether a monitor display state completion trigger has occurred.
When the system controller 110 determines that the monitor display mode needs to be changed in accordance with the external situation (the surrounding environment, the subject, the current date and time, the current position, and so forth) determined by the external situation determination function 110a, the system controller 110 determines in step F104 that an image control trigger has occurred.
When the user has performed a monitor display completion operation with a predetermined switch, the system controller 110 determines in step F105 that a monitor display state completion trigger has occurred. Like the trigger in step F102, the system controller 110 can also determine that a monitor display state completion trigger has occurred in accordance with the detected external information.
When the determined result represents that an image control trigger has occurred, the flow advances from step F104 to step F106. In step F106, the system controller 110 controls the display operation for the captured image. In other words, the system controller 110 commands the image capturing control section 111 and the display control section 114 to cause the display section 102 to display the image in the display mode corresponding to the external situation at that point.
After the system controller 110 has controlled the display mode in step F106, the flow advances to step F104 or step F105, in which the system controller 110 monitors whether a trigger has occurred.
When the determined result represents that a monitor display state completion trigger has occurred, the flow returns from step F105 to step F101. In step F101, the system controller 110 commands the image capturing control section 111 to complete the image capturing operation and commands the display control section 114 to cause the display section 102 to become the through state.
While the user is wearing the image capturing and displaying apparatus 101 and its power is on, the operation control function 110b of the system controller 110 performs the control process shown in Fig. 10.
In this process, display mode control is performed in accordance with the determined result of whether an image control trigger has occurred in step F104. Specific examples of the trigger determination and the control will be described later with reference to Fig. 23 through Figs. 30A and 30B.
Figs. 23 through 30A and 30B show exemplary processes of the external situation determination function 110a of the system controller 110. It is assumed that these processes are performed in parallel with the process of the operation control function 110b shown in Fig. 10. This parallel processing is performed so that, for example, while the system controller 110 is performing the process shown in Fig. 10, the detection processes shown in Figs. 23 through 30A and 30B are regularly performed as interrupt processes. The programs of these processes may be built into the program that performs the process shown in Fig. 10. Alternatively, they may be separate programs that are regularly called. In other words, the structure of these programs is not limited to a particular one.
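The control loop of Fig. 10 can be rendered schematically as a small state machine, as in the following sketch; the event names and the action log are invented for illustration, and only the step structure (F101 through F106) comes from the description.

```python
def operation_control(events):
    """Walk the F101-F106 loop over a scripted sequence of trigger events."""
    actions = ["F101: through state"]
    monitoring = False
    for ev in events:
        if not monitoring and ev == "monitor_start":          # F102
            actions.append("F103: start monitor display")
            monitoring = True
        elif monitoring and ev == "image_control":            # F104
            actions.append("F106: control display mode")
        elif monitoring and ev == "monitor_complete":         # F105
            actions.append("F101: end capture, through state")
            monitoring = False
    return actions

log = operation_control(["monitor_start", "image_control", "monitor_complete"])
```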
Figs. 23 through 30A and 30B show exemplary processes that determine whether an image control trigger has occurred in step F104 shown in Fig. 10. Fig. 23 shows an exemplary process that determines whether an image control trigger has occurred in accordance with the information supplied from the ambient environment sensor 119 or the image analyzing section 128.
In step F1201 shown in Fig. 23, the system controller 110 monitors one or both of the information detected by the ambient environment sensor 119 and the information obtained by the image analyzing section 128. In this example, the ambient environment sensor 119 is a luminance sensor, and the image analyzing section 128 analyzes the surrounding brightness in accordance with the captured image.
The system controller 110 determines whether the surroundings are dark or too bright in accordance with the information obtained from one or both of the ambient environment sensor 119 and the image analyzing section 128. For example, the detected brightness is quantified. When the detected brightness is x lux or less, the system controller 110 determines that the surroundings are dark. When the detected brightness is y lux or more, the system controller 110 determines that the surroundings are too bright.
When the determined result represents that the surroundings are dark, the flow advances from step F1202 to step F1204. In step F1204, the system controller 110 determines that an image control trigger has occurred.
In step F1205, the system controller 110 calculates adjustment values corresponding to the current brightness (darkness) of the surroundings. For example, the system controller 110 obtains adjustment values for the display brightness, contrast, sharpness, image capturing sensitivity, and so forth, so that the user can comfortably see the surroundings in the captured image.
When the processes of step F1204 and step F1205 have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In step F106, the system controller 110 commands the image capturing section 103 to adjust the image capturing sensitivity, and commands the captured image signal processing section 115 or the display image processing section 112 to adjust the brightness, contrast, sharpness, and so forth. When these sections have performed these processes, the quality of the image displayed on the display section 102 is adjusted. As a result, even if the surroundings are dark, the user can clearly see them in the image displayed on the display section 102. For example, the situation in which the surroundings are dark and the image shown in Fig. 6A is displayed on the display section 102 changes to one in which the user can clearly see the image.
When the determined result represents that the surroundings are dark, the system controller 110 can also control the illumination section 104 to provide illumination.
When the determined result represents that the surroundings are too bright, the flow advances from step F1203 to step F1206. In step F1206, the system controller 110 determines that an image control trigger has occurred.
In step F1207, the system controller 110 calculates adjustment values corresponding to the current surrounding brightness. The system controller 110 obtains adjustment values for the display brightness, contrast, sharpness, image capturing sensitivity, and so forth, so that the user can see the surroundings comfortably. In this case, because the surroundings are too bright, the user feels glare. Therefore, the system controller 110 obtains adjustment values with which the image capturing sensitivity and the display brightness are decreased.
When the processes of step F1206 and step F1207 have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, in step F106, the system controller 110 commands the image capturing section 103 to adjust the image capturing sensitivity, and commands the captured image signal processing section 115 or the display image processing section 112 to adjust the brightness, contrast, sharpness, and so forth. In these processes, the quality of the image displayed on the display section 102 is adjusted. As a result, even if the surroundings are too bright, the user can see them in the image displayed on the display section 102 without feeling glare.
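A compact sketch of this determination follows; the concrete lux values standing in for the x and y thresholds, and the returned adjustment values, are assumptions, since the description leaves them unspecified.

```python
DARK_LUX = 50      # "x lux or less" -> dark (assumed value)
BRIGHT_LUX = 5000  # "y lux or more" -> too bright (assumed value)

def brightness_trigger(lux):
    """Return (trigger_occurred, adjustments) for the measured brightness."""
    if lux <= DARK_LUX:                                   # F1202 -> F1204/F1205
        return True, {"sensitivity": +2, "brightness": +2,
                      "contrast": +1, "sharpness": +1}
    if lux >= BRIGHT_LUX:                                 # F1203 -> F1206/F1207
        return True, {"sensitivity": -2, "brightness": -2}
    return False, {}

triggered, adjustments = brightness_trigger(12)   # a dim room -> trigger occurs
```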
Fig. 24A shows an exemplary process that determines whether an image control trigger has occurred in accordance with the information detected by the ambient environment sensor 119 or the information received by the communication section 126.
In step F1301 shown in Fig. 24A, the system controller 110 monitors one or both of the information detected by the ambient environment sensor 119 and the information received by the communication section 126. Examples of the ambient environment sensor 119 here are a temperature sensor, a humidity sensor, and a barometric pressure sensor. The communication section 126 successively receives weather information from, for example, a network server or the like.
The system controller 110 can determine the surrounding weather situation in accordance with the atmospheric pressure, humidity, and temperature detected by, for example, the ambient environment sensor 119. In addition, the system controller 110 can determine the weather situation in accordance with the weather information received by the communication section 126. To receive the weather situation from the network server, the system controller 110 successively transmits the current position information obtained by the GPS receiving section 121 to the network server and receives, from the network server, the weather information of the area corresponding to the current position.
Although the system controller 110 can determine the surrounding weather situation with the information detected by the ambient environment sensor 119 or with the information received by the communication section 126, it can determine the weather situation more accurately with both types of information than with either alone.
The system controller 110 determines whether the image needs to be adjusted in accordance with the weather situation, such as a clear sky, a cloudy sky, rain, a thunderstorm, a typhoon, snowfall, and so forth, or with a change of the weather situation, such as rain starting, rain stopping, or the sky becoming cloudy. When the determined result represents that the image needs to be adjusted, the flow advances from step F1302 to step F1303. In step F1303, the system controller 110 determines that an image control trigger has occurred. In step F1304, the system controller 110 calculates adjustment values corresponding to the current weather. For example, the system controller 110 obtains adjustment values for the display brightness, contrast, sharpness, image capturing sensitivity, and so forth, so that the user can clearly see the surroundings in the captured image.
When the processes of step F1303 and step F1304 have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In step F106, the system controller 110 commands the image capturing section 103 to adjust the image capturing sensitivity, and commands the captured image signal processing section 115 or the display image processing section 112 to adjust the brightness, contrast, sharpness, and so forth. In these processes, the quality of the image displayed on the display section 102 is adjusted in accordance with the weather situation, so that the user can clearly see the surroundings in the image displayed on the display section 102.
Alternatively, the system controller 110 can control the illumination section 104 to provide illumination in accordance with the weather situation.
In this example, the system controller 110 determines the weather situation in accordance with the information detected by the ambient environment sensor 119 or the information received by the communication section 126. When the image analyzing section 128 recognizes an image of rain, the system controller 110 can accurately detect the start of rain, the end of rain, the occurrence of lightning, and so forth.
Fig. 24B shows an exemplary process that determines whether an image control trigger that causes the night vision function to be operated has occurred, in accordance with the information detected by the ambient environment sensor 119.
In step F1310 shown in Fig. 24B, the system controller 110 monitors the information detected by the ambient environment sensor 119. In this example, the ambient environment sensor 119 is a luminance sensor.
The system controller 110 determines whether the surroundings are dark in accordance with the information detected by the ambient environment sensor 119. The system controller 110 quantifies the detected brightness. When the detected brightness is x lux or less, the system controller 110 determines that the surroundings are dark.
When the determined result represents that the surroundings are dark, the flow advances from step F1311 to step F1313. In step F1313, the system controller 110 determines that an image control trigger that causes the night vision function to be turned on has occurred.
When the process of step F1313 has been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, in step F106, the system controller 110 controls the image capturing and displaying apparatus 101 to turn on the night vision function. In other words, the system controller 110 commands the image capturing control section 111 to increase the infrared image capturing sensitivity of the image capturing section 103.
In this process, the night vision function is executed. In a situation such as that shown in Fig. 7A, in which the surroundings are so dark that the user cannot see them, the captured image with increased infrared sensitivity shown in Fig. 7B is displayed on the display section 102. As a result, the user can see the surrounding situation in a dark place.
When the determined result represents that the surroundings are not dark, the flow advances from step F1311 to step F1312. In this case, in step F1312, when the night vision function is on (the image capturing operation with increased infrared sensitivity is being performed), the flow advances to step F1314. In step F1314, the system controller 110 determines that an image control trigger that causes the night vision function to be turned off has occurred. When the process of step F1314 has been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, in step F106, the system controller 110 controls the image capturing and displaying apparatus 101 to turn off the night vision function. In other words, the system controller 110 commands the image capturing control section 111 to restore the infrared image capturing sensitivity to the normal image capturing sensitivity and to perform the normal image capturing operation.
In the process shown in Fig. 24B, when the user is in a dark room or the like, the night vision function is automatically turned on so that he or she can clearly see the surroundings there. In contrast, when the user has left the dark room or the like, the night vision function is automatically turned off. In other words, a process that enhances the user's visual capability in accordance with the surrounding situation is achieved.
Alternatively, the image analyzing section 128 can detect whether the surroundings are dark by analyzing the captured image. When the overall brightness of the captured image has decreased considerably and it is thereby determined that the surroundings are dark, the system controller 110 can determine that an image control trigger that causes the night vision function to be turned on has occurred.
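The on/off decision of Fig. 24B can be sketched as follows, with an assumed value standing in for the x lux threshold; the toggle logic mirrors steps F1310 through F1314.

```python
DARK_LUX = 50  # assumed "x lux" threshold

def night_vision_trigger(lux, night_vision_on):
    """Return the new night-vision state, or None when no trigger occurs."""
    if lux <= DARK_LUX and not night_vision_on:
        return True          # F1313: trigger to turn night vision on
    if lux > DARK_LUX and night_vision_on:
        return False         # F1314: trigger to restore normal sensitivity
    return None              # no image control trigger

state = False
for reading in [300, 30, 20, 400]:        # walking into and out of a dark room
    change = night_vision_trigger(reading, state)
    if change is not None:
        state = change                    # on at 30 lux, off again at 400 lux
```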
Fig. 25 shows an exemplary process that determines whether an image control trigger has occurred in accordance with the information detected by the captured target sensor 120 or the information obtained by the image analyzing section 128.
In step F1401 shown in Fig. 25, the system controller 110 monitors one or both of the information detected by the captured target sensor 120 and the information obtained by the image analyzing section 128. In this example, the captured target sensor 120 is a distance measuring sensor, and the image analyzing section 128 calculates the distance to the subject by analyzing the captured image.
In other words, the system controller 110 determines, in accordance with the information detected by the captured target sensor 120 and/or the information obtained by the image analyzing section 128, whether the target the user is watching (the captured target) is far away or near, for example at his or her hand.
When the user is watching a distant scene, or watching a match from a seat far from the pitch in a stadium, and the system controller 110 has thus determined that the captured target is far away, the flow advances from step F1402 to step F1404. In step F1404, the system controller 110 determines that an image control trigger that causes the display mode to be switched to the telephoto zoom display mode has occurred. Thereafter, the flow advances to step F1405. In step F1405, the system controller 110 calculates an appropriate zoom magnification corresponding to the distance to the captured target.
When the user is watching a nearby scene, or a newspaper at his or her hand, and the system controller 110 has thus determined that the captured target is near, the flow advances from step F1403 to step F1406. In step F1406, the system controller 110 determines that an image control trigger that causes the display mode to be switched to the enlargement (zoom-in) display mode has occurred. Thereafter, the flow advances to step F1407. In step F1407, the system controller 110 calculates an appropriate zoom magnification corresponding to the distance to the captured target.
When the processes of steps F1404 and F1405 or the processes of steps F1406 and F1407 have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In step F106, the system controller 110 commands the image capturing control section 111 to perform the zoom operation with the calculated magnification.
As a result, the display section 102 displays the telephoto image shown in Fig. 3C or the wide-angle zoom image shown in Fig. 4B, corresponding to the scene the user is watching.
In this example, the system controller 110 controls the telephoto/wide-angle zoom operation. Alternatively, the system controller 110 can control the image capturing and displaying apparatus 101 to change the focal position or to enlarge/reduce the image in accordance with the distance to the captured target.
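A sketch of this branch appears below; the distance bounds and the magnification formulas are invented for illustration, as the description only requires an appropriate zoom magnification corresponding to the distance.

```python
FAR_M = 20.0    # assumed: beyond this, treat the target as distant
NEAR_M = 1.0    # assumed: within this, treat the target as at hand

def zoom_trigger(distance_m):
    """Return (mode, magnification); mode is None when no trigger occurs."""
    if distance_m >= FAR_M:                       # F1402 -> F1404/F1405
        magnification = min(10.0, distance_m / FAR_M * 2.0)
        return "telephoto", magnification
    if distance_m <= NEAR_M:                      # F1403 -> F1406/F1407
        magnification = min(5.0, NEAR_M / max(distance_m, 0.1))
        return "magnify", magnification
    return None, 1.0

mode, mag = zoom_trigger(60.0)   # a seat far from the pitch -> telephoto zoom
```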
Fig. 26 shows an exemplary process that determines whether an image control trigger has occurred in accordance with the information detected by the captured target sensor 120 and the information obtained by the image analyzing section 128. In particular, this exemplary process determines whether the captured target contains characters, as in a newspaper, a book, or the like.
In step F1501 shown in Fig. 26, the system controller 110 monitors the information detected by the captured target sensor 120 and the information obtained by the image analyzing section 128. In this example, the captured target sensor 120 is a distance measuring sensor, and the image analyzing section 128 detects whether the subject contains characters by analyzing the captured image.
The system controller 110 determines, in accordance with the information detected by the captured target sensor 120 and/or the information obtained by the image analyzing section 128, whether the target the user is watching (the captured target) is near and contains characters, as in a newspaper, a book, or the like. In other words, the system controller 110 determines whether the user is reading a newspaper in his or her hand.
When the determined result represents that the captured target is near and contains characters, the flow advances from step F1502 to step F1503. In step F1503, the system controller 110 determines that an image control trigger has occurred.
In step F1504, the system controller 110 calculates adjustment values with which the user can comfortably read the newspaper or book. For example, the system controller 110 obtains adjustment values for the display brightness, contrast, sharpness, image capturing sensitivity, and so forth, so that the user can comfortably read the newspaper or the like.
When the processes of step F1503 and step F1504 have been performed, the flow advances from step F104 shown in Fig. 10 to step F106. In this case, in step F106, the system controller 110 commands the image capturing section 103 to adjust the image capturing sensitivity, and commands the captured image signal processing section 115 or the display image processing section 112 to adjust the brightness, contrast, sharpness, and so forth. In this process, the quality of the image displayed on the display section 102 is adjusted, so that the display section 102 displays an image the user can read clearly, as shown in Fig. 6B.
In addition to the character detection, the surrounding brightness can be detected, and the detected brightness may be reflected in the calculation of the adjustment values.
In addition, when the image analyzing section 128 analyzes the image, it can also recognize the shape of a newspaper or a book as a condition for the flow to advance to step F1503.
When the system controller 110 has determined that the captured target is a newspaper or the like, the system controller 110 can also control the illumination section 104 to provide illumination.
Instead of adjusting the image quality, the display image processing section 112 can perform an enlargement process and cause the display section 102 to display the enlarged image shown in Fig. 4B, so that the user can clearly read the characters.
When the image contains characters, the image analyzing section 128 can recognize the characters and supply them as text data to the system controller 110. In this case, the system controller 110 causes the speech synthesis section 127 to perform a speech synthesis process corresponding to the text data detected from the image.
As a result, the speech synthesis section 127 generates an audio signal of a voice reading the characters contained in the captured image, and the system controller 110 causes the audio output section 105 to output this reading voice.
As a result, when the user watches a newspaper or the like, he or she can hear it being read aloud.
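The reading-aid trigger of Fig. 26 can be sketched as follows; the reading-distance threshold, the adjustment values, and the speech field are assumptions, and a real implementation would hand the recognized text to the speech synthesis section 127 rather than return a string.

```python
NEAR_M = 0.6  # assumed reading-distance threshold

def reading_trigger(distance_m, contains_characters, text=None):
    """Return the actions for the reading aid, or None when no trigger occurs."""
    if distance_m > NEAR_M or not contains_characters:
        return None                                    # F1502 fails: no trigger
    actions = {"brightness": +1, "contrast": +1, "sharpness": +2}  # F1504
    if text:                                           # optional read-aloud path
        actions["speech"] = f"read aloud: {text!r}"
    return actions

plan = reading_trigger(0.4, True, text="Morning edition headline")
```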
Fig. 27 shows an exemplary process that determines whether an image control trigger has occurred in accordance with the current date and time counted by the date and time counting section 122.
In step F1601 shown in Fig. 27, the system controller 110 checks the current date and time counted by the date and time counting section 122. The system controller 110 determines the time zone of the current time, for example early morning, morning, daytime, dusk, or night. For example, the time zone from 4 o'clock to 7 o'clock is early morning, the time zone from 7 o'clock to 9 o'clock is morning, the time zone from 9 o'clock to 17 o'clock is daytime, the time zone from 17 o'clock to 19 o'clock is dusk, and the time zone from 19 o'clock to 4 o'clock is night.
It is preferable to change these time zones in accordance with the determined month and day. For example, because the sunrise and sunset times vary with the month and day, the time zones are changed accordingly. In summer, for example, the early-morning time zone is from 4 o'clock to 7 o'clock. In winter, for example, the early-morning time zone is from 6 o'clock to 8 o'clock.
When the determined result of step F1601 represents that the time zone has changed, the flow advances from step F1602 to step F1603 and beyond.
When the time zone has changed to early morning, the flow advances from step F1603 to step F1607. In step F1607, the system controller 110 determines that an image control trigger that causes the image capturing operation/display operation for early morning to be performed has occurred.
When the time zone has changed to morning, the flow advances from step F1604 to step F1608. In step F1608, the system controller 110 determines that an image control trigger that causes the image capturing operation/display operation for morning to be performed has occurred.
When the time zone has changed to daytime, the flow advances from step F1605 to step F1609. In step F1609, the system controller 110 determines that an image control trigger that causes the image capturing operation/display operation for daytime to be performed has occurred.
When the time zone has changed to dusk, the flow advances from step F1606 to step F1610. In step F1610, the system controller 110 determines that an image control trigger that causes the image capturing operation/display operation for dusk to be performed has occurred.
When the time zone has changed to night, the flow advances to step F1611. In step F1611, the system controller 110 determines that an image control trigger that causes the image capturing operation/display operation for night to be performed has occurred.
When the determined result at step F1607, F1608, F1609, F1610, or F1611 represents that an image control trigger has occurred, the flow advances from step F104 shown in Fig. 10 to step F106. In step F106, the system controller 110 commands the image capturing control section 111 and the display control section 114 to perform the image capturing operation/display operation corresponding to the time zone. For example, the system controller 110 commands them to adjust the image capturing sensitivity, brightness, contrast, sharpness, and so forth. Alternatively, the system controller 110 can command the image capturing control section 111 and the display control section 114 to perform an image effect operation such as a soft-focus display operation.
In this process, an image corresponding to the time zone is provided to the user. For example, in the early morning a soft image is provided; in the morning a sharp image with strong contrast is provided; at dusk a sepia-toned image is provided; and at night a subdued image is provided. As a result, images matching the user's mood in each time zone can be provided.
Of course, the image quality can also be adjusted in accordance with the change of brightness in each time zone so as to improve visibility.
In addition to the time zone, the image quality can be adjusted in accordance with the weather situation and with whether the user is indoors or outdoors.
Alternatively, the image quality can be adjusted in accordance with the season determined from the date and time information, rather than with the time zone. In summer, for example, the blue components of the image are emphasized; in autumn, the red components; in winter, the white components; and in spring, the green/pink components. As a result, images with a seasonal feel can be provided to the user.
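The time-zone classification of Fig. 27, including the season-dependent shift of the early-morning band, can be sketched as follows; the month-to-season grouping and the boundaries outside summer and winter are assumptions.

```python
def time_zone(hour, month):
    """Classify an hour into the Fig. 27 time zones, shifted by season."""
    if month in (6, 7, 8):            # summer: early morning 4-7 (from the text)
        early = range(4, 7)
    elif month in (12, 1, 2):         # winter: early morning 6-8 (from the text)
        early = range(6, 8)
    else:                             # assumed default for spring/autumn
        early = range(5, 7)
    if hour in early:
        return "early morning"        # -> F1607
    if hour < early.start:
        return "night"                # the 19:00-4:00 band wraps past midnight
    if hour < 9:
        return "morning"              # -> F1608
    if hour < 17:
        return "daytime"              # -> F1609
    if hour < 19:
        return "dusk"                 # -> F1610
    return "night"                    # -> F1611

zone = time_zone(18, 10)   # an October evening -> "dusk"
```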
Fig. 28A shows an exemplary process that determines whether an image control trigger has occurred in accordance with the information received by the GPS receiving section 121 and the communication section 126.
In step F1701 shown in Fig. 28A, the system controller 110 causes the communication section 126 to transmit the latitude and longitude information of the current position obtained by the GPS receiving section 121 to a network server or a device with a built-in map database, causes the server or device to retrieve information about the current position from the database, and causes the communication section 126 to receive that information. When the image capturing and displaying apparatus 101 has a built-in map database, the system controller 110 can retrieve information about the current position from the built-in map database in accordance with the latitude and longitude information received from the GPS receiving section 121.
The system controller 110 determines, in accordance with the obtained current position information, whether the user is at a place where predetermined image control should be performed. When the determined result represents that the current position is such a place, the flow advances from step F1702 to step F1703. In step F1703, the system controller 110 determines that an image control trigger that causes the predetermined image control to be performed has occurred.
When the determined result represents that an image control trigger has occurred, the flow advances from step F104 shown in Fig. 10 to step F106. In step F106, the system controller 110 commands the image capturing control section 111 and the display control section 114 to perform the predetermined image control.
In this case, examples of the image control are as follows.
When the detected result represents that the current position is a sports facility, a racing circuit, or the like, because the target the user watches (the captured target) is a quickly moving person, car, or the like, the system controller 110 commands the image capturing control section 111 to increase the image capturing frame rate so that the quickly moving subject can be displayed properly.
When the current position is a concert hall, a music hall, an entertainment hall, a sports facility, or the like, the system controller 110 can command the image capturing control section 111 to perform a telephoto image capturing operation corresponding to the distance to the captured target, such as a stage. When the distance to the captured target such as a stage can be determined from the current position information, the telephoto magnification can be set in accordance with that distance. The distance to the captured target can also be detected by the captured target sensor 120 (a distance measuring sensor), and the telephoto magnification can be set in accordance with the detected distance. Instead of the telephoto operation, the system controller 110 can command the captured image signal processing section 115 or the display image processing section 112 to perform an image enlargement process.
When the current position is a seashore or a mountain, the system controller 110 commands the image capturing control section 111 to perform the image capturing operation with increased ultraviolet sensitivity, causes the display section 102 to display the image shown in Fig. 8B, and thereby lets the user recognize the amount of ultraviolet radiation.
In addition, a character image or text corresponding to the obtained current position information, such as the place name or the name of the facility or shop being captured, can be superimposed on the display. Advertising information, facility guide information, and warning information about the surrounding area can also be displayed on the display section 102.
Fig. 28B shows an exemplary process that determines whether an image control trigger has occurred in accordance with the information received by the GPS receiving section 121 and the information received by the communication section 126. In particular, this exemplary process is performed while the image capturing operation with increased infrared sensitivity is being performed.
When the image capturing section 103 is performing the image capturing operation with increased infrared sensitivity, the flow advances from step F1710 shown in Fig. 28B to step F1711.
In step F1711, the system controller 110 causes the communication section 126 to transmit the latitude and longitude information of the current position obtained by the GPS receiving section 121 to a network server or a device with a built-in map database, causes the server or device to retrieve information about the current position from the map database, and causes the communication section 126 to receive that information. When the image capturing and displaying apparatus 101 has a built-in map database, the system controller 110 retrieves information about the current position from the built-in map database in accordance with the latitude and longitude information received by the GPS receiving section 121.
When the system controller 110 has obtained the information about the current position, the system controller 110 determines whether the image capturing operation with increased infrared sensitivity must be prohibited at the current position.
When the determined result represents that the image capturing operation with increased infrared sensitivity must be prohibited at the current position, the flow advances from step F1712 to step F1713. In step F1713, the system controller 110 determines that an image control trigger that causes the image capturing operation with increased infrared sensitivity to be completed has occurred.
When the determined result in step F1713 represents that an image control trigger has occurred, the flow advances from step F104 shown in Fig. 10 to step F106. In step F106, the system controller 110 commands the image capturing control section 111 to complete the image capturing operation with increased infrared sensitivity.
Because the image capturing operation with increased infrared sensitivity is prohibited in accordance with the position in this manner, improper use of a special image capturing function, such as the image capturing operation with increased infrared sensitivity, can be prevented.
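This safeguard can be sketched as follows; the stub dictionary keyed by rounded coordinates is an assumption standing in for the map database or network server lookup of step F1711.

```python
PROHIBITED_PLACES = {("35.66", "139.70"): "infrared prohibited"}  # assumed data

def infrared_prohibition_trigger(lat, lon, infrared_mode_on):
    """Return True when the infrared capturing operation must be ended."""
    if not infrared_mode_on:
        return False                                   # F1710: nothing to check
    key = (f"{lat:.2f}", f"{lon:.2f}")                 # F1711: position lookup
    if PROHIBITED_PLACES.get(key):                     # F1712
        return True                                    # F1713: end infrared mode
    return False

must_stop = infrared_prohibition_trigger(35.6581, 139.7017, True)   # -> True
```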
Figure 29 A shows and determines whether to have occurred the exemplary processes that image control triggers accordingly with the information that is obtained by image analyzing section 128.
In the step F 1801 shown in Figure 29 A, the information that system controller 110 monitoring are obtained by image analyzing section 128.The image that image analyzing section 128 is obtained by analysis detects subject and whether comprises target.
When analysis result represented that the image that obtains comprises target, flow process advanced to step F 1803 from step F 1802.In step F 1803, system controller 110 is determined to have occurred image control and is triggered.
When the definite result in step F 1803 represented to have occurred image control triggering, flow process advanced to step F 106 from step F shown in Figure 10 104.In step F 106, system controller 110 order image acquisition control sections 111 and display control section 114 are carried out predetermined picture control.
The example of image control is as described below.
If target is a bird, when then having detected bird in obtaining image, system controller 110 can be ordered display image processing section 112 to be depicted as bird in the image as Figure 22 A and Figure 22 B and partly be carried out and highlight operation.In this case, when the user watched wild bird, he or she can easily find and follows them.
If being kitten and user, target likes cat, then when kitten has entered he or she the visual field, because in display image, highlighted kitten, so he or she can easily recognize it.
If target is the people, then when in the image that is obtained, detecting he or she the time, system controller 110 can order display image processing section 112, obtain image signal processing section 115 or image acquisition section 103 highlights, enlarges or enlarged image in people's part.
If target is people, animal, building etc., display-object rendering context not only then.
As an alternative, when the man-hour that detects as target, can carry out the image processing of only from image, wiping the people.For example, can show the image of wherein from natural scene, covering people and the artificiality such as automobile.In this case, can be by carrying out the processing that interpolation processing is carried out the pixel portion of filling target with the surrounding pixel of wanting concealed target.
In addition, an image effect such as a mosaic display operation may be applied to a target such as a person.
The process shown in Fig. 29A is executed in correspondence with the information obtained by the image analysis section 128. As an alternative, if the target is a living body such as a person or an animal, the system controller 110 may determine that an image control trigger has occurred when the target is detected by a pyroelectric sensor serving as the object sensor 120.
Fig. 29B shows an exemplary process that determines whether an image control trigger has occurred in correspondence with information obtained by the image analysis section 128.
At step F1810 shown in Fig. 29B, the system controller 110 monitors information obtained by the image analysis section 128. The image analysis section 128 analyzes the captured image, for example by taking the difference between frames of the captured image, to detect whether the subject is moving quickly.
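A minimal sketch of this frame-difference test follows; the two thresholds are illustrative values, not figures from the embodiment.

```python
import numpy as np

def quick_motion(prev: np.ndarray, cur: np.ndarray,
                 pixel_thresh: int = 30, area_thresh: float = 0.05) -> bool:
    """Flag quick subject motion when the fraction of pixels whose
    brightness changed by more than pixel_thresh exceeds area_thresh."""
    diff = np.abs(cur.astype(np.int16) - prev.astype(np.int16))
    return float((diff > pixel_thresh).mean()) > area_thresh
```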
When the detected result denotes that the subject is moving quickly, the flow advances from step F1811 to step F1812. At step F1812, the system controller 110 determines that an image control trigger that causes a flicker display operation to be executed has occurred.
When the determined result at step F1812 denotes that the image control trigger has occurred, the flow advances from step F104 shown in Fig. 10 to step F106. At step F106, the system controller 110 commands the display control section 114 to execute image processing so that an image as shown in Fig. 9C is displayed.
In this process, the flicker display operation is executed so that, when the user watches a sporting event and a player is moving quickly, he or she can clearly see the player's motion.
In this example, when quick motion is detected, a trigger that causes the flicker display operation to be executed occurs. As an alternative, a trigger that causes the display mode to be switched to a high frame rate display operation may occur. As another alternative, when the captured subject includes quick motion, a trigger that causes the display mode to be switched to a zoom display mode or a highlight display mode may occur.
Fig. 30A shows an exemplary process that determines whether an image control trigger has occurred in correspondence with information obtained by the image analysis section 128. In this example, when an image of a person is captured, he or she is identified.
At step F1901 shown in Fig. 30A, the system controller 110 monitors information obtained by the image analysis section 128. The image analysis section 128 analyzes the captured image to determine whether the subject includes a person's face. When the subject includes a person's face, the image analysis section 128 generates personal feature data from the face image. As described above, the personal feature data are, for example, the ratio (Ed/EN) of the distance Ed between the centers of the eyes to the distance EN between the center of the eyes and the nose, and the ratio (Ed/EM) of the distance Ed between the centers of the eyes to the distance EM between the center of the eyes and the mouth.
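A minimal sketch of computing these two ratios, assuming 2-D landmark coordinates are already available (the embodiment does not say how the eyes, nose, and mouth are located):

```python
from math import hypot

def personal_features(eye_l, eye_r, nose, mouth):
    """Return (Ed/EN, Ed/EM) from (x, y) landmark coordinates."""
    cx = (eye_l[0] + eye_r[0]) / 2.0        # centre point between the eyes
    cy = (eye_l[1] + eye_r[1]) / 2.0
    ed = hypot(eye_r[0] - eye_l[0], eye_r[1] - eye_l[1])  # eye spacing Ed
    en = hypot(nose[0] - cx, nose[1] - cy)    # eye centre to nose, EN
    em = hypot(mouth[0] - cx, mouth[1] - cy)  # eye centre to mouth, EM
    return ed / en, ed / em
```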
When the personal feature data have been extracted, the flow advances from step F1902 to step F1903. At step F1903, the system controller 110 retrieves personal information corresponding to the personal feature data from a personal database.
For example, the system controller 110 causes the communication section 126 to transmit the personal feature data to a network server or a device with a built-in personal database, causes that server or device to retrieve the personal information from the personal database, and causes the communication section 126 to receive the retrieved result. When the image capturing and displaying apparatus 101 has a built-in personal database, the system controller 110 can retrieve the personal information corresponding to the personal feature data from the personal database itself.
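A minimal sketch of the lookup of steps F1903 and F1904, assuming a small built-in table and a fixed matching tolerance; the record fields, the sample entry, and the tolerance are all hypothetical.

```python
# Hypothetical built-in personal database keyed by the two feature ratios.
PERSONAL_DB = [
    {"name": "A. Example", "note": "met at trade show",
     "ed_en": 1.42, "ed_em": 0.96},
]

def lookup_person(ed_en: float, ed_em: float, tol: float = 0.05):
    """Return the first record whose stored ratios match the measured
    ones within tol, or None when no person is identified."""
    for rec in PERSONAL_DB:
        if abs(rec["ed_en"] - ed_en) <= tol and abs(rec["ed_em"] - ed_em) <= tol:
            return rec
    return None
```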
When the external device or the system controller 110 has retrieved the personal information of a particular person from the personal database, the flow advances from step F1904 to step F1905. At step F1905, the system controller 110 determines that an image control trigger that causes the personal information to be displayed has occurred.
When the determined result denotes that the image control trigger has occurred, the flow advances from step F104 shown in Fig. 10 to step F106. At step F106, the system controller 110 commands the display control section 114 to, for example, superimpose the retrieved personal information on the captured image.
When the user sees, in a walking crowd, a person whom he or she has met before or a celebrity, and that person or celebrity has been registered in the personal database, the display section 102 displays the information registered in the personal database (his or her name, title, the place where the user met him or her, and so forth) together with the image of the person. Therefore, the user can recognize the person accurately.
Fig. 30B shows an exemplary process that determines whether an image control trigger has occurred in correspondence with information obtained by the image analysis section 128. This process is executed when an image is difficult to see because of sunlight and shade, as shown in Fig. 21A.
At step F1910 shown in Fig. 30B, the system controller 110 monitors information obtained by the image analysis section 128. The image analysis section 128 analyzes the captured image to detect whether bright regions and dark regions have appeared in it because of the sunshine situation.
When the analyzed result denotes that there is a difference of sunlight and shade in the image, the flow advances from step F1911 to step F1912. At step F1912, the system controller 110 determines that an image control trigger has occurred.
When the determined result at step F1912 denotes that the image control trigger has occurred, the flow advances from step F104 shown in Fig. 10 to step F106. At step F106, the system controller 110 commands the image capturing control section 111 and the display control section 114 to execute image processing, or to partially change the image capturing sensitivity, so that the difference of sunlight and shade disappears. Therefore, as shown in Fig. 21B, an image that is less affected by sunlight and shade and is easy to see is provided to the user.
When an image contains a difference of bright and dark portions because of, for example, lighting in a room or facility rather than the sunshine situation, or when a part of the image is unclear, the system controller 110 may command the image capturing control section 111 and the display control section 114 to partially adjust the brightness, the image capturing sensitivity, the contrast, and so forth.
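A minimal sketch of one possible partial adjustment, scaling fixed-size blocks of a grayscale frame toward the global mean brightness; the block size and the approach itself are assumptions, not the embodiment's method.

```python
import numpy as np

def flatten_shading(gray: np.ndarray, block: int = 32) -> np.ndarray:
    """Reduce large sunlight/shade differences by scaling each block
    toward the global mean brightness."""
    out = gray.astype(np.float64).copy()
    target = out.mean()
    h, w = out.shape
    for y in range(0, h, block):
        for x in range(0, w, block):
            tile = out[y:y + block, x:x + block]   # view into out
            m = tile.mean()
            if m > 0:
                tile *= target / m   # lift dark tiles, damp bright ones
    return np.clip(out, 0, 255).astype(np.uint8)
```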
Exemplary processes that determine at step F104 of Fig. 10 whether an image control trigger has occurred have been described with reference to Fig. 23 to Fig. 30B. These exemplary processes can also be applied to the process that determines at step F102 shown in Fig. 10 whether a monitor display state start trigger has occurred, and to the process that determines at step F105 shown in Fig. 10 whether a monitor display state end trigger has occurred.
When the display state at step F101 shown in Fig. 10 is the through-state, similarly to the exemplary process shown in Fig. 23, if dark or too-bright surroundings are detected, it can be determined that a monitor display state start trigger has occurred, and the through-state can be switched to the monitor display state.
When the determined result of the exemplary process shown in Fig. 24A denotes that the captured image must be adjusted because of the weather situation, it can be determined that a monitor display state start trigger has occurred. In this case, the monitor display function can be executed under predetermined weather situations.
When the determined result of the exemplary process shown in Fig. 24B denotes that the surroundings are dark, it can be determined that a monitor display state start trigger has occurred. In this case, when the surroundings are dark, the monitor display function is executed automatically.
When the capture target is far away or close by, as in the exemplary process shown in Fig. 25, it can be determined that a monitor display state start trigger has occurred.
When an image that includes characters is detected near the user, as in the exemplary process shown in Fig. 26, it can be determined that a monitor display state start trigger has occurred.
Similarly to the exemplary process shown in Fig. 27, it can be determined that a monitor display state start trigger corresponding to the time zone has occurred.
When the current position is a predetermined area, as in the exemplary process shown in Fig. 28A, it can be determined that a monitor display state start trigger has occurred. In this case, the monitor display function can be executed in correspondence with the predetermined area or the type of facility.
When there is a predetermined target, as in the exemplary process shown in Fig. 29A, it can be determined that a monitor display state start trigger has occurred.
When quick motion is detected in the capture target, as in the exemplary process shown in Fig. 29B, it can be determined that a monitor display state start trigger has occurred.
When a predetermined person is detected, as in the exemplary process shown in Fig. 30A, it can be determined that a monitor display state start trigger has occurred.
When a distribution of bright and dark regions is present in the image, as in the exemplary process shown in Fig. 30B, it can be determined that a monitor display state start trigger has occurred.
In these exemplary processes, when it is determined that a monitor display state start trigger has occurred, the flow advances to step F103 shown in Fig. 10. Therefore, when the user wears the image capturing and displaying apparatus 101 in the through-state and performs no special operation, the image capturing and displaying apparatus 101 operates in the monitor display state corresponding to the situation, and the user can see images in the monitor display state corresponding to the situation.
Similarly, it can be determined whether a monitor display state end trigger has occurred.
In the exemplary process shown in Fig. 23, in which dark or too-bright surroundings are detected, when the surroundings become neither dark nor too bright, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state.
In the exemplary process shown in Fig. 24A, it is determined whether the captured image must be adjusted because of the weather situation. When the determined result denotes that the captured image need not be adjusted, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state.
Similarly to the exemplary process shown in Fig. 24B, it can be determined whether the surroundings are dark. When the surroundings are not dark, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state.
Similarly to the exemplary process shown in Fig. 25, it can be determined whether the capture target is far away or close by. When the determined result denotes that the capture target is neither far away nor close by, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state.
As in the exemplary process shown in Fig. 26, in which an image including characters is detected near the user, when such an image is no longer detected, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state.
As in the exemplary process shown in Fig. 27, it can be determined that a monitor display state end trigger has occurred in correspondence with the time zone, the month and/or date, the season, and so forth.
As in the exemplary process shown in Fig. 28A, when the current position is a predetermined position, it can be determined that a monitor display state end trigger has occurred. In this case, the image capturing function and the monitor display function can be stopped in correspondence with the predetermined position or the type of facility.
As in the exemplary process shown in Fig. 28B, when the image capturing operation with increased infrared sensitivity is stopped at step F1713, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state.
As in the exemplary process shown in Fig. 29A, when there is a predetermined subject, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state. In this case, for example, capturing and/or displaying the predetermined subject in the monitor display state is prohibited.
As an alternative, when the determined result denotes that the predetermined subject is no longer present, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state.
As in the exemplary process shown in Fig. 29B, in which quick motion of the capture target is detected, when quick motion is no longer detected, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state.
As in the exemplary process shown in Fig. 30A, when a predetermined person is detected, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state. In this case, capturing and/or displaying the predetermined person in the monitor display state is prohibited.
As an alternative, when it is determined that the predetermined person is not present in the image, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state.
As in the exemplary process shown in Fig. 30B, in which a distribution of bright and dark regions is detected in the image, when the bright and dark difference is no longer detected, it can be determined that a monitor display state end trigger has occurred, and the display state can be restored to the through-state.
In these exemplary processes, when it is determined that a monitor display state end trigger has occurred and the flow returns to step F101 shown in Fig. 10, the display state can automatically return to the through-state in a situation where the user's need for the monitor display state has decreased or disappeared, or in a situation where the monitor display function is prohibited.
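Viewed as a whole, steps F101 to F106 of Fig. 10 behave like a small state machine alternating between the through-state and the monitor display state. The sketch below models that loop with trigger predicates passed in as callables; it is an editor's illustration, not code from the embodiment. Each predicate stands in for one of the Fig. 23 to Fig. 30B checks described above.

```python
def run_display_loop(start_trigger, end_trigger, control_trigger,
                     apply_control, steps: int = 1000) -> str:
    """'through' corresponds to step F101; 'monitor' to steps F103-F106."""
    state = "through"
    for _ in range(steps):
        if state == "through":
            if start_trigger():     # F102: monitor display start trigger?
                state = "monitor"   # F103: switch to the monitor display state
        else:
            if control_trigger():   # F104: image control trigger?
                apply_control()     # F106: change capture/display operation
            if end_trigger():       # F105: monitor display end trigger?
                state = "through"   # return to the transparent through-state
    return state
```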
[6. Effects, modifications, and extensions of the second embodiment]
According to this embodiment, an image captured by the image capturing section 103 disposed in a glasses-type mounting unit or a helmet-type mounting unit, that is, an image captured with the direction of the user's eyes as the direction of the subject, is displayed on the display section 102 in front of his or her eyes. The image capturing operation or the display operation is controlled in correspondence with information about external situations such as the surrounding brightness, the weather, the situation of the subject, recognition of a predetermined target, the motion of the subject, the position, the date and time, and so forth. Thus a situation that substantially assists or extends the user's visual ability can be created.
Since the image capturing operation of the image capturing section 103 and the display modes corresponding to the signal processing of the captured image signal processing section 115 and the display image processing section 112 are changed in correspondence with the external situation, no operational burden is imposed on the user. In addition, since the image capturing and displaying apparatus 101 is controlled appropriately, the user can use it easily.
In addition, since the display section 102 can be placed in the transparent or translucent through-state by controlling its transmissivity, the mounting unit does not disturb the user's normal life while he or she wears it. Therefore, the benefits of the image capturing and displaying apparatus 101 according to this embodiment can be enjoyed in the user's normal life.
In this embodiment, the image capturing operation of the image capturing section 103 and the display modes accomplished by the signal processing of the captured image signal processing section 115 and the display image processing section 112 have mainly been described. However, the switching among power-on, power-off, and standby states, as well as the volume and tone of sound output from the audio output section 105, may also be controlled in correspondence with external situations. For example, the volume may be adjusted in correspondence with the time and/or the place. As an alternative, the surrounding volume may be detected, and the output volume of the speaker may be adjusted in correspondence with the detected surrounding volume.
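A minimal sketch of such an ambient-volume mapping follows; the decibel limits and the linear mapping are illustrative assumptions only.

```python
def output_volume(ambient_db: float, lo: float = 40.0, hi: float = 80.0) -> float:
    """Map measured ambient loudness (dB) to an output volume in [0, 1]."""
    if ambient_db <= lo:
        return 0.2                  # quiet surroundings: keep output low
    if ambient_db >= hi:
        return 1.0                  # loud surroundings: full volume
    return 0.2 + 0.8 * (ambient_db - lo) / (hi - lo)
```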
The outer appearance and structure of the image capturing and displaying apparatus 101 are not limited to those shown in Fig. 1, Fig. 2, and Fig. 20. Instead, various modifications can be made.
For example, a storage section that stores the image signal captured by the image capturing section 103, and a transmission section that transmits the image signal to other devices, may be disposed in the image capturing and displaying apparatus 101.
Besides the image capturing section 103, an input section and a receiving section that input an image from an external device as a source of images displayed on the display section 102 may be disposed in the image capturing and displaying apparatus 101.
In this embodiment, an example in which the image capturing and displaying apparatus 101 is a glasses-type mounting unit or a helmet-type mounting unit has been described. However, as long as the image capturing and displaying apparatus captures an image in the direction of the user's eyes and displays the image in front of his or her eyes, it can be of any type that the user can wear, such as a headphone type, a neckband type, or an ear-hook type. As an alternative, the image capturing and displaying apparatus 101 may be a unit attached to ordinary glasses, a visor, headphones, or the like with a mounting member such as a clip.
It should be understood by those skilled in the art that various modifications, combinations, sub-combinations, and alterations may occur depending on design requirements and other factors insofar as they are within the scope of the appended claims or the equivalents thereof.

Claims (53)

1. An image capturing and displaying apparatus, comprising:
image capturing means for capturing an image such that a direction in which a user sees a subject is a direction of the subject;
display means, disposed in front of eyes of the user, for displaying the image captured by the image capturing means;
user's information obtaining means for obtaining information about a motion and a physical situation of the user; and
control means for controlling an operation of the image capturing means or an operation of the display means in correspondence with the information obtained by the user's information obtaining means.
2. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means determines the user's wish or situation corresponding to the information obtained by the user's information obtaining means, and controls the operation of the image capturing means or the operation of the display means in correspondence with the determined result.
3. The image capturing and displaying apparatus as set forth in claim 1,
wherein the display means is capable of switching a display state from a through-state to an image display state or from the image display state back to the through-state, the through-state being transparent or translucent, and the image captured by the image capturing means being displayed in the image display state.
4. The image capturing and displaying apparatus as set forth in claim 1,
wherein the user's information obtaining means is a sensor that detects acceleration, angular velocity, or vibration.
5. The image capturing and displaying apparatus as set forth in claim 1,
wherein the user's information obtaining means is a sensor that detects a motion of the user's head, a motion of the user's arm, a motion of the user's hand, a motion of the user's leg, or a motion of the user's entire body.
6. The image capturing and displaying apparatus as set forth in claim 1,
wherein the user's information obtaining means is a sensor that detects a non-walking state, a walking state, and a running state of the user.
7. The image capturing and displaying apparatus as set forth in claim 1,
wherein the user's information obtaining means is a visual sensor that detects visual information of the user.
8. The image capturing and displaying apparatus as set forth in claim 1,
wherein the user's information obtaining means is a sensor that detects, as the user's visual information, a gaze direction of the user, a focal distance of the user, a pupil dilation of the user, a fundus pattern of the user, or an eyelid movement of the user.
9. The image capturing and displaying apparatus as set forth in claim 1,
wherein the user's information obtaining means is a biological sensor that detects biological information of the user.
10. The image capturing and displaying apparatus as set forth in claim 1,
wherein the user's information obtaining means is a sensor that detects, as the user's biological information, heart rate information, pulse information, perspiration information, brain wave information, galvanic skin response information, blood pressure information, body temperature information, or respiratory activity information.
11. The image capturing and displaying apparatus as set forth in claim 1,
wherein the user's information obtaining means is a biological sensor that detects information representing a tense state or an excited state of the user.
12. The image capturing and displaying apparatus as set forth in claim 1,
wherein the user's information obtaining means is formed as an input section through which at least visual field information can be input.
13. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means controls the image capturing means to start and stop an image capturing operation.
14. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means variably controls the image capturing means to execute image capturing operations ranging from a telephoto image capturing operation to a wide-angle image capturing operation.
15. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means controls a focal distance of the image capturing means.
16. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means variably controls an image capturing sensitivity of the image capturing means.
17. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means variably controls an infrared image capturing sensitivity of the image capturing means.
18. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means variably controls an ultraviolet image capturing sensitivity of the image capturing means.
19. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means variably controls a frame rate of the image capturing means.
20. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means controls an operation of an image capturing lens system in the image capturing means.
21. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means controls an operation of a captured image signal processing section that processes a captured image signal obtained by an image sensor of the image capturing means.
22. The image capturing and displaying apparatus as set forth in claim 1,
wherein the display means is capable of switching a display state from a through-state to an image display state or from the image display state back to the through-state, the through-state being transparent or translucent, and the image captured by the image capturing means being displayed in the image display state; and
wherein the control means controls the display means to switch the display state from the through-state to the image display state or from the image display state back to the through-state.
23. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means controls the display means to enlarge or reduce an image displayed thereon.
24. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means controls the display means to divide a screen on which an image is displayed.
25. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means controls a display brightness of an image displayed on the display means.
26. The image capturing and displaying apparatus as set forth in claim 1,
wherein the control means controls signal processing of an image signal displayed on the display means.
27. The image capturing and displaying apparatus as set forth in claim 1, further comprising:
illumination means for illuminating the subject in the direction of the subject,
wherein the control means controls the illumination means to execute an illumination operation in correspondence with the information obtained by the user's information obtaining means.
28. An image capturing and displaying method for an image capturing and displaying apparatus that has image capturing means for capturing an image such that a direction in which a user sees a subject is a direction of the subject, and display means, disposed in front of the user, for displaying the image captured by the image capturing means, the method comprising the steps of:
obtaining information about a motion of the user or a physical situation of the user; and
controlling an operation of the image capturing means or an operation of the display means in correspondence with the information obtained in the user's information obtaining step.
29. An image capturing and displaying apparatus, comprising:
image capturing means for capturing an image such that a direction in which a user sees a subject is a direction of the subject;
display means, disposed in front of eyes of the user, for displaying the image captured by the image capturing means;
external information obtaining means for obtaining external information; and
control means for controlling an operation of the image capturing means or an operation of the display means in correspondence with the information obtained by the external information obtaining means.
30. The image capturing and displaying apparatus as set forth in claim 29,
wherein the display means is capable of switching a display state from a through-state to an image display state or from the image display state back to the through-state, the through-state being transparent or translucent, and the image captured by the image capturing means being displayed in the image display state.
31. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtaining means is a sensor that detects a surrounding environment situation as the external information.
32. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtaining means is a sensor that detects, as the external information, information about a capture target of the image capturing means.
33. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtaining means obtains current position information as the external information.
34. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtaining means obtains a current date and time as the external information.
35. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtaining means obtains the external information through communication with an external device.
36. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtaining means obtains the external information by analyzing the image captured by the image capturing means.
37. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information about surrounding brightness, temperature, humidity, atmospheric pressure, or weather.
38. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information about a distance to a capture target of the image capturing means.
39. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information about infrared radiation emitted by a capture target of the image capturing means, or the energy thereof.
40. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information about ultraviolet radiation emitted by a capture target of the image capturing means, or the energy thereof.
41. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information that identifies a capture target of the image capturing means.
42. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information that identifies a person, an animal, a building, or a device as a capture target of the image capturing means.
43. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is motion information about a capture target of the image capturing means.
44. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information that identifies a person in a capture target of the image capturing means.
45. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information for determining whether a capture target of the image capturing means includes characters.
46. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information about a place corresponding to a current position.
47. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information about a building or a natural object in an area corresponding to a current position.
48. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information about brightness, darkness, or sharpness of the image captured by the image capturing means.
49. The image capturing and displaying apparatus as set forth in claim 29,
wherein the external information obtained by the external information obtaining means is information about partial brightness, darkness, or sharpness of the image captured by the image capturing means.
50. The image capturing and displaying apparatus as set forth in claim 29, further comprising:
speech synthesizing means for synthesizing speech corresponding to characters contained in the image captured by the image capturing means; and
audio output means for outputting the synthesized speech generated by the speech synthesizing means,
wherein the control means controls an output operation of the synthesized speech generated by the speech synthesizing means in correspondence with the information obtained by the external information obtaining means.
51. An image capturing and displaying method for an image capturing and displaying apparatus that has image capturing means for capturing an image such that a direction in which a user sees a subject is a direction of the subject, and display means, disposed in front of the user, for displaying the image captured by the image capturing means, the method comprising the steps of:
obtaining external information; and
controlling an operation of the image capturing means or an operation of the display means in correspondence with the information obtained in the external information obtaining step.
52. An image capturing and displaying apparatus, comprising:
an image capturing section that captures an image such that a direction in which a user sees a subject is a direction of the subject;
a display section that is disposed in front of eyes of the user and displays the image captured by the image capturing section;
a user's information obtaining section that obtains information about a motion and a physical situation of the user; and
a control section that controls an operation of the image capturing section or an operation of the display section in correspondence with the information obtained by the user's information obtaining section.
53. An image capturing and displaying apparatus, comprising:
an image capturing section that captures an image such that a direction in which a user sees a subject is a direction of the subject;
a display section that is disposed in front of eyes of the user and displays the image captured by the image capturing section;
an external information obtaining section that obtains external information; and
a control section that controls an operation of the image capturing section or an operation of the display section in correspondence with the information obtained by the external information obtaining section.

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
JP2006244685 2006-09-08
JP2006244685A JP4961914B2 (en) 2006-09-08 2006-09-08 Imaging display device and imaging display method
JP2006-244685 2006-09-08
JP2006261975 2006-09-27
JP2006261975A JP2008083289A (en) 2006-09-27 2006-09-27 Imaging display apparatus, and imaging display method
JP2006-261975 2006-09-27

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN2009101268115A Division CN101520690B (en) 2006-09-08 2007-09-07 Image capturing and displaying apparatus and image capturing and displaying method

Publications (2)

Publication Number Publication Date
CN101141567A true CN101141567A (en) 2008-03-12
CN101141567B CN101141567B (en) 2012-12-05

Family

ID=39193289

Family Applications (2)

Application Number Title Priority Date Filing Date
CN2009101268115A Expired - Fee Related CN101520690B (en) 2006-09-08 2007-09-07 Image capturing and displaying apparatus and image capturing and displaying method
CN 200710153620 Expired - Fee Related CN101141567B (en) 2006-09-08 2007-09-07 Image capturing and displaying apparatus and image capturing and displaying method

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN2009101268115A Expired - Fee Related CN101520690B (en) 2006-09-08 2007-09-07 Image capturing and displaying apparatus and image capturing and displaying method

Country Status (2)

Country Link
JP (1) JP4961914B2 (en)
CN (2) CN101520690B (en)


Also Published As

Publication number Publication date
CN101520690A (en) 2009-09-02
CN101520690B (en) 2011-07-06
CN101141567B (en) 2012-12-05
JP2008067218A (en) 2008-03-21
JP4961914B2 (en) 2012-06-27


Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C41 Transfer of patent application or patent right or utility model
TA01 Transfer of patent application right (effective date of registration: 20080314; address after: Tokyo, Japan; applicant after: Sony Corp.; address before: Tokyo; applicant before: Sony Corp.; co-applicant before: Sony Computer Entertainment Inc.)
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee (granted publication date: 20121205; termination date: 20210907)