WO2007094152A1 - Wearable display - Google Patents

Wearable display

Info

Publication number
WO2007094152A1
WO2007094152A1 (PCT/JP2007/050978)
Authority
WO
WIPO (PCT)
Prior art keywords
unit
image
wearable display
image quality
display
Prior art date
Application number
PCT/JP2007/050978
Other languages
English (en)
Japanese (ja)
Inventor
Kazuou Isagi
Shigeru Kato
Original Assignee
Nikon Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corporation filed Critical Nikon Corporation
Publication of WO2007094152A1

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4126The peripheral being portable, e.g. PDAs or mobile phones
    • H04N21/41265The peripheral being portable, e.g. PDAs or mobile phones having a remote control device for bidirectional communication between the remote control device and client device
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/017Head mounted
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14Digital output to display device ; Cooperation and interconnection of the display device with other functional units
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/4104Peripherals receiving signals from specially adapted client devices
    • H04N21/4122Peripherals receiving signals from specially adapted client devices additional display device, e.g. video projector
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/41Structure of client; Structure of client peripherals
    • H04N21/422Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
    • H04N21/42202Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/431Generation of visual interfaces for content selection or interaction; Content or additional data rendering
    • H04N21/4318Generation of visual interfaces for content selection or interaction; Content or additional data rendering by altering the content in the rendering process, e.g. blanking, blurring or masking an image region
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/44Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
    • H04N21/44008Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving operations for analysing video streams, e.g. detecting features or characteristics in the video stream
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/442Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
    • H04N21/44213Monitoring of end-user related data
    • H04N21/44222Analytics of user selections, e.g. selection of programs or purchase activity
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/47End-user applications
    • H04N21/485End-user interface for client configuration
    • H04N21/4854End-user interface for client configuration for modifying image parameters, e.g. image brightness, contrast
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/0118Head-up displays characterised by optical features comprising devices for improving the contrast of the display / brillance control visibility
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0101Head-up displays characterised by optical features
    • G02B2027/014Head-up displays characterised by optical features comprising information/image processing systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/01Head-up displays
    • G02B27/0179Display position adjusting means not related to the information to be displayed
    • G02B2027/0187Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2320/00Control of display operating conditions
    • G09G2320/02Improving the quality of display appearance
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2330/00Aspects of power supply; Aspects of display protection and defect management
    • G09G2330/02Details of power systems and of start or stop of display operation
    • G09G2330/021Power management, e.g. power saving
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2340/00Aspects of display data processing
    • G09G2340/14Solving problems related to the presentation of information to be displayed
    • G09G2340/145Solving problems related to the presentation of information to be displayed related to small screens
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09GARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G2360/00Aspects of the architecture of display systems
    • G09G2360/14Detecting light within display terminals, e.g. using a single or a plurality of photosensors
    • G09G2360/144Detecting light within display terminals, e.g. using a single or a plurality of photosensors the light being ambient light
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N21/00Selective content distribution, e.g. interactive television or video on demand [VOD]
    • H04N21/40Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
    • H04N21/43Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
    • H04N21/443OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
    • H04N21/4436Power management, e.g. shutting down unused components of the receiver
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/44Receiver circuitry for the reception of television signals according to analogue transmission standards
    • H04N5/57Control of contrast or brightness
    • H04N5/58Control of contrast or brightness in dependence upon ambient light

Definitions

  • the present invention relates to a wearable display that can be worn on a user's head.
  • in Patent Document 1, the brightness of an electronic image is adjusted to an optimum level relative to the brightness of the outside world, so that the display can be used for a long time with low power consumption.
  • Patent Document 1 Japanese Patent Laid-Open No. 5-300451
  • however, in Patent Document 1, since the initial value of the brightness of the electronic image is set only according to the brightness of the outside world, there are cases where a display suited to the individual user cannot be achieved. For example, a user who is accustomed to the wearable display and can observe a dark display even when the outside world is bright may be given an excessively bright display.
  • an object of the wearable display of the present invention is to enable long-time use by reducing power consumption while displaying an image whose quality matches the characteristics of the user's eyes.

Means for solving the problem
  • to achieve this, the wearable display of the present invention includes a display unit for displaying an image, a recognition unit for recognizing a use history of the display unit, and a determination unit for determining a rate at which the image quality of the image is changed based on the use history.
  • the determination unit may further determine an initial value for changing the image quality based on the use history.
  • the wearable display may further include a detection unit that detects the surrounding environment of the display unit, and the determination unit may determine the initial value based on the use history and the surrounding environment.
  • the determination unit may determine the rate of change for at least one of image density, color tone, sharpness, and brightness as the image quality.
  • the recognizing unit may recognize at least one of the number of times the display unit is used and a cumulative usage time as the usage history.
  • the recognizing unit may recognize an elapsed time from the previous use of the display unit to the current use of the display unit as the use history.
  • a receiving unit that receives a user instruction to change the image quality to a predetermined image quality may be further provided.
  • according to the wearable display of the present invention, it is possible to display an image whose quality matches the characteristics of the user's eyes while reducing power consumption to enable long-term use.
  • FIG. 1 is a top external view of a wearable display 100.
  • FIG. 2 is a functional block diagram of wearable display 100.
  • FIG. 3 is a flowchart showing the operation of wearable display 100.
  • FIG. 1A is a top external view of wearable display 100.
  • the wearable display 100 includes an image display unit 101, a support arm 102 that supports the image display unit 101, an arm storage unit 103, and headphones 104 and 105 as shown in FIG. 1A.
  • the image display unit 101 includes an image display element 111, such as a transmissive LCD with a backlight or an organic EL display, and an optical system (not shown), such as a magnifying lens, that magnifies and projects the image of the image display element 111 onto the user's eyes. Note that a reflective LCD or another self-luminous element may also be used as the image display element.
  • the image display unit 101 is formed integrally with the support arm 102.
  • as shown in the figure, the display unit 101 can be rotated manually or automatically in the direction of arrow b, with the connection to the support arm 102 serving as rotation axis a.
  • the support arm 102 can move by sliding in the direction of the arrow in the figure. The image display unit 101 and the support arm 102 are disposed at the position indicated by the solid line in FIG. 1A when an image is being observed, and at the position indicated by the dotted line in FIG. 1A when it is not.
  • the image display unit 101 is arranged in the vicinity of the arm storage unit 103 in a case other than the time of image observation, and does not block the user's view.
  • the arm storage unit 103 includes a feeding position sensor 112 and a control circuit 113 therein.
  • the image display unit 101 includes an ambient illuminance sensor 114, an eyeball incident light amount sensor 115, a rotation position sensor 116, and an iris sensor 117 on the outside or inside.
  • the headphones 104 include a temperature sensor 118 inside.
  • the feeding position sensor 112 is provided in the vicinity of the support arm 102 and detects the position of the display unit 101 by detecting the feeding position of the support arm 102.
  • the ambient illuminance sensor 114 is provided outside the display unit 101 (on the side opposite to the user's eyes) and detects ambient illumination.
  • the eyeball incident light amount sensor 115 is provided inside the display unit 101 (a position facing the user's eye) and detects the amount of light incident on the user's eyeball.
  • the rotation position sensor 116 is a known encoder provided at the connection between the display unit 101 and the support arm 102, and detects the rotational position of the display unit 101.
  • the iris sensor 117 is provided inside the display unit 101, at a position facing the user's eye, and acquires an iris image of the user's eye with an imaging device. Note that the iris sensor 117 is disposed where it does not hinder the user's observation of the display element 111.
  • the temperature sensor 118 is provided in the vicinity of a position where the headphones 104 come into contact with the user's ears, and detects the temperature. Then, according to the detection result, it is detected whether the headphones 104 are in contact with the user's auricle, and whether the wearable display 100 is worn on the user's head is detected.
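The wear detection described above can be pictured as a simple threshold test on the temperature reading. The sketch below is purely illustrative: the patent specifies no numeric values, so the threshold and function names are assumptions.

```python
# Illustrative sketch only: infer "headphones in contact with the ear" from
# the temperature sensor 118. The threshold is an assumption, not a value
# taken from the patent.
BODY_CONTACT_TEMP_C = 32.0  # skin-contact readings are typically ~32-35 C

def is_worn(temperature_c: float) -> bool:
    """Treat readings at or above skin temperature as ear contact."""
    return temperature_c >= BODY_CONTACT_TEMP_C

print(is_worn(24.5))  # ambient-temperature reading: not worn
print(is_worn(33.2))  # skin-contact reading: worn
```

In practice a real implementation would also debounce the reading over time, since the sensor warms up gradually after the display is put on.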
  • Each sensor described above has a configuration similar to that of a known technique, and thus detailed description thereof is omitted.
  • the control circuit 113 is connected to each unit, controls each unit, and acquires information detected by each sensor described above.
  • FIG. 2 is a functional block diagram of wearable display 100.
  • the control circuit 113 includes an image processing unit 120, a display processing unit 121, an audio processing unit 122, a CPU 123, and a memory 124.
  • the image processing unit 120 acquires image information from the outside and performs predetermined image processing.
  • the display processing unit 121 acquires the image information after the image processing by the image processing unit 120, performs predetermined display processing, and outputs it to the display element 111.
  • the audio processing unit 122 acquires sound information from the outside, performs predetermined sound processing, and outputs the sound information to the headphones 104 and 105.
  • the CPU 123 controls each of the image processing unit 120, the display processing unit 121, and the audio processing unit 122, and acquires information detected by each sensor described above.
  • Wearable display 100 includes operation unit 125 such as a button in addition to the components described in FIG.
  • the operation unit 125 may be provided in the main body of the wearable display 100, or may be provided in a remote controller or the like that can be connected to the wearable display 100 by wire or wireless.
  • the CPU 123 is connected to the memory 124 and detects the state of the operation unit 125.
  • the source of the image information and the sound information may be a memory or recording medium provided in the main body of the wearable display 100, or may be an external device connected to the wearable display 100 by wire or wirelessly.
  • the operation of the wearable display 100 having the above-described configuration will be described using the flowchart shown in FIG. 3.
  • the wearable display 100 includes an automatic mode that automatically displays an image having an image quality according to the characteristics of the user's eyes.
  • the automatic mode is set by a user operation via the operation unit 125. The operation of the wearable display 100 when the automatic mode is set is described below.
  • in step S1, the CPU 123 determines whether or not the image display unit 101 of the wearable display 100 is in a use state in which the image can be observed.
  • the CPU 123 determines whether or not the image display unit 101 is in the use state according to the detection results of the feeding position sensor 112, the rotation position sensor 116, and the temperature sensor 118.
  • the CPU 123 waits until it determines that the image display unit 101 is in use; when it so determines, it starts measuring the use time with a clock (not shown) in the CPU 123 and proceeds to step S2. By starting the measurement of use time only once the image display unit 101 is determined to be in use, a more accurate use history (described later) can be obtained.
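The step S1 check amounts to a conjunction of the three sensor results. The following is a hypothetical sketch; the function and parameter names are not from the patent.

```python
def display_in_use(arm_extended: bool, rotated_to_eye: bool, worn: bool) -> bool:
    """Step S1 sketch: the image is observable only when the support arm is
    fed out (feeding position sensor 112), the panel is rotated toward the
    eye (rotation position sensor 116), and the display is worn on the head
    (temperature sensor 118)."""
    return arm_extended and rotated_to_eye and worn

print(display_in_use(True, True, True))   # all sensors agree: in use
print(display_in_use(True, True, False))  # not worn, so not in use
```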
  • in step S2, the CPU 123 recognizes the usage history.
  • the usage history includes usage information such as the number of times the image display unit 101 has displayed an image, the cumulative usage time, and when the wearable display 100 was last used. The usage history is recorded in the memory 124 every time an image is displayed on the image display unit 101.
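The usage-history bookkeeping just described could be kept in a small record like the following sketch. The field and method names are illustrative assumptions, not terms from the patent.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class UsageHistory:
    """Sketch of the usage history stored in memory 124: number of display
    sessions, cumulative usage time, and when the display was last used."""
    display_count: int = 0
    cumulative_seconds: float = 0.0
    last_used: Optional[float] = None  # end time (epoch seconds) of the previous session

    def record_session(self, start: float, end: float) -> None:
        # called each time an image has been displayed on the image display unit 101
        self.display_count += 1
        self.cumulative_seconds += end - start
        self.last_used = end

history = UsageHistory()
history.record_session(start=0.0, end=120.0)
print(history.display_count, history.cumulative_seconds)
```

The elapsed time since the previous use (also part of the usage history, per the claims) would then be the difference between the current time and `last_used`.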
  • in step S3, the CPU 123 detects the state of the iris.
  • the CPU 123 analyzes the iris image of the user's eye acquired by the iris sensor 117 through the image processing unit 120 and detects the iris state.
  • in step S4, the CPU 123 determines whether or not to change the image quality of the image displayed on the display element 111, based on the usage history recognized in step S2 and the iris state detected in step S3. From the usage history, the CPU 123 can judge how accustomed the user is to the wearable display 100, and it decides whether or not to change the image quality according to that judgment.
  • in addition, by analyzing the time from the start of display until the iris begins to change, and the amount by which the iris changes when the display becomes dark, the user's ability to adapt to light and dark can be judged.
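That iris analysis can be reduced to two numbers, a latency and a magnitude, as in this sketch. The sampling scheme, the relative threshold, and all names are assumptions for illustration only.

```python
def adaptation_metrics(pupil_radii, dt, threshold=0.05):
    """From pupil radii sampled every `dt` seconds after the display darkens,
    return (time until the iris starts to change, total change in radius).
    The change is considered 'started' once the radius departs from the first
    sample by more than `threshold` (relative). The latency is None if the
    iris never changes."""
    baseline = pupil_radii[0]
    latency = None
    for i, r in enumerate(pupil_radii):
        if abs(r - baseline) > threshold * baseline:
            latency = i * dt
            break
    return latency, pupil_radii[-1] - baseline

# a user whose pupil dilates quickly after the display darkens
print(adaptation_metrics([4.0, 4.0, 3.9, 3.5, 3.0], dt=0.5))
```

A short latency and a large magnitude would indicate good light–dark adaptation, which the CPU 123 could weigh together with the usage history in step S4.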
  • when the CPU 123 determines that the image quality is to be changed, it proceeds to step S5. When it determines that the image quality is not to be changed, it ends the series of processes, and the display element 111 displays the image with a predetermined image quality.
  • the predetermined image quality may be set in advance in the CPU 123 or specified by the user.
  • in step S5, the CPU 123 detects the surrounding environment according to the detection results of the ambient illuminance sensor 114 and the eyeball incident light amount sensor 115.
  • the CPU 123 can detect the surrounding environment more accurately by obtaining the difference between the detection results of the ambient illuminance sensor 114 and the eyeball incident light amount sensor 115.
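The value of taking the difference between the two sensors can be sketched as follows; the unit (lux) and the simple subtraction are assumptions, since the patent does not specify how the difference is computed.

```python
def ambient_reaching_eye(ambient_lux: float, eye_lux: float):
    """Sketch: the outward-facing ambient illuminance sensor 114 and the
    eye-facing incident light sensor 115 together indicate how much of the
    surrounding light actually reaches the eyeball and how much the display
    unit blocks."""
    blocked_lux = ambient_lux - eye_lux
    return eye_lux, blocked_lux

reaching, blocked = ambient_reaching_eye(1000.0, 300.0)
print(reaching, blocked)  # most of the bright surroundings are shaded by the panel
```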
  • in step S6, the CPU 123 determines the initial value of image quality and the rate of change based on the usage history recognized in step S2 and the surrounding environment detected in step S5.
  • the image quality includes elements of image density, color tone, sharpness, and brightness.
  • the CPU 123 determines the initial value of image quality and the rate of change for each of these elements.
  • the density is preferably determined to be higher than a predetermined density as the user is less accustomed to using the wearable display 100 (as judged from the usage history).
  • likewise, the color tone is preferably determined to be stronger than a predetermined color tone as the user is less accustomed to using the wearable display 100, and stronger as the surrounding environment is brighter.
  • the sharpness is preferably determined to be higher than a predetermined sharpness as the user is less accustomed to using the wearable display 100, and higher as the surrounding environment is brighter.
  • the brightness is preferably determined to be brighter than a predetermined brightness as the user is less accustomed to using the wearable display 100, and brighter as the surrounding environment is brighter. This is because the brighter the surrounding environment, the greater the difference between the initial value and the predetermined image quality; the time required for the image-quality change can then be kept short by increasing the rate of change.
  • the rate of change is the degree to which the displayed image quality is brought, per unit time, from its initial value toward the predetermined image quality. For image density, color tone, sharpness, and brightness alike, it is preferable to determine a smaller rate of change as the user is less accustomed to using the wearable display 100, and a greater rate of change as the surrounding environment is brighter.
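One way to realize "initial value and rate from familiarity and ambient brightness" is a simple parametric formula. The patent only says a predetermined formula or table is used, so the coefficients below are made-up placeholders that merely reproduce the stated tendencies.

```python
def initial_and_rate(familiarity: float, ambient: float):
    """familiarity in [0, 1] (0 = first-time user), ambient in [0, 1]
    (0 = dark surroundings). Returns (initial brightness, rate of change),
    both relative to a predetermined brightness of 1.0. All coefficients
    are illustrative stand-ins for the patent's 'predetermined formula or
    table'."""
    # a less-accustomed user gets a brighter initial value and a slower change;
    # a brighter environment raises both the initial value and the rate
    initial = 1.0 + 0.5 * (1.0 - familiarity) + 0.5 * ambient
    rate = 0.02 + 0.08 * familiarity + 0.05 * ambient  # change per second
    return initial, rate

print(initial_and_rate(familiarity=0.0, ambient=0.5))  # new user, moderate light
print(initial_and_rate(familiarity=1.0, ambient=0.5))  # accustomed user
```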
  • the CPU 123 determines the initial value of the image quality and the rate of change according to a predetermined formula or table. Then, the determined initial value of image quality and the rate of change are recorded in the memory 124 as usage history. The recorded initial value of the image quality and the rate of change may be used for determining whether or not the image quality is changed in subsequent use (see step S4). Further, in subsequent use, it may be used when determining the initial value of image quality and the rate of change.
  • in step S7, the CPU 123 starts changing the image quality of the image displayed at the initial values back toward the original image. For density, color tone, and brightness, the CPU 123 controls the display processing unit 121 to change the image quality; for sharpness, it controls the image processing unit 120.
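The gradual return of step S7 amounts to stepping one quality value toward a target at the determined rate. A minimal sketch, assuming a fixed time step (the patent does not prescribe the stepping scheme):

```python
def quality_schedule(initial: float, target: float, rate: float, dt: float = 1.0):
    """Yield successive quality values (e.g. brightness) moving from
    `initial` toward `target` by `rate` per `dt`-second step, ending exactly
    at `target`. Names and the fixed-step scheme are assumptions."""
    value = initial
    while abs(value - target) > rate * dt:
        yield value
        value -= rate * dt if value > target else -rate * dt
    yield target

# e.g. fade a 1.5x-bright initial display back to the original 1.0 level
steps = list(quality_schedule(initial=1.5, target=1.0, rate=0.1))
print(steps)
```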
  • in step S8, the CPU 123 determines whether or not it has received an interruption instruction.
  • the interruption instruction is given by a user operation via the operation unit 125.
  • the user wants to interrupt the automatic mode being executed by the wearable display 100, the user can issue an interruption instruction via the operation unit 125.
  • when the CPU 123 receives the interruption instruction, it proceeds to step S9. Otherwise, it proceeds to step S10, described later.
  • in step S9, the CPU 123 controls each unit and changes the image to the predetermined image quality.
  • in step S10, the CPU 123 determines whether or not the image quality change has been completed.
  • when the CPU 123 determines that the change in image quality has been completed, it ends the series of processes, and the image is displayed on the display element 111 with the predetermined image quality. Otherwise, the CPU 123 returns to step S8.
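Steps S8–S10 form a loop that keeps applying the scheduled quality values while checking for a user interruption. Sketched below with hypothetical names; `interrupted` stands in for polling the operation unit 125.

```python
def run_quality_change(schedule, interrupted):
    """Steps S8-S10 sketch: apply each scheduled quality value in turn, but
    jump straight to the predetermined quality if `interrupted()` reports a
    user instruction via the operation unit 125."""
    applied = []
    for value in schedule:
        if interrupted():                    # step S8: interruption received?
            applied.append("predetermined")  # step S9: snap to predetermined quality
            return applied
        applied.append(value)                # continue the gradual change
    return applied                           # step S10: change completed

# without interruption the whole schedule is applied
print(run_quality_change([1.5, 1.3, 1.1, 1.0], lambda: False))
```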
  • as described above, the wearable display 100 includes a display unit that displays an image, a recognition unit that recognizes a usage history, and a determination unit that determines the rate at which the image quality of the image is changed based on the usage history. Therefore, it is possible to reduce power consumption and allow long-time display while presenting an image whose quality matches the characteristics of the user's eyes.
  • the initial value when changing the image quality is determined based on the usage history. Therefore, by determining an appropriate initial value according to the characteristics of the user's eyes, it is possible to realize a change in image quality according to the characteristics of the user's eyes.
  • the detection unit that detects the surrounding environment of the display unit is provided, and the above-described initial value is determined based on the usage history and the surrounding environment. Therefore, by determining the initial value in consideration of the surrounding environment, it is possible to realize a change in image quality in accordance with the use situation.
  • the rate at which at least one of image density, color tone, sharpness, and brightness is changed is determined as the image quality. Therefore, by changing a plurality of elements simultaneously, an image that is easier for the user to observe can be displayed, and the time required for changing the image quality can be shortened.
  • as the usage history, at least one of the number of uses (the number of times an image has been made observable by the image display unit) and the cumulative usage time is recognized. Therefore, it is possible to infer how accustomed the user is to the wearable display 100 and to judge the characteristics of the user's eyes.
  • as the usage history, the elapsed time from the previous observation of an image on the image display unit to the current use is recognized. Therefore, it can be estimated whether the user is accustomed to using the wearable display 100, and the characteristics of the user's eyes can be judged.
  • the wearable display further includes a reception unit that receives a user instruction to change the image quality to a predetermined image quality. Therefore, an image with the predetermined image quality can be displayed promptly in response to the user's request.
  • in step S4 of FIG. 3, an example was shown in which whether or not to change the image quality is determined based on both the usage history and the iris state; however, the determination may be made based on either one of them, and other conditions may also be taken into account.
  • in step S6 of FIG. 3, an example was shown in which the initial value of image quality and the rate of change are determined based on both the usage history and the surrounding environment; however, the determination may be made based on either one of them, and other conditions may also be taken into account.
  • an example was shown in which the interruption instruction is received in step S8 of FIG. 3, but the interruption instruction may be received at another timing.
  • for example: step S1 (determination of wearing), step S3 (detection of the iris state), step S4 (determination of whether to change the image quality), step S5 (detection of the surrounding environment), or step S6 (determination of the initial value and change rate).
  • the method for determining the initial value and the rate of change described in step S6 of FIG. 3 is an example; the initial value and the rate of change may be determined by other methods.
  • in the present embodiment, an example was shown in which the image quality is changed according to a rate of change determined before the image-quality change starts; however, the rate of change may itself be varied over time.
  • for example, the iris state may be analyzed based on the detection result of the iris sensor 117, and the degree of eye fatigue may be estimated in order to vary the rate of change over time.
  • although the present embodiment shows an example in which the image display unit is provided on only one side, the present invention can be similarly applied to a wearable display having image display units on both sides.
  • the present invention can be similarly applied to wearable displays having shapes other than that of the wearable display 100 described in the present embodiment.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Databases & Information Systems (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • General Engineering & Computer Science (AREA)
  • Optics & Photonics (AREA)
  • Business, Economics & Management (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biodiversity & Conservation Biology (AREA)
  • Ecology (AREA)
  • Emergency Management (AREA)
  • Environmental & Geological Engineering (AREA)
  • Environmental Sciences (AREA)
  • Remote Sensing (AREA)
  • Controls And Circuits For Display Device (AREA)

Abstract

The present invention relates to a wearable display comprising a display unit for displaying an image, a recognition unit for recognizing the usage history of the display unit, and a determination unit for determining the rate of change of the image quality according to the usage history. While displaying an image of a quality matched to the characteristics of the user's eyes, power consumption is reduced and long-term use becomes possible. The determination unit may determine, according to the usage history, the initial value used when the image quality is varied. The wearable display may further comprise a detection unit for detecting the environment surrounding the display unit. The determination unit may determine, as the image quality, the rate of change of at least one of the density, tone, sharpness, and brightness of the image. The recognition unit may recognize, as the usage history, at least one of the number of uses and the total usage time of the display unit. The recognition unit may also recognize, as the usage history, the time elapsed from the previous use of the display unit to the present use. The wearable display may further comprise a reception unit for receiving a user command to change the image quality to a predetermined image quality.
PCT/JP2007/050978 2006-02-15 2007-01-23 Wearable display WO2007094152A1 (fr)
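For readers more comfortable with code, the unit structure the abstract describes (a recognition unit tracking number of uses, total usage time, and time elapsed since the previous use, feeding a determination unit that picks a rate of change) might be sketched as below. The class names, fields, and the linear rate formula are illustrative assumptions, not language from the claims.

```python
from dataclasses import dataclass
from typing import Optional


@dataclass
class UsageHistory:
    """Recognition unit's record: the three history items the abstract
    names (number of uses, total usage time, time since previous use)."""
    use_count: int = 0
    total_seconds: float = 0.0
    last_used: Optional[float] = None

    def record_session(self, start: float, end: float) -> float:
        """Record one viewing session; return the idle time since the
        previous use (0.0 for the first use)."""
        idle = 0.0 if self.last_used is None else start - self.last_used
        self.use_count += 1
        self.total_seconds += end - start
        self.last_used = end
        return idle


class DeterminationUnit:
    """Determines a rate of change for one image-quality attribute
    (density, tone, sharpness, or brightness) from the usage history."""

    def rate_of_change(self, history: UsageHistory) -> float:
        # Assumed placeholder rule: experienced users get a faster transition.
        return 1.0 + 0.1 * min(history.use_count, 10)
```

A display unit would then apply `rate_of_change` per attribute; the abstract leaves the concrete mapping from history to rate open, so the rule above is purely a placeholder.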

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006038229 2006-02-15
JP2006-038229 2006-02-15

Publications (1)

Publication Number Publication Date
WO2007094152A1 (fr) 2007-08-23

Family

ID=38371340

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/050978 WO2007094152A1 (fr) 2006-02-15 2007-01-23 Wearable display

Country Status (1)

Country Link
WO (1) WO2007094152A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0390444A (ja) * 1989-09-01 1991-04-16 Nippon Seiki Co Ltd Dimming display device
JPH05300451A (ja) * 1992-04-22 1993-11-12 Olympus Optical Co Ltd Head-mounted display device
JPH0619444A (ja) * 1992-07-02 1994-01-28 Hitachi Ltd Information processing device
JPH1124598A (ja) * 1997-07-08 1999-01-29 Canon Inc Head-mounted video display device
JP2000298246A (ja) * 1999-02-12 2000-10-24 Canon Inc Display device, display method, and storage medium
JP2001184046A (ja) * 1999-10-28 2001-07-06 Gateway Inc Display brightness control method and apparatus for saving battery power
JP2003076353A (ja) * 2001-09-04 2003-03-14 Sharp Corp Head-mounted display device

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013054728A1 (fr) * 2011-10-11 2013-04-18 Sony Corporation Head-mounted display and display control method
US9316831B2 (en) 2011-10-11 2016-04-19 Sony Corporation Head mounted display and display control method
WO2013136484A1 (fr) * 2012-03-15 2013-09-19 NEC Display Solutions, Ltd. Image display apparatus and method
WO2017084091A1 (fr) * 2015-11-20 2017-05-26 Shenzhen Royole Technologies Co., Ltd. Brightness adjustment method for a head-mounted display device, and head-mounted display device
CN107003519A (zh) * 2015-11-20 2017-08-01 Shenzhen Royole Technologies Co., Ltd. Brightness adjustment method for a head-mounted display device, and head-mounted display device
CN107003519B (zh) * 2015-11-20 2019-12-27 Shenzhen Royole Technologies Co., Ltd. Brightness adjustment method for a head-mounted display device, and head-mounted display device
WO2023276566A1 (fr) * 2021-07-02 2023-01-05 Sony Interactive Entertainment Inc. Image display system and method

Similar Documents

Publication Publication Date Title
JP4884417B2 (ja) Portable electronic device and control method thereof
JP5309448B2 (ja) Display device and display method
TWI378262B (fr)
JP4462329B2 (ja) Imaging device and imaging method
US8907866B2 Head mount display
JP5332392B2 (ja) Imaging device
EP1898632A1 (fr) Image capturing apparatus and method
US7791642B2 Image-taking apparatus
JP6464729B2 (ja) Display device and method for controlling display device
JP2013210643A (ja) Display device and display method
JP2008182544A (ja) Image storage device and image storage method
JP2004326118A (ja) Apparatus incorporating eye-start capability
US20240062583A1 Electronic apparatus and method for controlling the same
WO2007094152A1 (fr) Wearable display
JP4880303B2 (ja) Electronic device with display function, display method, and program
US20050237420A1 Camera system and camera main unit
JP2012222387A (ja) Imaging device
JP2004140736A (ja) Imaging device
JP2008288821A (ja) Imaging device and imaging method
JP2011049988A (ja) Image processing device and camera
JP6609920B2 (ja) Display device and method for controlling display device
JP6638325B2 (ja) Display device and method for controlling display device
JP6410548B2 (ja) Stereoscopic viewing device and program
JPWO2020170945A1 (ja) Display control device, imaging device, display control method, and display control program
JP2009055080A (ja) Imaging device and imaging method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 07707241

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: JP