WO2008047889A1 - Display device and display method - Google Patents

Display device and display method

Info

Publication number
WO2008047889A1
Authority
WO
WIPO (PCT)
Prior art keywords
display
image
image signal
imaging
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Ceased
Application number
PCT/JP2007/070397
Other languages
English (en)
French (fr)
Japanese (ja)
Inventor
Yoichiro Sako
Masaaki Tsuruta
Taiji Ito
Masamichi Asukai
Kan Ebisawa
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Sony Corp
Priority to US12/445,477 (US8681256B2)
Priority to CA2666692A (CA2666692C)
Priority to EP07830131.4A (EP2081182B1)
Publication of WO2008047889A1
Priority to US14/172,521 (US9182598B2)
Priority to US14/734,308 (US9846304B2)
Priority to US15/811,227 (US20180067313A1)
Legal status: Ceased


Classifications

    • G PHYSICS
      • G02 OPTICS
        • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
          • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
            • G02B27/01 Head-up displays
              • G02B27/017 Head mounted
                • G02B2027/0178 Eyeglass type
              • G02B27/0101 Head-up displays characterised by optical features
                • G02B2027/0123 comprising devices increasing the field of view
                • G02B2027/0127 comprising devices increasing the depth of field
                • G02B2027/0138 comprising image capture systems, e.g. camera
                • G02B2027/014 comprising information/image processing systems
                • G02B2027/0141 characterised by the informative content of the display
              • G02B27/0179 Display position adjusting means not related to the information to be displayed
                • G02B2027/0187 slaved to motion of at least a part of the body of the user, e.g. head, eye
      • G06 COMPUTING OR CALCULATING; COUNTING
        • G06F ELECTRIC DIGITAL DATA PROCESSING
          • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
            • G06F3/14 Digital output to display device; Cooperation and interconnection of the display device with other functional units
        • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
          • G06T11/00 2D [Two Dimensional] image generation
            • G06T11/001 Texturing; Colouring; Generation of texture or colour
      • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
        • G09G ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
          • G09G5/00 Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
            • G09G5/14 Display of multiple viewports
    • H ELECTRICITY
      • H04 ELECTRIC COMMUNICATION TECHNIQUE
        • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
          • H04N5/00 Details of television systems
            • H04N5/44 Receiver circuitry for the reception of television signals according to analogue transmission standards
              • H04N5/445 for displaying additional information
                • H04N5/45 Picture in picture, e.g. displaying simultaneously another television channel in a region of the screen
          • H04N21/00 Selective content distribution, e.g. interactive television or video on demand [VOD]
            • H04N21/40 Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
              • H04N21/41 Structure of client; Structure of client peripherals
                • H04N21/4104 Peripherals receiving signals from specially adapted client devices
                  • H04N21/4122 additional display device, e.g. video projector
                • H04N21/422 Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
                  • H04N21/4223 Cameras
              • H04N21/43 Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
                • H04N21/431 Generation of visual interfaces for content selection or interaction; Content or additional data rendering
                  • H04N21/4312 involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
                    • H04N21/4316 for displaying supplemental content in a region of the screen, e.g. an advertisement in a separate window
                  • H04N21/4318 by altering the content in the rendering process, e.g. blanking, blurring or masking an image region
                • H04N21/436 Interfacing a local distribution network, e.g. communicating with another STB or one or more peripheral devices inside the home
                  • H04N21/4363 Adapting the video stream to a specific local network, e.g. a Bluetooth® network
                    • H04N21/43637 involving a wireless protocol, e.g. Bluetooth, RF or wireless LAN [IEEE 802.11]
                • H04N21/442 Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
                  • H04N21/44213 Monitoring of end-user related data
                    • H04N21/44218 Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
          • H04N23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
            • H04N23/60 Control of cameras or camera modules
              • H04N23/63 Control of cameras or camera modules by using electronic viewfinders
              • H04N23/66 Remote control of cameras or camera parts, e.g. by remote control devices
                • H04N23/661 Transmitting camera control signals through networks, e.g. control via the Internet

Definitions

  • The present invention relates to a display device and a display method that display an image in front of the user's eyes while worn by the user, for example by means of a spectacle-type or head-mounted mounting unit.
  • The present invention aims to provide visual assistance and visual enhancement for the user.
  • The display device of the present invention includes display means arranged to be positioned in front of the user's eyes to display an image, the screen area used for display being settable to a transparent or translucent through state.
  • It also includes image signal generation means for generating a display image signal of a scene different from the scene the user sees through the display means when the display means is in the through state.
  • It further includes control means for performing control such that, while part of the screen area of the display means is in the through state, the display means displays the display image signal generated by the image signal generation means.
  • the image signal generation unit includes an imaging unit and a signal processing unit that performs signal processing on the captured image signal obtained by the imaging unit.
  • The imaging unit may be arranged so as to capture images in the direction the user sees through the display unit when the screen area is in the through state.
  • Alternatively, the imaging unit may be arranged so that its subject direction is a direction different from the direction the user sees through the display unit when the screen area is in the through state.
  • The imaging unit may also have a variable subject direction.
  • the image signal generation means is configured to include a receiving unit that receives an image signal from an external device.
  • the image signal generation means is configured to have a signal processing unit that performs signal processing on the supplied image signal.
  • The control means sets a parent screen area and a child screen area within the screen area of the display means, and sets one of them to the through state and the other to the image display state based on the display image signal.
  • Alternatively, the control means divides the screen area of the display means, and sets one of the divided areas to the through state and the other to the image display state based on the display image signal.
  • The display image signal may be an image signal obtained by imaging a near scene or a distant scene.
  • The display image signal may be an image signal obtained by telephoto imaging or wide-angle imaging.
  • The display image signal may be an image signal obtained by enlargement processing or reduction processing.
  • the display image signal is an image signal obtained by imaging with increased or decreased imaging sensitivity.
  • the display image signal is an image signal obtained by imaging with increased infrared imaging sensitivity.
  • the display image signal is an image signal obtained by imaging with increased ultraviolet imaging sensitivity.
  • The display image signal may be an image signal obtained by imaging in a direction different from the direction the user sees through the display unit when the screen area is in the through state.
  • The image signal generation means may obtain the display image signal of a scene different from the scene the user sees when the display means is in the through state by controlling the operation of the imaging lens system in the imaging unit.
  • It may obtain such a display image signal by signal processing in the signal processing unit.
  • It may also obtain such a display image signal by receiving, at the receiving unit, an image signal captured by an external imaging device.
  • The display device may further include outside world information acquisition means for acquiring outside world information, and the control means may cause the display means to display the display image signal based on the outside world information obtained from the outside world information acquisition means.
  • The control means may also control the area in which the display image signal is displayed based on that outside world information.
  • The display method of the present invention is a display method for display means arranged to be positioned in front of the user's eyes to display an image, whose screen area can be set to a transparent or translucent through state. It includes a step of generating a display image signal of a scene different from the scene the user sees through the display means when the display means is in the through state, and a step of setting part of the screen area of the display means to the through state while executing display using the generated display image signal.
  • According to the present invention, part of the screen area of the display means is kept in the through state while display based on the display image signal generated by the image signal generation means is executed. The user can therefore see the normal visual scene in the through-state area and, at the same time, see an image of a scene different from that normal scene. For example, while looking forward through the through-state area, the user can view a telephoto image, an enlarged image, a specially captured image, an image of the scene behind, and so on.
  • FIGS. 1A and 1B are explanatory diagrams of an appearance example of a display device according to an embodiment of the present invention.
  • FIGS. 2A and 2B are explanatory diagrams of other appearance examples of the display device of the embodiment.
  • FIGS. 3A, 3B and 3C are explanatory diagrams of the relationship between the display device of the embodiment and external devices.
  • FIG. 4 is a block diagram of the display device of the embodiment.
  • FIG. 5 is another block diagram of the display device of the embodiment.
  • FIG. 6 is still another block diagram of the display device of the embodiment.
  • FIGS. 7A, 7B, 7C, 7D, 7E and 7F are explanatory diagrams of area configurations of the display unit of the embodiment.
  • FIGS. 8A, 8B, 8C and 8D are explanatory diagrams of area configurations of the display unit of the embodiment.
  • FIG. 9 is a flowchart of the control processing of the embodiment.
  • FIGS. 10A and 10B are explanatory diagrams of image display states of the embodiment.
  • FIGS. 11A and 11B are explanatory diagrams of image display states of the embodiment.
  • FIGS. 12A and 12B are explanatory diagrams of image display states of the embodiment.
  • FIGS. 13A and 13B are explanatory diagrams of image display states of the embodiment.
  • FIGS. 14A and 14B are explanatory diagrams of image display states of the embodiment.
  • FIGS. 15A, 15B, 15C and 15D are explanatory diagrams of image display states of the embodiment.
  • FIG. 1A and FIG. 1B show an example of the appearance of a display device 1 that is a glasses-type display.
  • As shown in FIG. 1A, the display device 1 is worn by the user by means of a mounting unit with a frame structure that wraps halfway around the head, from both temples to the back of the head.
  • In the mounted state, a pair of display units 2, 2 for the left eye and the right eye are arranged immediately in front of the user's eyes, that is, where the lenses of ordinary glasses would be.
  • For example, a liquid crystal panel is used for the display unit 2, and by controlling its transmittance a through state as shown in the figure, that is, a transparent or translucent state, can be obtained. With the display unit 2 in the through state, there is no problem in everyday life even if the user wears the device at all times like glasses.
  • the imaging lens 3a is arranged in the forward direction of the user when the user is wearing it.
  • That is, the imaging lens 3a is arranged so as to capture images in the direction the user sees through the display unit 2 when the display unit 2 is in the through state.
  • In addition, a light emitting unit 4a that illuminates the imaging direction of the imaging lens 3a is provided.
  • the light emitting unit 4a is formed by, for example, LED (Light Emitting Diode).
  • Fig. 1A and Fig. 1B are examples, and there are various possible structures for the display device 1 to be worn by the user.
  • In general, the display device 1 may be formed with a spectacle-type or head-mounted mounting unit; at least for the present embodiment, it suffices that the display unit 2 is provided close to and in front of the user's eyes.
  • a pair of display units 2 may be provided corresponding to both eyes, or one display unit 2 may be provided corresponding to one eye.
  • a configuration in which the light emitting unit 4a is not provided is also conceivable.
  • In FIGS. 1A and 1B, the imaging lens 3a is attached so that the area in front of the user is the subject direction; however, the imaging lens 3a may instead be attached so that the subject direction differs from the direction the user sees through the display unit 2 when it is in the through state.
  • FIGS. 2A and 2B show such an example: the imaging lens 3a is not provided at the front; instead, the imaging lens 3a and the light emitting unit 4a are provided on the unit located at the back of the head. In this case, the imaging lens 3a captures the rear view that the user cannot normally see.
  • Alternatively, the imaging lens 3a may be attached so that the subject direction is above, to the left of, to the right of, or at the feet of the user.
  • In FIGS. 1A, 1B, 2A and 2B, the imaging lens 3a is fixedly attached, so the subject direction during imaging is fixed (in front of or behind the user); however, by attaching the imaging lens 3a via a movable mechanism capable of changing the subject direction, the subject direction during imaging may be changed manually or automatically.
  • Although one imaging function part is provided as the imaging lens 3a, a plurality of imaging lenses 3a may be attached to provide a plurality of imaging function parts.
  • An image signal captured by the imaging function system described later, which includes the imaging lens 3a, is displayed on the display unit 2 as a display image signal after predetermined processing.
  • some areas on the screen of the display unit 2 are in a through state, and image display based on the display image signal is performed in other areas.
  • the display device 1 includes a communication function (communication unit 26 described in FIG. 5) for communicating with external equipment. Therefore, the source of the display image signal to be displayed on the display unit 2 is assumed to be not only the imaging function unit including the imaging lens 3a but also the communication function unit. That is, an image signal transmitted from another imaging device or the like as an external device can be received by the communication function unit and displayed on the display unit 2.
  • FIGS. 3A, 3B and 3C exemplify modes of use of the display device 1 in relation to external devices.
  • FIG. 3A shows the case where the display device 1 is used alone. In this case, since the display device 1 has an imaging function, a display image signal generated from the captured image signal can be displayed on the display unit 2.
  • FIG. 3B shows an example in which the display device 1 has a communication function and communicates with an external imaging device 70.
  • the display device 1 receives the image captured by the imaging device 70 and displays it on the display unit 2.
  • The external imaging device 70 may be a video camera, digital still camera or the like having a communication function; a display device 1 having an imaging function as in FIGS. 1A and 1B can also serve as an external imaging device 70 for another display device 1.
  • The external imaging device 70 may be an imaging device owned by the user of the display device 1, an imaging device owned by an acquaintance of the user, or an imaging device belonging to a public or service organization that provides images and is capable of communicating with the display device 1; various such devices are conceivable.
  • FIG. 3C shows an example in which the display device 1 has a communication function, in particular a communication access function via a network 73 such as the Internet, and communicates with an external imaging device 70 connected via the network 73.
  • the display device 1 receives the captured image signal via the network 73.
  • the display unit 2 executes image display using the display image signal based on the received captured image signal.
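  • The three usage modes of FIGS. 3A, 3B and 3C differ only in where the captured image signal originates. The following is a minimal sketch of that idea; the class and function names (ImageSource, InternalCamera, and so on) are illustrative assumptions, not terminology from the patent:

```python
from abc import ABC, abstractmethod

class ImageSource(ABC):
    """A source of captured image signals for the display unit."""
    @abstractmethod
    def get_frame(self) -> bytes: ...

class InternalCamera(ImageSource):
    """FIG. 3A: the display device's own imaging function part."""
    def get_frame(self) -> bytes:
        return b"frame-from-imaging-unit-3"

class ExternalDevice(ImageSource):
    """FIG. 3B: direct communication with an external imaging device 70."""
    def get_frame(self) -> bytes:
        return b"frame-from-imaging-device-70"

class NetworkDevice(ImageSource):
    """FIG. 3C: an imaging device 70 reached via the network 73."""
    def get_frame(self) -> bytes:
        return b"frame-via-network-73"

def display(source: ImageSource) -> None:
    # The display unit 2 renders whichever source is currently selected.
    print(f"displaying {source.get_frame()!r}")

for src in (InternalCamera(), ExternalDevice(), NetworkDevice()):
    display(src)
```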
  • FIG. 4 shows an example of the internal configuration of the display device 1.
  • The system controller 10 is composed of a microcomputer including, for example, a CPU (Central Processing Unit), a ROM (Read Only Memory), a RAM (Random Access Memory), a nonvolatile memory unit, and an interface unit, and serves as a control unit that controls the display device 1 as a whole.
  • The system controller 10 controls each part of the display device 1 based on an internal operation program and operation triggers from the operation trigger information generation unit 16, and causes the display unit 2 to execute the required image display.
  • In the display device 1, an imaging unit 3, an imaging control unit 11, and an imaging signal processing unit 15 are provided as an imaging function part.
  • The imaging unit 3 includes a lens system comprising the imaging lens 3a shown in FIG. 1A or FIG. 2B, a diaphragm, a zoom lens, a focus lens and so on; a drive system for causing the lens system to perform focus and zoom operations; and a solid-state image sensor array that detects the imaging light obtained by the lens system and performs photoelectric conversion to generate an imaging signal.
  • The solid-state image sensor array is, for example, a CCD (Charge Coupled Device) sensor array or a CMOS (Complementary Metal Oxide Semiconductor) sensor array.
  • In the case of the example of FIGS. 1A and 1B, the imaging unit 3 captures the scene in front of the user; in the case of the example of FIGS. 2A and 2B, it captures the scene behind the user.
  • The imaging signal processing unit 15 includes a sample-and-hold/AGC (Automatic Gain Control) circuit that performs gain adjustment and waveform shaping on the signal obtained by the solid-state image sensor of the imaging unit 3, and a video A/D converter. The imaging signal processing unit 15 also performs white balance processing, luminance processing, color signal processing, blur correction processing, and the like on the imaging signal.
  • The imaging control unit 11 controls the operations of the imaging unit 3 and the imaging signal processing unit 15 based on instructions from the system controller 10. For example, it controls on/off switching of their operations, and performs control (motor control) to cause the imaging unit 3 to execute operations such as autofocus, automatic exposure adjustment, aperture adjustment, zoom, and focus change.
  • When a movable mechanism capable of changing the subject direction is provided, the imaging control unit 11 controls the operation of that mechanism based on instructions from the system controller 10, thereby changing the direction of the imaging lens 3a in the imaging unit 3.
  • The imaging control unit 11 also includes a timing generator, and controls the signal processing operations of the solid-state image sensor and of the sample-and-hold/AGC circuit and video A/D converter of the imaging signal processing unit 15 with the timing signals generated by the timing generator.
  • The imaging frame rate can also be variably controlled by this timing control.
  • The imaging control unit 11 further controls imaging sensitivity and signal processing in the solid-state image sensor and the imaging signal processing unit 15. For example, it can perform gain control of the signal read from the solid-state image sensor as imaging sensitivity control, black level setting control, control of various coefficients of imaging signal processing at the digital data stage, control of the correction amount in blur correction processing, and so on.
  • As for imaging sensitivity, overall sensitivity adjustment that does not take the wavelength band into account is possible, as is adjustment of the imaging sensitivity of a specific wavelength band such as the infrared or ultraviolet region (for example, imaging that cuts a specific wavelength band). Wavelength-dependent sensitivity can be adjusted by inserting a wavelength filter into the imaging lens system or by applying wavelength-filter computation to the imaging signal.
  • In these cases, the imaging control unit 11 can perform sensitivity control by controlling insertion of a wavelength filter, by designating filter computation coefficients, and so on.
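  • As a rough illustration of these two kinds of sensitivity adjustment, the following sketch applies a uniform gain and a per-channel weighting to an RGB frame with NumPy. The gain values are arbitrary, and the per-channel weighting is only a crude stand-in for a real wavelength filter, which would act at the sensor and optics level:

```python
import numpy as np

def apply_gain(frame: np.ndarray, gain: float) -> np.ndarray:
    """Overall sensitivity adjustment: uniform gain on the imaging signal."""
    return np.clip(frame.astype(np.float32) * gain, 0, 255).astype(np.uint8)

def apply_band_weights(frame: np.ndarray, weights: tuple[float, float, float]) -> np.ndarray:
    """Band-specific adjustment approximated as per-channel weighting."""
    out = frame.astype(np.float32) * np.asarray(weights, dtype=np.float32)
    return np.clip(out, 0, 255).astype(np.uint8)

frame = np.random.randint(0, 40, size=(120, 160, 3), dtype=np.uint8)  # a dark scene
brightened = apply_gain(frame, gain=4.0)                 # increased imaging sensitivity
band_boost = apply_band_weights(frame, (2.0, 1.0, 1.0))  # crude band emphasis
print(brightened.mean(), band_boost.mean())
```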
  • As a configuration for presenting display to the user, the display device 1 includes the display unit 2, a display image processing unit 12, a display driving unit 13, and a display control unit 14.
  • the imaging signal captured by the imaging unit 3 and processed by the imaging signal processing unit 15 is supplied to the display image processing unit 12.
  • the display image processing unit 12 is, for example, a so-called video processor, and is a part capable of executing various display processes on the supplied imaging signal.
  • For example, the display image processing unit 12 can generate an enlarged image by enlarging part of the imaging signal, or generate a reduced image; it can apply image effects such as soft focus, mosaic, luminance inversion, highlight display of part of the image, and changes in the overall color atmosphere; it can separate or combine images for split display of captured images; and it can generate character or graphic images and combine a generated image with the captured image. In short, it can perform various kinds of processing on the digital video signal serving as the imaging signal.
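  • As a small sketch of one of these operations, the following enlarges part of a captured frame by nearest-neighbour repetition in NumPy; a real video processor would do this in dedicated hardware, and the function name is illustrative only:

```python
import numpy as np

def enlarge_region(frame: np.ndarray, top: int, left: int,
                   height: int, width: int, factor: int) -> np.ndarray:
    """Crop a region of the captured frame and scale it up by an integer
    factor using nearest-neighbour repetition."""
    crop = frame[top:top + height, left:left + width]
    return crop.repeat(factor, axis=0).repeat(factor, axis=1)

frame = np.random.randint(0, 256, size=(120, 160, 3), dtype=np.uint8)
zoomed = enlarge_region(frame, top=30, left=40, height=30, width=40, factor=4)
print(zoomed.shape)  # (120, 160, 3): the region now fills the original frame size
```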
  • the display driving unit 13 includes a pixel driving circuit for displaying the image signal supplied from the display image processing unit 12 on the display unit 2 which is a liquid crystal display, for example.
  • It applies a drive signal based on the video signal to each pixel arranged in a matrix in the display unit 2 at predetermined horizontal/vertical drive timings, causing display to be executed.
  • the display driving unit 13 can also control the transmittance of each pixel of the display unit 2 so that the entire screen or part of the screen is in a through state.
  • the display control unit 14 controls the processing operation of the display image processing unit 12 and the operation of the display driving unit 13 based on an instruction from the system controller 10. That is, the display image processing unit 12 is caused to execute the various processes described above. In addition, the display drive unit 13 is controlled to be switched between the through state and the image display state.
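  • The per-pixel transmittance control is what makes the partial through state possible. A minimal sketch, assuming transmittance can be represented as a float map with 1.0 for a fully transparent (through-state) pixel and 0.0 for an opaque pixel used for image display (this representation is an assumption for illustration):

```python
import numpy as np

def transmittance_map(h: int, w: int, ar2: tuple[int, int, int, int]) -> np.ndarray:
    """Per-pixel transmittance: 1.0 = through state, 0.0 = image display.
    `ar2` is (top, left, height, width) of the image display area."""
    t = np.ones((h, w), dtype=np.float32)   # whole screen in the through state
    top, left, hh, ww = ar2
    t[top:top + hh, left:left + ww] = 0.0   # area AR2 displays the image
    return t

mask = transmittance_map(600, 800, ar2=(400, 530, 180, 250))  # lower-right sub-screen
print(mask.mean())  # fraction of the screen still in the through state
```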
  • The illumination unit 4 includes the light emitting unit 4a shown in FIGS. 1A and 2B and a light emitting circuit that causes the light emitting unit 4a (for example, an LED) to emit light.
  • the lighting control unit 18 causes the lighting unit 4 to perform a light emission operation based on an instruction from the system controller 10.
  • Since the light emitting unit 4a of the illumination unit 4 is attached as shown in FIG. 1A or FIG. 2B, the illumination unit 4 illuminates the subject direction of the imaging lens 3a.
  • the operation trigger information generation unit 16 generates operation trigger information for starting and ending image display on the display unit 2 and switching the display mode.
  • the operation trigger information generation unit 16 can be configured by, for example, an operation key operated by a user or an operation element as an operation dial, and an operation detection mechanism that detects an operation of the operation element. In other words, the user's manual operation is used as the operation trigger information for various operations.
  • the operation trigger information generation unit 16 supplies user operation information to the system controller 10 as operation trigger information, so that the system controller 10 performs display operation control according to the user operation.
  • The operation trigger information generation unit 16 may also be configured to detect user information (detection information on the user's visual status, body movements, biological status, etc.) or outside-world information (detection information on the surroundings of the display device, the location, the date and time, the state of the subject, etc.). For example, the system controller 10 may determine whether an operation trigger has occurred based on such user information and outside-world information. Examples of user information and outside-world information will be described later.
  • FIG. 5 shows another configuration example of the display device 1. Note that blocks having the same functions as those in Fig. 4 are given the same reference numerals to avoid redundant explanation.
  • The configuration of FIG. 5 omits the imaging function part (imaging unit 3, imaging control unit 11 and imaging signal processing unit 15) and the illumination unit 4 and illumination control unit 18 of the configuration of FIG. 4, and is instead provided with a communication unit 26.
  • the communication unit 26 transmits / receives data to / from an external device.
  • As the external device, various devices such as the imaging device 70 described with reference to FIGS. 3B and 3C can be considered.
  • The communication unit 26 may be configured to perform network communication via short-range wireless communication with a network access point using a scheme such as wireless LAN (local area network) or Bluetooth (registered trademark), or it may perform direct wireless communication with an external device having a corresponding communication function.
  • communication is performed with the external imaging device 70 as described in FIG. 3B or FIG. 3C, and a captured image signal is received from the imaging device 70.
  • the communication unit 26 supplies the received captured image signal to the display image processing unit 12. Then, the image signal processed by the display image processing unit 12 is supplied to the display driving unit 13 and displayed on the display unit 2.
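  • A sketch of the receiving side of such a communication unit, assuming a simple length-prefixed frame stream over TCP; the wire format, host name and function names are assumptions for illustration only (an actual device would more likely use a wireless LAN or Bluetooth transport, as described above):

```python
import socket
import struct

def recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the socket."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed")
        buf += chunk
    return buf

def receive_frames(host: str, port: int):
    """Yield captured image frames sent by an external imaging device,
    assuming each frame is prefixed with a 4-byte big-endian length."""
    with socket.create_connection((host, port)) as sock:
        while True:
            (length,) = struct.unpack(">I", recv_exact(sock, 4))
            yield recv_exact(sock, length)  # hand off to display image processing unit 12

# for frame in receive_frames("imaging-device-70.local", 9000):
#     process_and_display(frame)   # hypothetical downstream step
```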
  • FIG. 6 shows still another configuration example of the display device 1. Blocks having the same functions as those in FIG. 4 are given the same reference numerals to avoid redundant explanation.
  • The configuration of FIG. 6 is the configuration of FIG. 4 provided with two imaging function systems (each consisting of an imaging unit 3, imaging control unit 11 and imaging signal processing unit 15).
  • That is, a first imaging function part with an imaging unit 3X and a second imaging function part with an imaging unit 3Y are provided.
  • The imaging units 3X and 3Y may each capture images in the same subject direction, or in different subject directions.
  • For example, the imaging units 3X and 3Y may both be arranged to image the front, or the imaging unit 3X may image the front while the imaging unit 3Y images the rear.
  • Alternatively, the imaging unit 3X may image the right rear while the imaging unit 3Y images the left rear, or the imaging unit 3X may image the user's feet while the imaging unit 3Y images the sky.
  • the captured image signal obtained at each imaging function part is processed by the display image processing unit 12, supplied to the display driving unit 13, and displayed on the display unit 2.
  • Configuration examples of the display device 1 have been shown in FIGS. 4, 5 and 6, but many more varied configurations are conceivable.
  • The imaging unit 3 (imaging lens 3a) may be fixed so that the subject direction cannot be changed, or attached via a movable mechanism so that the subject direction can be changed; when a plurality of imaging function parts are provided, configurations in which all are fixed, all are movable, or some are fixed and some are movable can all be assumed.
  • In the configurations of FIGS. 4, 5 and 6, the image signal processed by the display image processing unit 12 and supplied to the display driving unit 13 finally becomes the display image signal used for display on the display unit 2.
  • The display image signal is a signal for displaying an image as a scene different from the scene the user sees through the display unit 2 when the display unit 2 is in the through state.
  • The image carried by the display image signal, that is, the image as a scene different from the scene seen through the display unit 2 in the through state, may be obtained in various ways.
  • For example, the image captured by the imaging unit 3 (3X, 3Y) or the image received by the communication unit 26 may itself be an image of a scene different from the scene the user sees through the display unit 2.
  • Alternatively, an image equivalent to the scene the user sees through the display unit 2 may be turned into an image of a different scene by predetermined processing in the imaging signal processing unit 15 or the display image processing unit 12.
  • In the present embodiment, part of the screen area of the display unit 2 is set to the through state, and display based on the image signal output from the display image processing unit 12 is performed in another part. That is, an image is displayed in a partial area of the display unit 2 while the through-state area remains.
  • For example, a main screen area and a sub-screen area are set within the screen area, one of them is set to the through state, and the other is set to the image display state based on the display image signal.
  • Alternatively, the screen area may be divided so that one of the divided areas is in the through state and the other is in the image display state based on the display image signal.
  • In the examples of FIGS. 7A to 7F and 8A to 8D below, area AR1 is an area in the through state and area AR2 is an area where image display based on the display image signal is performed.
  • FIG. 7A shows an example in which area AR2 is set as a sub-screen at the lower right within area AR1 in the screen area of the display unit 2 and an image is displayed there.
  • FIG. 7B shows an example in which area AR2 is set as a sub-screen at the lower left within area AR1 and an image is displayed there.
  • When a sub-screen is set in this way, area AR2 can be placed at various positions within area AR1, such as the upper right, upper left, center, center right, or center left.
  • FIG. 7C shows an example in which the size of area AR2 is reduced.
  • FIG. 7D shows an example in which the size of area AR2 is enlarged.
  • FIG. 7E shows an example in which the screen area of the display unit 2 is divided equally into left and right areas AR1 and AR2.
  • FIG. 7F shows an example in which the screen area of the display unit 2 is divided equally into upper and lower areas AR1 and AR2. For example, when images are displayed in the configurations of FIGS. 4 and 5, it is possible to display an image in area AR2 while keeping area AR1 in the through state, as shown in FIGS. 7A to 7F.
  • One of the area configurations of FIGS. 7A, 7B, 7C, 7D, 7E and 7F may be selected, or the configurations may be switched among.
  • For example, the position of the sub-screen area AR2 can be changed as in FIG. 7A or 7B, and its size can be changed as in FIG. 7C or 7D.
  • It is also conceivable to switch the areas of FIGS. 7A to 7F so that area AR1 displays the image and area AR2 changes to the through state.
  • FIG. 8D shows an example in which the entire screen area is used as area AR2 and image display based on the display image signal is performed.
  • It is also possible to switch from a display state as in FIGS. 7A to 7F to a state in which the image is displayed full-screen with no through area, as in FIG. 8D.
  • FIG. 8A shows an example in which areas AR2a and AR2b are set as two sub-screens at the lower right and lower left of area AR1 in the screen area of the display unit 2 and images are displayed there.
  • FIG. 8B shows an example in which areas AR2a and AR2b are set to the left and right of area AR1 in the screen area of the display unit 2 and images are displayed there.
  • FIG. 8C shows an example in which areas AR2a and AR2b are set above and below area AR1 in the screen area of the display unit 2 and images are displayed there.
  • the areas AR1, AR2a, and AR2b may be divided into three equal parts.
  • FIG. 8A, FIG. 8B or FIG. 8C may be switched to the state where the image is displayed on the entire screen as shown in FIG. 8D.
  • When the display device 1 has three or more display image sources, three or more areas may be set as area AR2 so that images from the respective sources are displayed simultaneously; a sketch of the underlying layout computation follows below.
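  • The area configurations of FIGS. 7A to 7F and 8A to 8D all reduce to choosing one or more rectangles for area AR2 inside the screen and leaving the remainder in the through state. A sketch of that layout computation, with illustrative names and an assumed pixel coordinate system whose origin is at the top left:

```python
from dataclasses import dataclass

@dataclass
class Rect:
    x: int
    y: int
    w: int
    h: int

def subscreen(screen: Rect, frac: float, corner: str) -> Rect:
    """A sub-screen AR2 occupying `frac` of each screen dimension,
    placed in the given corner (FIGS. 7A to 7D)."""
    w, h = int(screen.w * frac), int(screen.h * frac)
    x = screen.x if "left" in corner else screen.x + screen.w - w
    y = screen.y if "top" in corner else screen.y + screen.h - h
    return Rect(x, y, w, h)

def split(screen: Rect, vertical: bool) -> tuple[Rect, Rect]:
    """Equal division of the screen into AR1 and AR2 (FIGS. 7E and 7F)."""
    if vertical:  # left and right halves
        half = screen.w // 2
        return (Rect(screen.x, screen.y, half, screen.h),
                Rect(screen.x + half, screen.y, screen.w - half, screen.h))
    half = screen.h // 2
    return (Rect(screen.x, screen.y, screen.w, half),
            Rect(screen.x, screen.y + half, screen.w, screen.h - half))

screen = Rect(0, 0, 800, 600)
print(subscreen(screen, 0.33, "bottom-right"))  # FIG. 7A style sub-screen
print(split(screen, vertical=True))             # FIG. 7E style equal division
```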
  • As described above, in the configurations of FIGS. 4, 5 and 6, the image signal processed by the display image processing unit 12 and supplied to the display driving unit 13 finally becomes the display image signal used for display on the display unit 2. In particular, this display image signal is displayed while a through-state area remains in part of the screen, as shown in FIGS. 7A to 7F and FIGS. 8A to 8D.
  • The display image signal is a signal for displaying an image as a scene different from the scene the user sees through the display unit 2 (that is, the scene seen in the through state).
  • The image carried by the display image signal is the image captured by the imaging unit 3 (or 3X, 3Y) or the image received by the communication unit 26.
  • By displaying an image of a scene different from the scene seen in the through state, the user's vision is expanded: the user sees the outside world through the through-state area while simultaneously viewing the display image in a partial area.
  • FIG. 9 shows the control process of the system controller 10.
  • In step F101, the system controller 10 performs control to put the display unit 2 into the through state via the display control unit 14.
  • For example, in the initial phase, the system controller 10 controls the entire screen of the display unit 2 to the through state in step F101.
  • While the entire screen is in the through state, the system controller 10 checks in step F102 whether a display start trigger has occurred.
  • For example, the operation trigger information generation unit 16 may be provided with an operation element the user can operate, and occurrence of a trigger to start the display state can be determined when the user operates that element.
  • Alternatively, the operation trigger information generation unit 16 may detect the user's situation or the situation of the outside world, and the system controller 10 may determine that a trigger to start display has occurred according to a predetermined condition.
  • When a display start trigger is detected, the system controller 10 advances to step F103 and performs display start control for the display image signal. That is, it controls the display unit 2 so that a display image signal based on an image signal captured by the imaging unit 3 (or 3X, 3Y) or an image signal received by the communication unit 26 is displayed.
  • At this time, image display is executed in a part of the screen.
  • While image display is being executed, the system controller 10 monitors whether a display switching trigger has occurred in step F104 and whether a display end trigger has occurred in step F105.
  • Occurrence of a display switching trigger in step F104 means that the system controller 10 has determined to switch the display image, based either on a user operation or on its own judgment of the user's situation or the outside world.
  • Display image switching includes, for example, switching of the display image content and switching of the area configuration.
  • Switching of the display image content may take a great variety of forms: image changes by zoom processing or focal-position changes in the imaging unit 3, image changes by changing the imaging sensitivity, image changes by signal processing in the imaging signal processing unit 15, image changes by changing the imaging direction (subject direction) when the imaging unit 3 is a movable camera, image changes by signal processing in the display image processing unit 12, and image changes by switching the source when there are multiple display image signal sources.
  • Switching of the area configuration includes changing the position of the parent/child screens, exchanging the parent and child screens, changing the position or proportions of divided screens, and switching to full-screen display.
  • For example, control switches from the state of FIG. 7A to the state of FIG. 7B, from the state of FIG. 7A to the state of FIG. 7E, or from the state of FIG. 7A to the state of FIG. 8D.
  • As for the display end trigger, for example, it can be determined that a trigger to end the display state has occurred when the user performs a predetermined operation; it is also possible to detect the user's situation or the situation of the outside world and have the system controller 10 determine that a display end trigger has occurred according to a predetermined condition.
  • When a display switching trigger occurs, the system controller 10 proceeds from step F104 to F106 and performs switching control of the image display operation. As a result, the content of the image displayed in the partial area, or the area configuration, is switched on the display unit 2.
  • The system controller 10 continues to monitor trigger occurrence in steps F104 and F105 even after controlling the display switching in step F106.
  • When a display end trigger occurs, the system controller 10 returns the processing from step F105 to F101, instructs the display control unit 14 to end the image display, and puts the entire surface back into the through state.
  • the system controller 10 performs control processing as shown in FIG. 9, for example.
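  • A sketch of the FIG. 9 control flow as a simple event loop; `display` and `triggers` are placeholder objects standing in for the display control unit 14 and the operation trigger information generation unit 16, and all names are illustrative assumptions:

```python
import time

def control_loop(display, triggers) -> None:
    """System controller processing corresponding to FIG. 9."""
    while True:
        display.set_through_full_screen()         # step F101: whole screen through
        while not triggers.display_start():       # step F102: wait for start trigger
            time.sleep(0.05)
        display.start_partial_display()           # step F103: image in part of screen
        while True:
            if triggers.display_switch():         # step F104: switching trigger?
                display.switch_display()          # step F106: change content or areas
            if triggers.display_end():            # step F105: end trigger?
                break                             # back to step F101
            time.sleep(0.05)
```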
  • In other words, image display is performed in step F103, and display switching is performed in step F106.
  • Examples of image display realized by these processes will be described with reference to FIGS. 10A and 10B, 11A and 11B, 12A and 12B, 13A and 13B, 14A and 14B, and 15A, 15B, 15C and 15D.
  • In FIGS. 10A and 10B through 14A and 14B, the configuration example of FIG. 4 is basically assumed, and the imaging unit 3 is arranged to capture the scene in front of the user, that is, a scene in the same direction as the scene seen through the through-state area.
  • FIG. 10A shows a case where the entire surface of the display unit 2 is in the through state. That is, the display unit 2 is a simple transparent plate-like body, and the user is viewing the view scene through the transparent display unit 2.
  • FIG. 10B shows a state in which, for example, an image captured by the image capturing unit 3 is displayed on the display unit 2 as the image display state.
  • For example, area AR1 is set to the through state while an image is displayed in area AR2.
  • Here, the imaging unit 3 images the area in front of the user, and zoom control is performed on it to execute telephoto imaging, so that an image of a scene (a telephoto image) different from the user's field of view through the through-state area AR1 is displayed in area AR2. The user can thus enjoy a telephoto image that cannot be seen with normal vision while keeping a normal view.
  • In this example a telephoto image is obtained by the operation of the imaging unit 3; conversely, by executing a wide-angle zoom, an image showing a near scene at a wide angle can be displayed on the display unit 2.
  • The telephoto/wide-angle control can be performed not only by zoom lens drive control in the imaging unit 3 but also by signal processing in the imaging signal processing unit 15.
  • Instead of zooming, the system controller 10 may instruct the imaging unit 3 to change its focal position so as to capture a near or distant scene, and the resulting image may be displayed in area AR2 of the display unit 2.
  • Fig. 11 A shows the case where the entire surface of display unit 2 is in the through state.
  • FIG. 11B shows an image display state in which the system controller 10 causes the display control unit 14 (display image processing unit 12 and display driving unit 13) to execute enlargement processing on the image obtained from the imaging unit 3.
  • That is, area AR1 is set to the through state and an enlarged image is displayed in area AR2, as shown in FIG. 11B.
  • In this way too, an image of a scene different from the user's field of view through the through-state area AR1 is displayed in area AR2, so the user can see an image that normal vision cannot provide while continuing to see the normal scene.
  • Conversely, the system controller 10 may cause the display control unit 14 (display image processing unit 12 and display driving unit 13) to execute reduction processing on the image obtained from the imaging unit 3 and display a reduced image in area AR2.
  • FIG. 12A shows the case where the entire surface of the display unit 2 is in the through state while the surroundings are particularly dark.
  • FIG. 12B shows the image display state.
  • In this state, the system controller 10 instructs the imaging control unit 11 (imaging unit 3 and imaging signal processing unit 15) to increase the imaging sensitivity, or instructs the imaging signal processing unit 15 or display image processing unit 12 to adjust the brightness level, contrast, sharpness and so on, thereby obtaining a clearer display image signal and displaying it.
  • In this example, area AR1 is kept in the through state while an image with increased brightness is displayed in area AR2.
  • Because a brightness-adjusted image of a scene different from the user's field of view through the through-state area AR1 is displayed in area AR2, the user can see an image that normal vision cannot provide.
  • It is also suitable to cause the illumination unit 4 to perform an illumination operation when performing such imaging.
  • Conversely, the system controller 10 can instruct the imaging control unit 11 (imaging unit 3 and imaging signal processing unit 15) to reduce the imaging sensitivity, or instruct the imaging signal processing unit 15 and display image processing unit 12 to adjust the brightness level, contrast and sharpness, thereby obtaining and displaying a display image signal that is not dazzling.
  • FIG. 13A shows the case where the entire surface of the display unit 2 is in the through state; for example, when the user is in a dark bedroom where a child is sleeping, it is almost pitch dark and nothing can be seen.
  • Fig. 13B shows the image display state.
  • The system controller 10 instructs the imaging control unit 11 (imaging unit 3 and imaging signal processing unit 15) to increase the infrared imaging sensitivity, and an image captured with increased infrared sensitivity is displayed in area AR2.
  • In other words, the image lets the user confirm the child's sleeping face in the dark room, so the user can view a night-vision image that cannot be seen with normal vision.
  • Fig. 14A shows the case where the entire surface of display unit 2 is in the through state.
  • FIG. 14B shows the image display state.
  • In this case, the system controller 10 instructs the imaging control unit 11 (imaging unit 3 and imaging signal processing unit 15) to increase the ultraviolet imaging sensitivity, and an image captured with increased ultraviolet sensitivity is displayed. This allows the user to view an image representing ultraviolet components that cannot be seen with normal vision.
  • FIGS. 10A and 10B, 11A and 11B, 12A and 12B, 13A and 13B, and 14A and 14B show examples in which the imaging unit 3 is arranged to image the area in front of the user in the configuration of FIG. 4.
  • FIGS. 15A, 15B, 15C and 15D show examples in which the imaging unit 3 of FIG. 4 (or the imaging units 3X and 3Y of FIG. 6) is arranged to capture a direction different from the direction the user sees as the front.
  • FIG. 15A shows the case where the entire display unit 2 is in the through state.
  • When the imaging unit 3 is arranged to capture the area behind the user, the image display state is as shown in FIG. 15B: area AR1 is set to the through state and an image of the scene behind the user is displayed in area AR2.
  • When the imaging unit 3 is arranged to capture the area above the user, the image display state is as shown in FIG. 15C: area AR1 is set to the through state and an image of the scene above the user is displayed in area AR2.
  • FIG. 15D shows an example of image display in a configuration provided with a plurality of imaging function parts as in FIG. 6, with the imaging units 3X and 3Y arranged to capture the right rear and the left rear of the user, respectively. That is, area AR1 is set to the through state, and images of the right rear and left rear of the user are displayed in areas AR2a and AR2b, respectively.
  • In this case, images of scenes different from the user's field of view through the through-state area AR1 are displayed in areas AR2a and AR2b.
  • Examples will now be given of the display image signal displayed together with the through-state area on the screen, that is, of display image signals for displaying an image as a scene different from the scene the user sees through the display unit 2 when the display unit 2 is in the through state.
  • First, when the imaging unit 3 is arranged to capture the area in front of the user (that is, to capture the scene that can be seen in the through state), examples of the display image signal based on the imaging signal obtained by the imaging unit 3 are as follows.
  • the display image signal obtained by a combination of the above operations or signal processing
  • the imaging unit 3 captures a scene in front of the user that can be viewed in the through state
  • the display image signal is obtained by such operations and processing.
  • the display image signal becomes a signal for displaying an image as a scene different from the scene visually recognized by the user through the display unit 2 when the display unit 2 is in the through state.
  • the present invention is not limited to those listed above, and is obtained by the operation of the imaging unit 3, the signal processing of the imaging signal processing unit 15, and the signal processing of the display image processing unit 12.
  • Various display image signals can be considered.
  • Next, when the imaging unit 3 is arranged to capture a direction different from the front of the user, examples of the display image signal based on the imaging signal obtained by the imaging unit 3 can be considered as follows. A captured image obtained by normal imaging in this case is already an image of a scene (for example, the scene behind, above, at the feet of, to the right of, or to the left of the user) that differs from the scene the user normally sees through the display unit 2 in the through state, so the captured image signal may be displayed on the display unit 2 as the display image signal as it is. Alternatively, an image signal to which the imaging operation of the imaging unit 3, the signal processing of the imaging signal processing unit 15, and the signal processing of the display image processing unit 12 have been applied may be used as the display image signal.
  • It is also possible to perform movable control that tracks a specific target. For example, image analysis is performed on the captured image signal, and when a specific target is detected, the imaging direction is changed in accordance with the movement of that target within the captured image. Such control allows the user to see, for example in the area AR2, an image that tracks the specific target; a rough sketch of such tracking appears below. The same applies to the case where a captured image of the external imaging device 70 received by the communication unit 26 is displayed.
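  • The following is a rough, hedged sketch of that tracking control, using inter-frame differencing as one of many possible image-analysis methods; the gain, threshold, and pan-angle interface are invented for illustration:

      import numpy as np

      def track_step(prev_frame, frame, pan_deg, gain=0.05, thresh=30):
          """One control step: find the centroid of inter-frame motion and
          nudge the (hypothetical) pan angle of the imaging unit toward it."""
          diff = np.abs(frame.astype(int) - prev_frame.astype(int)).mean(axis=2)
          ys, xs = np.nonzero(diff > thresh)
          if xs.size == 0:
              return pan_deg                       # no moving target: hold direction
          error = xs.mean() - frame.shape[1] / 2   # offset from image center (px)
          return pan_deg + gain * error            # proportional steering command

      prev = np.zeros((100, 100, 3), np.uint8)
      cur = prev.copy()
      cur[40:60, 70:90] = 255                      # a target appears right of center
      print(track_step(prev, cur, pan_deg=0.0))    # positive: pan to the right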
  • That is, since the captured image of the external imaging device 70 is also a captured image of a scene different from the scene normally viewed by the user through the display unit 2 in the through state, a wide variety of images can be provided to the user by displaying the received image signal on the display unit 2 as the display image signal, either as it is or after processing.
  • In that case as well, an image signal to which the signal processing of the display image processing unit 12 described above has been applied may be used as the display image signal.
  • For example, video from an imaging device 70 that captures images at another location in a stadium can be received by the communication unit 26 and displayed in the area AR2 of the display unit 2 as shown in Fig. 10B.
  • Images from an imaging device 70 installed in the vicinity of a coach's seat, or from a small imaging device 70 worn by a referee, may be used in the same way.
  • The communication unit 26 may also receive an image captured by an imaging device 70 installed in a resort area, or by an imaging device 70 carried by a traveling acquaintance, and display it in the area AR2 of the display unit 2.
  • As a further example, a ground image (bird's-eye view image) captured by an imaging device 70 mounted on an aircraft or a satellite can be received by the communication unit 26 and displayed in the area AR2 of the display unit 2. Such displays let the user enjoy scenery that is not normally visible.
  • As described above, the system controller 10 executes display in response to determining in step F102 that a display start trigger has occurred, switches the display in response to determining in step F104 that a display switching trigger has occurred, and terminates the display operation when it determines in step F105 that a display end trigger has occurred.
  • The triggers for these display operations may be generated by user operations, but the system controller 10 may also detect the user's situation or the situation of the outside world and determine, according to predetermined conditions, that a trigger has occurred. The following describes examples of determining whether a trigger has occurred based on the user's situation or the situation of the outside world.
  • When a trigger is determined from the user's situation, the display device 1 is provided with, for example, a visual sensor, an acceleration sensor, a gyro, a biological sensor, and the like as the operation trigger information generation unit 16.
  • the visual sensor detects information related to the user's vision.
  • The visual sensor can be formed, for example, by an imaging unit that is arranged in the vicinity of the display unit 2 and images the user's eye.
  • The system controller 10 acquires the image of the user's eye captured by that imaging unit and analyzes it to detect the gaze direction, focal distance, degree of pupil dilation, fundus pattern, eyelid opening and closing, and so on, and can determine the user's situation and intention on that basis.
  • Alternatively, the visual sensor can be formed by a light emitting unit that is arranged in the vicinity of the display unit 2 and irradiates the user's eye with light, and a light receiving unit that receives the light reflected from the eye.
  • By detecting the user's gaze direction, for example, the system controller 10 can determine which part of the image displayed on the display unit 2 the user is paying attention to.
  • The system controller 10 can also treat the user's gaze direction as an operation input; for example, moving the line of sight to the left or right can serve as a predetermined operation input requested of the display device 1.
  • By detecting the user's focal distance, it is possible to determine whether the scene the user is paying attention to is far away or nearby, and zoom control, enlargement/reduction control, focus change control, and the like can be performed accordingly; for example, a telephoto display can be performed when the user looks into the distance. Imaging sensitivity adjustment can also be performed.
  • Detection of the user's fundus pattern can be used, for example, for personal authentication. Since the fundus pattern is unique to each person, the wearing user can be identified so that control appropriate for that user is performed, or so that the display operation is executed only for a specific user.
  • Also, by detecting the opening and closing of the user's eyelids, for example deliberately blinking three times can be determined to be a predetermined operation input, as in the sketch below.
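  • One way to picture such a blink-based operation input is a small counter over eyelid open/close events; the event timestamps and the time window below are purely illustrative:

      def detect_triple_blink(blink_times, window_s=1.5):
          """Return True when the last three blinks fall inside the time window,
          which the controller may treat as a predetermined operation input."""
          recent = sorted(blink_times)[-3:]
          return len(recent) == 3 and (recent[-1] - recent[0]) <= window_s

      print(detect_triple_blink([0.2, 0.7, 1.1]))   # True: three blinks in 0.9 s
      print(detect_triple_blink([0.2, 2.0, 4.5]))   # False: spread over 4.3 s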
  • Next, the acceleration sensor and the gyro output signals corresponding to the user's movement.
  • For example, the acceleration sensor is suitable for detecting movement in a linear direction, while the gyro is suitable for detecting the movement and vibration of a rotating system. Depending on where they are arranged, the acceleration sensor and the gyro can detect the movement of the user's whole body or of parts of the body.
  • In that case, the detection information of the acceleration sensor is acceleration information corresponding to the movement of the user's head or whole body, and the detection information of the gyro is angular velocity and vibration information corresponding to that movement.
  • This lets the system controller 10 detect behavior in which the user moves the head from the neck, and it can also recognize such behavior as a conscious operation by the user; for example, swinging the head to the left twice can be a predetermined operation input.
  • Using the acceleration sensor and the gyro, it is also possible to determine whether the user is stationary (not walking), walking, or running, and to detect when the user sits down from a standing position or stands up; a toy classification sketch follows. Further, if an acceleration sensor or gyro is provided separately from the head-mounted unit and attached to an arm or a leg, the behavior of just the arm or just the leg can also be detected.
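  • A toy sketch of distinguishing stationary, walking, and running states from acceleration samples; the variance thresholds are invented for illustration and would need calibration against real sensor data:

      import statistics

      def classify_motion(accel_magnitudes, still_var=0.05, walk_var=1.0):
          """Classify user motion from the variance of acceleration magnitude
          over a short window (units m/s^2; thresholds are illustrative)."""
          v = statistics.pvariance(accel_magnitudes)
          if v < still_var:
              return "stationary"
          return "walking" if v < walk_var else "running"

      print(classify_motion([9.80, 9.81, 9.79, 9.80]))       # stationary
      print(classify_motion([9.0, 11.5, 8.2, 12.0, 7.9]))    # running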
  • The biological sensor detects, for example, heart rate information, pulse information, perspiration information, electroencephalogram information (for example, information on alpha, beta, theta, and delta waves), galvanic skin response, body temperature, blood pressure, and respiratory activity. From this information, the system controller 10 can determine whether the user is in a tense or excited state, in an emotionally calm state, or in a comfortable or uncomfortable state.
  • It is also possible to detect from biological information whether the user is wearing the display device 1. For example, while the user is not wearing it, the system controller 10 controls the device to a standby state in which only biological information detection is performed, turns on the power when the biological information indicates that the user has put the device on, and returns to the standby state when the user removes the display device 1.
  • The detection information from the biological sensor can also be used for personal authentication (identification of the individual wearer).
  • The biological sensor may be arranged inside the wearing frame of the glasses-type display device 1 so that the above information is detected at, for example, the user's temporal region or occipital region, or it may be attached to a predetermined part of the body separately from the frame.
  • Next, to acquire outside-world information, the display device 1 uses, as the operation trigger information generation unit 16, a surrounding environment sensor, an imaging target sensor, a GPS reception unit, a date/time counting unit, an image analysis unit, and the communication unit 26.
  • As the surrounding environment sensor, an illuminance sensor, a temperature sensor, a humidity sensor, an atmospheric pressure sensor, and the like are assumed.
  • The illuminance sensor can detect information on the brightness around the display device 1.
  • The temperature sensor, humidity sensor, and atmospheric pressure sensor can provide information for determining the temperature, humidity, atmospheric pressure, and weather.
  • Because these surrounding environment sensors allow the display device 1 to determine the ambient brightness and, when outdoors, the weather conditions, the system controller 10 can use them as outside-world information to execute and control a display image signal generation operation suited to the ambient brightness or weather; for example, the brightness level of the image can be raised or lowered according to the surrounding brightness, or the atmosphere of the image can be changed according to the weather, as in the simple mapping sketched below.
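  • The brightness-level adjustment might look like the following mapping from an illuminance reading to a display brightness level; the breakpoints are hypothetical, not taken from the patent:

      def display_brightness(illuminance_lux):
          """Map ambient illuminance to a display brightness in [0.2, 1.0];
          the breakpoints below are illustrative only."""
          if illuminance_lux < 10:      # dark room / night
              return 0.2
          if illuminance_lux < 1000:    # typical indoor lighting
              return 0.5
          return 1.0                    # daylight outdoors

      for lux in (3, 250, 20000):
          print(lux, "->", display_brightness(lux))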
  • the imaging target sensor detects information about the imaging target.
  • As the imaging target sensor, a distance-measuring sensor or a pyroelectric sensor, for example, can be used, making it possible to obtain the distance to the imaging target and information for identifying the target itself.
  • By detecting the distance to the imaging target, the system controller 10 can execute and control imaging and display operations corresponding to that distance; likewise, when it is detected that the imaging target is, for example, a living body such as a person, the generation operation of the display image signal corresponding to that target can be controlled.
  • The GPS reception unit acquires latitude and longitude information as the current position. When the latitude and longitude have been detected, information on the place at (or near) the current position can be obtained by referring to a map database.
  • For example, if a relatively large-capacity recording medium such as an HDD (Hard Disk Drive) or flash memory that can be referenced by the system controller 10 is mounted and a map database is stored on it, information related to the current position can be obtained.
  • Even if the display device 1 does not have a built-in map database, it may access a network server or a device with a built-in map database via the communication unit 26, transmit the latitude and longitude to request information corresponding to the current position, and receive that information.
  • Information related to the current position includes name information for the vicinity, such as place names, building names, facility names, shop names, and station names.
  • Information related to the current position also includes information indicating the type of structure, such as a park, theme park, concert hall, theater, movie theater, or sports facility.
  • It further includes information on the types and names of natural objects, such as coasts, seas, rivers, mountains, mountain peaks, forests, lakes, and plains.
  • As more detailed position information, an area within a theme park, a spectator-seat area in a baseball or soccer stadium, a seating area in a concert hall, and the like can also be obtained as information on the current position.
  • By acquiring such information on the current position, the system controller 10 can execute and control the display image signal generation operation according to the current position, the geographical conditions near the current point, the facility, and so on, or perform display start or display end control at a specific place; a sketch of the position lookup appears below.
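  • Requesting information for the current position from a network server might be shaped like the following; the URL and response fields are hypothetical, since the patent only requires that latitude and longitude be sent and related information received:

      import json
      import urllib.request

      def lookup_position(lat, lon, server="https://example.com/mapdb"):
          """Send the current latitude/longitude and receive information
          related to the current position (place names, facility types...)."""
          url = f"{server}?lat={lat}&lon={lon}"
          with urllib.request.urlopen(url) as resp:
              return json.load(resp)    # e.g. {"place": ..., "type": ...}

      # info = lookup_position(35.68, 139.76)  # would query the hypothetical server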
  • The date/time counting unit counts, for example, the year, month, day, hour, minute, and second. It allows the system controller 10 to recognize the current time, day or night, the month, the season, and so on, so that, for example, a display image signal generation operation suited to day or night (the time) or to the current season can be executed and controlled.
  • When an image analysis unit for analyzing the captured image is provided, the following various types of information about the imaging target can be detected from the captured image.
  • As the type of imaging target, persons, animals, natural objects, buildings, equipment, and the like included in the captured image can be identified. For example, among animals, a situation in which a bird is being imaged as the subject can be distinguished from one in which a cat is being imaged. Among natural objects, the sea, mountains, trees, rivers, lakes, the sky, the sun, and the moon can be determined. Among structures, houses, buildings, and stadiums can be distinguished. As equipment, personal computers, AV (Audio-Visual) equipment, mobile phones, PDAs, IC cards, two-dimensional barcodes, and the like can be identified.
  • The type of imaging target can be determined by registering the characteristics of various shapes in advance and checking whether a subject matching one of those shapes is included in the captured image.
  • Image analysis by the image analysis unit can also detect movement of the subject, for example quick movement within the image, by a method such as taking the difference between successive frames. This makes it possible to detect a situation in which a fast-moving subject is being imaged, for example a player during a sporting event or a moving car.
  • Image analysis by the image analysis unit can also be used to determine the surrounding situation; for example, the brightness due to day, night, or weather can be determined, and even the intensity of rain can be recognized.
  • Further, when a person's face is included in the captured image, the face can be converted into personal feature data expressed as relative position information of the constituent elements of the face. For example, the ratio Ed/EN of the distance Ed between the eyes to the distance EN between the center of the eyes and the nose, and the ratio Ed/EM of the distance Ed to the distance EM between the center of the eyes and the mouth, can be used as feature data specific to an individual. The image analysis unit can detect such personal feature data by analyzing the captured image; a direct transcription of the ratio computation follows.
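  • The feature ratios can be computed directly from landmark coordinates, here invented for illustration and assumed to come from some face detector:

      import math

      def dist(p, q):
          return math.hypot(p[0] - q[0], p[1] - q[1])

      def feature_ratios(left_eye, right_eye, nose, mouth):
          """Personal feature data as relative positions of facial parts:
          Ed = eye spacing, EN = eye-center-to-nose, EM = eye-center-to-mouth."""
          eye_center = ((left_eye[0] + right_eye[0]) / 2,
                        (left_eye[1] + right_eye[1]) / 2)
          Ed = dist(left_eye, right_eye)
          return Ed / dist(eye_center, nose), Ed / dist(eye_center, mouth)

      print(feature_ratios((120, 100), (180, 100), (150, 140), (150, 180)))
      # (1.5, 0.75) for these made-up landmark positions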
  • For example, if an HDD (Hard Disk Drive) or flash memory is installed as a recording medium that can be referenced by the system controller 10 and a person database is stored on it, information about the individual who appears as the subject can be obtained.
  • Even if the display device 1 does not have a built-in person database, it may access a network server or a device with a built-in person database via the communication unit 26, transmit the personal feature data to request information, and receive information on the specific person.
  • With such a database, the system controller 10 can search for a person's information when that person is encountered (imaged). For example, if a person database in which celebrity information is registered together with personal feature data is prepared, information on a celebrity can be retrieved when the user meets that celebrity.
  • Based on the information detected by the image analysis unit, the system controller 10 can execute and control display image signal generation processing corresponding to the imaging target; for example, when a specific target or a specific person is imaged, a display image signal that highlights that target may be generated.
  • In addition, through the communication unit 26, various kinds of information can be acquired as outside-world information.
  • For example, information retrieved by an external device according to the latitude and longitude or the personal feature data transmitted from the display device 1 can be acquired.
  • Weather-related information such as weather conditions, temperature, and humidity can be acquired from external devices.
  • Facility usage information, and information on the prohibition or permission of imaging within a facility, can also be acquired from external devices.
  • Identification information of an external device itself can be acquired, for example the device type or device ID by which the device is identified as a network device under a given communication protocol.
  • Furthermore, image data stored in an external device, image data being reproduced or displayed on an external device, image data received by an external device, and the like can be acquired.
  • Based on such information acquired by the communication unit 26, the system controller 10 can execute and control display image signal generation processing.
  • By providing components such as those in the above examples as the operation trigger information generation unit 16 to detect the user's situation and the situation of the outside world, and by performing display start/end and display switching (changing the display contents or switching the area configuration) according to those situations, an appropriate or interesting display operation can be realized without any user operation; a schematic dispatch loop is sketched below.
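  • Pulling these pieces together, the trigger determination could be imagined as a polling loop over sensor readings that dispatches start, switch, and end decisions; the conditions below merely stand in for the predetermined conditions of the text, and F102/F104/F105 refer to the flowchart steps mentioned earlier:

      import time

      class DisplayController:
          displaying = False
          def start_display(self):
              self.displaying = True
              print("display start")        # display start trigger (cf. F102)
          def end_display(self):
              self.displaying = False
              print("display end")          # display end trigger (cf. F105)
          # a display switching trigger (cf. F104) would be handled analogously

      def trigger_loop(sensors, controller, period_s=0.5, cycles=10):
          """Schematic dispatch over user/outside-world readings; `sensors`
          is a dict of callables returning the latest values."""
          for _ in range(cycles):
              lux, motion = sensors["illuminance"](), sensors["motion"]()
              if not controller.displaying and lux < 10:
                  controller.start_display()
              elif controller.displaying and motion == "running":
                  controller.end_display()
              time.sleep(period_s)

      ctrl = DisplayController()
      readings = {"illuminance": lambda: 5, "motion": lambda: "stationary"}
      trigger_loop(readings, ctrl, period_s=0.0, cycles=2)   # prints "display start"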
  • As described above, in the embodiment, a part of the screen area of the display unit 2 is set to the through state while display by the display image signal is executed.
  • This makes it possible to see an image of a scene different from the normally visible scene while the normal scene remains visible through the through-state area; for example, while looking ahead through the through-state area, the user can view a telephoto image, an enlarged image, a specially captured image, an image of the scene behind, and so on. The user therefore sees, at the same time as the normal visual scene, a sight that cannot be seen with normal vision, creating a situation that artificially expands the user's visual ability.
  • The advantages of the display device 1 can thus be obtained effectively within the user's normal life.
  • The appearance and configuration of the display device 1 are not limited to the examples in FIG. 1A and FIG. 1B, FIG. 2A and FIG. 2B, FIG. 4, and FIG. 6; various modifications are possible.
  • a storage unit that records an image signal captured by the imaging unit 3 on a recording medium may be provided.
  • an operation of transmitting the captured image signal from the communication unit 26 to another device may be performed.
  • Also, an image reproduced from the recording medium by the storage unit may be displayed on the display unit 2 as the display image signal.
  • The data reproduced from the recording medium may be any data that would be recorded and displayed: for example, moving image content such as movies and video clips; still image content captured by a digital still camera or the like and recorded on the recording medium; data such as electronic books; computer-use data such as image data, text data, and spreadsheet data created by the user on a personal computer and recorded on the recording medium; and game images based on game programs recorded on the recording medium.
  • It is also conceivable that the communication unit 26 not only receives an image signal from the external imaging device 70 but also receives an image (moving image or still image) provided from an external content source device and displays it on the display unit 2.
  • Content source devices can be assumed to include, for example, AV (Audio-Visual) devices such as video players, television tuners, and home server devices, and information processing devices such as personal computers, PDAs (Personal Digital Assistants), and mobile phones.
  • Such content source devices can be variously conceived: devices owned by the user of the display device 1 or by an acquaintance, or server devices of public institutions or service companies that provide various contents.
  • Examples of data transmitted from the content source device to the display device 1 include moving image content such as movies and video clips; still image content captured by a digital still camera and recorded on a recording medium; data such as electronic books; computer-use data such as image data, text data, and spreadsheet data created by the user on a personal computer; and game images. Any displayable data is assumed.
  • A microphone that collects ambient sound during imaging, or an earphone-type speaker unit that outputs sound, may also be provided.
  • It is also conceivable to provide a character recognition unit that recognizes characters in an image and a speech synthesis unit that performs speech synthesis processing, so that when the captured image contains characters, the speech synthesis unit generates an audio signal of a read-aloud voice and outputs it from the speaker unit; one possible realization is sketched below.
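  • As one possible realization of this read-aloud path, using widely available libraries as stand-ins for the character recognition and speech synthesis units (the patent does not name any implementation; pytesseract additionally requires the Tesseract OCR engine to be installed):

      from PIL import Image
      import pytesseract   # stand-in for the character recognition unit
      import pyttsx3       # stand-in for the speech synthesis unit

      def read_characters_aloud(image_path):
          text = pytesseract.image_to_string(Image.open(image_path))
          if text.strip():                  # only speak if characters were found
              engine = pyttsx3.init()
              engine.say(text)              # generate the read-aloud audio signal
              engine.runAndWait()           # output from the speaker unit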
  • The display image signal may also be a still image; for example, the imaging unit 3 captures a still image at the timing of a predetermined trigger, and the captured still image is displayed on the display unit 2.
  • Although the example describes the display device 1 as having a spectacle-type or head-mounted mounting unit, the display device of the present invention need only be configured to perform display in front of the user's eyes; it may be worn with any kind of mounting unit, for example a headphone type, a neckband type, or an ear-hook type. Further, it may be attached to the user by fixing it to ordinary glasses, a visor, headphones, or the like with an attachment such as a clip.
  • As described above, according to the present invention, a part of the screen area of the display unit is set to the through state while the display unit executes display by the display image signal generated by the image signal generation means. The user can therefore see an image of a scene different from the normally viewed scene while the normal visual scene remains visible through the through-state area, which has the effect of creating a situation that artificially expands the user's visual ability.
  • In addition, since at least part of the screen remains in the through state, normal life is not hindered even while the display unit is worn, so the advantages of the display device of the present invention can be obtained effectively within the user's normal life.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Health & Medical Sciences (AREA)
  • Social Psychology (AREA)
  • General Health & Medical Sciences (AREA)
  • Optics & Photonics (AREA)
  • Computer Hardware Design (AREA)
  • Marketing (AREA)
  • Business, Economics & Management (AREA)
  • Databases & Information Systems (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Controls And Circuits For Display Device (AREA)
  • Control Of Indicators Other Than Cathode Ray Tubes (AREA)
PCT/JP2007/070397 2006-10-16 2007-10-12 Écran et procédé d'affichage Ceased WO2008047889A1 (fr)

Priority Applications (6)

Application Number Priority Date Filing Date Title
US12/445,477 US8681256B2 (en) 2006-10-16 2007-10-12 Display method and display apparatus in which a part of a screen area is in a through-state
CA2666692A CA2666692C (en) 2006-10-16 2007-10-12 Display apparatus, display method
EP07830131.4A EP2081182B1 (en) 2006-10-16 2007-10-12 Display device and display method
US14/172,521 US9182598B2 (en) 2006-10-16 2014-02-04 Display method and display apparatus in which a part of a screen area is in a through-state
US14/734,308 US9846304B2 (en) 2006-10-16 2015-06-09 Display method and display apparatus in which a part of a screen area is in a through-state
US15/811,227 US20180067313A1 (en) 2006-10-16 2017-11-13 Display method and display apparatus in which a part of a screen area is in a through-state

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2006-281042 2006-10-16
JP2006281042A JP5228307B2 (ja) 2006-10-16 2006-10-16 表示装置、表示方法

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US12/445,477 A-371-Of-International US8681256B2 (en) 2006-10-16 2007-10-12 Display method and display apparatus in which a part of a screen area is in a through-state
US14/172,521 Continuation US9182598B2 (en) 2006-10-16 2014-02-04 Display method and display apparatus in which a part of a screen area is in a through-state

Publications (1)

Publication Number Publication Date
WO2008047889A1 true WO2008047889A1 (fr) 2008-04-24

Family

ID=39314102

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2007/070397 Ceased WO2008047889A1 (fr) 2006-10-16 2007-10-12 Écran et procédé d'affichage

Country Status (8)

Country Link
US (4) US8681256B2 (en)
EP (1) EP2081182B1 (en)
JP (1) JP5228307B2 (en)
KR (1) KR20090069335A (en)
CN (2) CN101542584A (en)
CA (1) CA2666692C (en)
TW (1) TWI428903B (en)
WO (1) WO2008047889A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015149612A1 (en) * 2014-04-01 2015-10-08 Beijing Zhigu Rui Tuo Tech Co., Ltd Image presentation control methods and image presentation control apparatuses
WO2016157677A1 (ja) * 2015-03-31 2016-10-06 ソニー株式会社 情報処理装置、情報処理方法及びプログラム
US20170043252A1 (en) * 2008-10-24 2017-02-16 Excalibur Ip, Llc Reconfiguring reality using a reality overlay device
WO2023173323A1 (zh) * 2022-03-16 2023-09-21 深圳市大疆创新科技有限公司 无人机的功耗控制方法、装置、系统及存储介质

Families Citing this family (104)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5119636B2 (ja) 2006-09-27 2013-01-16 ソニー株式会社 表示装置、表示方法
JP5017989B2 (ja) 2006-09-27 2012-09-05 ソニー株式会社 撮像装置、撮像方法
JP2008096868A (ja) 2006-10-16 2008-04-24 Sony Corp 撮像表示装置、撮像表示方法
JP5228307B2 (ja) * 2006-10-16 2013-07-03 ソニー株式会社 表示装置、表示方法
WO2009094587A1 (en) * 2008-01-23 2009-07-30 Deering Michael F Eye mounted displays
US9812096B2 (en) 2008-01-23 2017-11-07 Spy Eye, Llc Eye mounted displays and systems using eye mounted displays
US8786675B2 (en) * 2008-01-23 2014-07-22 Michael F. Deering Systems using eye mounted displays
DE102008049407A1 (de) * 2008-09-29 2010-04-01 Carl Zeiss Ag Anzeigevorrichtung und Anzeigeverfahren
USRE45394E1 (en) 2008-10-20 2015-03-03 X6D Limited 3D glasses
JP5651386B2 (ja) * 2010-06-23 2015-01-14 ソフトバンクモバイル株式会社 眼鏡型表示装置
JP5700963B2 (ja) * 2010-06-29 2015-04-15 キヤノン株式会社 情報処理装置およびその制御方法
EP2591440A1 (en) * 2010-07-05 2013-05-15 Sony Ericsson Mobile Communications AB Method for displaying augmentation information in an augmented reality system
JP5499985B2 (ja) * 2010-08-09 2014-05-21 ソニー株式会社 表示装置組立体
US9510765B2 (en) * 2010-11-24 2016-12-06 Awear Technologies, Llc Detection and feedback of information associated with executive function
US9081416B2 (en) 2011-03-24 2015-07-14 Seiko Epson Corporation Device, head mounted display, control method of device and control method of head mounted display
JP5953714B2 (ja) * 2011-11-24 2016-07-20 セイコーエプソン株式会社 装置、頭部装着型表示装置、装置の制御方法および頭部装着型表示装置の制御方法
CN102693068B (zh) * 2011-03-25 2014-10-22 京东方科技集团股份有限公司 一种视觉扩展显示设备和方法
CA2750287C (en) 2011-08-29 2012-07-03 Microsoft Corporation Gaze detection in a see-through, near-eye, mixed reality display
US9025252B2 (en) 2011-08-30 2015-05-05 Microsoft Technology Licensing, Llc Adjustment of a mixed reality display for inter-pupillary distance alignment
US9213163B2 (en) 2011-08-30 2015-12-15 Microsoft Technology Licensing, Llc Aligning inter-pupillary distance in a near-eye display system
EP2751609B1 (en) 2011-08-30 2017-08-16 Microsoft Technology Licensing, LLC Head mounted display with iris scan profiling
KR20190133080A (ko) * 2011-09-19 2019-11-29 아이사이트 모빌 테크놀로지 엘티디 증강 현실 시스템용 터치프리 인터페이스
US8941560B2 (en) * 2011-09-21 2015-01-27 Google Inc. Wearable computer with superimposed controls and instructions for external device
JP6105850B2 (ja) * 2012-03-16 2017-03-29 富士通株式会社 携帯端末装置、表示制御方法及び表示制御プログラム
JP6008086B2 (ja) * 2012-03-26 2016-10-19 セイコーエプソン株式会社 頭部装着型表示装置
US9671566B2 (en) 2012-06-11 2017-06-06 Magic Leap, Inc. Planar waveguide apparatus with diffraction element(s) and system employing same
US9219901B2 (en) * 2012-06-19 2015-12-22 Qualcomm Incorporated Reactive user interface for head-mounted display
JP5694257B2 (ja) * 2012-08-27 2015-04-01 株式会社東芝 表示装置、表示方法及びプログラム
US10620902B2 (en) * 2012-09-28 2020-04-14 Nokia Technologies Oy Method and apparatus for providing an indication regarding content presented to another user
US9894269B2 (en) * 2012-10-31 2018-02-13 Atheer, Inc. Method and apparatus for background subtraction using focus differences
JP6040715B2 (ja) * 2012-11-06 2016-12-07 ソニー株式会社 画像表示装置及び画像表示方法、並びにコンピューター・プログラム
WO2014077046A1 (ja) * 2012-11-13 2014-05-22 ソニー株式会社 画像表示装置及び画像表示方法、移動体装置、画像表示システム、並びにコンピューター・プログラム
JP6023568B2 (ja) * 2012-11-30 2016-11-09 オリンパス株式会社 頭部装着型装置
CN103024582B (zh) * 2012-12-26 2016-09-21 深圳Tcl新技术有限公司 多屏显示控制方法及多屏显示装置
JP6375591B2 (ja) * 2013-01-15 2018-08-22 セイコーエプソン株式会社 頭部装着型表示装置、頭部装着型表示装置の制御方法、および、画像表示システム
JP6333801B2 (ja) * 2013-02-19 2018-05-30 ミラマ サービス インク 表示制御装置、表示制御プログラム、および表示制御方法
KR102516124B1 (ko) 2013-03-11 2023-03-29 매직 립, 인코포레이티드 증강 및 가상 현실을 위한 시스템 및 방법
NZ712192A (en) 2013-03-15 2018-11-30 Magic Leap Inc Display system and method
US9819627B2 (en) * 2013-07-01 2017-11-14 Paypal, Inc. Location restricted message exchange system
JP6252002B2 (ja) * 2013-07-11 2017-12-27 セイコーエプソン株式会社 頭部装着型表示装置および頭部装着型表示装置の制御方法
CN104427226B (zh) * 2013-08-20 2018-08-31 联想(北京)有限公司 图像采集方法和电子设备
KR102081934B1 (ko) * 2013-08-28 2020-02-26 엘지전자 주식회사 헤드 마운트 디스플레이 디바이스 및 그 제어 방법
KR20150045637A (ko) * 2013-10-21 2015-04-29 삼성전자주식회사 사용자 인터페이스 운용 방법 및 그 전자 장치
US9772495B2 (en) * 2013-11-04 2017-09-26 Weng-Kong TAM Digital loupe device
US9672649B2 (en) 2013-11-04 2017-06-06 At&T Intellectual Property I, Lp System and method for enabling mirror video chat using a wearable display device
JP6264014B2 (ja) * 2013-12-17 2018-01-24 セイコーエプソン株式会社 頭部装着型表示装置
CN106030493B (zh) * 2013-12-18 2018-10-26 前视红外系统股份公司 基于滑动手势处理红外图像
US9993335B2 (en) 2014-01-08 2018-06-12 Spy Eye, Llc Variable resolution eye mounted displays
CN106030692B (zh) * 2014-02-20 2019-11-15 索尼公司 显示控制装置、显示控制方法及计算机程序
JP2015159383A (ja) 2014-02-21 2015-09-03 ソニー株式会社 ウェアラブル機器、制御装置、撮影制御方法および自動撮像装置
USD817353S1 (en) * 2014-03-07 2018-05-08 Sony Corporation Display panel or screen with an icon
CN106464797B (zh) * 2014-06-03 2019-06-11 索尼公司 获得标记有修正的图像方向的数字图像的方法和电子装置
KR101652475B1 (ko) * 2014-06-11 2016-08-30 안병영 후방 영상이 나타나는 안경
KR102217561B1 (ko) * 2014-06-23 2021-02-19 엘지전자 주식회사 헤드 마운티드 디스플레이 및 그것의 제어방법
JP6638195B2 (ja) * 2015-03-02 2020-01-29 セイコーエプソン株式会社 表示装置、表示装置の制御方法、および、プログラム
US9959591B2 (en) * 2014-07-31 2018-05-01 Seiko Epson Corporation Display apparatus, method for controlling display apparatus, and program
JP2016033763A (ja) * 2014-07-31 2016-03-10 セイコーエプソン株式会社 表示装置、表示装置の制御方法、および、プログラム
KR102337509B1 (ko) * 2014-08-29 2021-12-09 삼성전자주식회사 컨텐츠 제공 방법 및 그 전자 장치
US9495004B2 (en) * 2014-09-08 2016-11-15 Qualcomm Incorporated Display device adjustment by control device
JP6501389B2 (ja) * 2014-12-01 2019-04-17 浅田 一憲 ヘッドマウントディスプレイ装置、撮影制御方法、及びプログラム
US10379357B2 (en) * 2015-01-08 2019-08-13 Shai Goldstein Apparatus and method for displaying content
US9811681B2 (en) * 2015-07-28 2017-11-07 Sony Mobile Communications Inc. Method and system for providing access to a device for a user
JP6679856B2 (ja) * 2015-08-31 2020-04-15 カシオ計算機株式会社 表示制御装置、表示制御方法及びプログラム
EP3185535A1 (en) 2015-12-22 2017-06-28 Thomson Licensing Method and apparatus for controlling a discrepant aiming direction of a camera
US10976809B2 (en) * 2016-03-14 2021-04-13 Htc Corporation Interaction method for virtual reality
JP2017182130A (ja) * 2016-03-28 2017-10-05 ソニー株式会社 情報処理装置、情報処理方法、及びプログラム
KR102595020B1 (ko) 2016-05-03 2023-10-27 엘지전자 주식회사 헤드마운티드 디스플레이
EP4345831A3 (en) 2016-07-25 2024-04-24 Magic Leap, Inc. Imaging modification, display and visualization using augmented and virtual reality eyewear
EP3296792B1 (en) * 2016-09-19 2025-05-14 Essilor International Method for managing the display of an image to a user of an optical system
EP3316117A1 (en) * 2016-10-31 2018-05-02 Nokia Technologies OY Controlling content displayed in a display
KR20180065515A (ko) * 2016-12-08 2018-06-18 박순구 다기능 웨어러블 디스플레이 장치
WO2018110434A1 (ja) 2016-12-15 2018-06-21 株式会社ソニー・インタラクティブエンタテインメント 振動デバイス、及び制御システム
WO2018110433A1 (ja) 2016-12-15 2018-06-21 株式会社ソニー・インタラクティブエンタテインメント 情報処理システム、振動制御方法、及びプログラム
US10969867B2 (en) 2016-12-15 2021-04-06 Sony Interactive Entertainment Inc. Information processing system, controller device, controller device control method and program
US11138436B2 (en) 2016-12-29 2021-10-05 Magic Leap, Inc. Automatic control of wearable display device based on external conditions
US10277943B2 (en) * 2017-03-27 2019-04-30 Microsoft Technology Licensing, Llc Selective rendering of sparse peripheral displays based on user movements
JP6833018B2 (ja) * 2017-04-18 2021-02-24 株式会社ソニー・インタラクティブエンタテインメント 振動制御装置
WO2018193514A1 (ja) 2017-04-18 2018-10-25 株式会社ソニー・インタラクティブエンタテインメント 振動制御装置
WO2018193557A1 (ja) 2017-04-19 2018-10-25 株式会社ソニー・インタラクティブエンタテインメント 振動制御装置
JP6757466B2 (ja) 2017-04-26 2020-09-16 株式会社ソニー・インタラクティブエンタテインメント 振動制御装置
JP6976719B2 (ja) * 2017-05-25 2021-12-08 キヤノン株式会社 表示制御装置、表示制御方法及びプログラム
CN108983421B (zh) * 2017-06-02 2022-02-01 富泰华工业(深圳)有限公司 佩戴显示装置
US10687119B2 (en) 2017-06-27 2020-06-16 Samsung Electronics Co., Ltd System for providing multiple virtual reality views
US10578869B2 (en) * 2017-07-24 2020-03-03 Mentor Acquisition One, Llc See-through computer display systems with adjustable zoom cameras
WO2019038887A1 (ja) 2017-08-24 2019-02-28 株式会社ソニー・インタラクティブエンタテインメント 振動制御装置
JP6893561B2 (ja) 2017-08-24 2021-06-23 株式会社ソニー・インタラクティブエンタテインメント 振動制御装置
WO2019043781A1 (ja) 2017-08-29 2019-03-07 株式会社ソニー・インタラクティブエンタテインメント 振動制御装置、振動制御方法、及びプログラム
CN119717285A (zh) 2017-10-26 2025-03-28 奇跃公司 用于增强现实显示器的宽带自适应透镜组件
JP7135413B2 (ja) * 2018-05-07 2022-09-13 セイコーエプソン株式会社 接続装置、表示装置、接続装置の制御方法、及び、表示装置の制御方法
JP2021536592A (ja) 2018-08-31 2021-12-27 マジック リープ, インコーポレイテッドMagic Leap, Inc. 拡張現実デバイスのための空間的に分解された動的調光
WO2020096743A1 (en) 2018-11-09 2020-05-14 Beckman Coulter, Inc. Service glasses with selective data provision
EP3908876B1 (en) 2019-01-11 2025-10-29 Magic Leap, Inc. Time-multiplexed display of virtual content at various depths
EP3956858B1 (en) 2019-04-18 2025-12-03 Beckman Coulter, Inc. Securing data of objects in a laboratory environment
WO2021030328A1 (en) 2019-08-12 2021-02-18 Magic Leap, Inc. Systems and methods for virtual and augmented reality
JP2021089351A (ja) * 2019-12-03 2021-06-10 キヤノン株式会社 頭部装着システム及び情報処理装置
JP7170277B2 (ja) * 2019-12-09 2022-11-14 株式会社辰巳菱機 通報装置
CN111208964B (zh) * 2020-01-13 2023-08-01 宜视智能科技(苏州)有限公司 低视力助视方法、终端及存储介质
EP4181993A4 (en) 2020-07-16 2024-08-07 Ventec Life Systems, Inc. System and method for concentrating gas
EP4182054A4 (en) 2020-07-16 2024-11-06 Ventec Life Systems, Inc. System and method for concentrating gas
US12172121B2 (en) 2020-07-16 2024-12-24 Ventec Life Systems, Inc. System and method for concentrating gas
CN116249569A (zh) 2020-07-16 2023-06-09 英瓦卡尔公司 用于浓缩气体的系统和方法
JP7676174B2 (ja) * 2021-03-24 2025-05-14 キヤノン株式会社 制御装置、表示制御システム、および制御方法
DE102021206565A1 (de) * 2021-06-24 2022-12-29 Siemens Healthcare Gmbh Darstellungsvorrichtung zur Anzeige einer graphischen Darstellung einer erweiterten Realität
US12347555B2 (en) 2021-07-15 2025-07-01 Ventec Life Systems, Inc. System and method for medical device communication

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08126031A (ja) 1994-10-20 1996-05-17 Minolta Co Ltd 方位検出機構付hmd
JPH0927970A (ja) 1995-07-13 1997-01-28 Minolta Co Ltd 映像表示装置
JPH09185009A (ja) 1995-12-28 1997-07-15 Fuji Xerox Co Ltd メガネディスプレイ
JP2002044683A (ja) * 2000-07-19 2002-02-08 Yasushi Haruta 立体画像用ビデオカメラおよび立体画像再生装置
WO2002073535A2 (en) * 2001-03-13 2002-09-19 John Riconda Enhanced display of environmental navigation features to vehicle operator
JP2005172851A (ja) * 2003-12-05 2005-06-30 Sony Corp 画像表示装置

Family Cites Families (48)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0226870U (en) 1988-08-04 1990-02-21
US7310072B2 (en) * 1993-10-22 2007-12-18 Kopin Corporation Portable communication display device
US5978015A (en) 1994-10-13 1999-11-02 Minolta Co., Ltd. Stereoscopic system with convergence and dioptric power adjustments according to object distance
JPH08163526A (ja) 1994-11-30 1996-06-21 Canon Inc 映像選択装置
US5905525A (en) 1995-07-13 1999-05-18 Minolta Co., Ltd. Image display apparatus having a display controlled by user's head movement
US6847336B1 (en) * 1996-10-02 2005-01-25 Jerome H. Lemelson Selectively controllable heads-up display system
JPH1164780A (ja) 1997-08-12 1999-03-05 Olympus Optical Co Ltd 頭部装着型映像表示装置
US6037914A (en) 1997-08-25 2000-03-14 Hewlett-Packard Company Method and apparatus for augmented reality using a see-through head-mounted display
JPH11249588A (ja) 1998-02-27 1999-09-17 Shimadzu Corp 頭部装着型表示装置
CA2270003A1 (en) 1998-05-01 1999-11-01 Sarwat Ghobranios Video camera with eyeglass interface
JPH11341399A (ja) 1998-05-29 1999-12-10 Sony Corp 眼鏡型画像表示装置
JP2000284214A (ja) 1999-03-30 2000-10-13 Suzuki Motor Corp ヘルメット搭載用表示手段制御装置
JP2001356836A (ja) 2000-06-16 2001-12-26 Toshiba Corp コンピュータシステム、その画面保護の制御方法及び記憶媒体
GB0109720D0 (en) 2001-04-20 2001-06-13 Koninkl Philips Electronics Nv Display apparatus and image encoded for display by such an apparatus
CN1156248C (zh) * 2001-07-13 2004-07-07 清华大学 运动图像的人脸特征检测方法
GB2378075A (en) 2001-07-27 2003-01-29 Hewlett Packard Co Method and apparatus for transmitting images from head mounted imaging device.
JP3893983B2 (ja) 2002-01-17 2007-03-14 ソニー株式会社 情報提供装置及び情報提供方法、記憶媒体、並びにコンピュータ・プログラム
JP2003244728A (ja) 2002-02-15 2003-08-29 Mitsubishi Heavy Ind Ltd 仮想映像作成装置及び仮想映像作成方法
US7204425B2 (en) 2002-03-18 2007-04-17 Precision Dynamics Corporation Enhanced identification appliance
JP2004023692A (ja) * 2002-06-20 2004-01-22 Hitachi Ltd 撮影装置、撮影制限システム、撮影警告システム、及び撮影制限解除システム
JP3988632B2 (ja) 2002-11-28 2007-10-10 日本電気株式会社 眼鏡型ディスプレイの制御方法
FR2848304A1 (fr) 2002-12-10 2004-06-11 Ingineo Nouveau systeme audiovisuel mobile de gestion d'information
JP4133297B2 (ja) 2002-12-19 2008-08-13 シャープ株式会社 カメラシステム
SE0203908D0 (sv) * 2002-12-30 2002-12-30 Abb Research Ltd An augmented reality system and method
US7495638B2 (en) * 2003-05-13 2009-02-24 Research Triangle Institute Visual display with increased field of view
CN101127834B (zh) 2003-05-20 2011-04-06 松下电器产业株式会社 摄像系统
US7369101B2 (en) * 2003-06-12 2008-05-06 Siemens Medical Solutions Usa, Inc. Calibrating real and virtual views
JP3968522B2 (ja) * 2003-10-06 2007-08-29 ソニー株式会社 記録装置、及び記録方法
JP4801340B2 (ja) 2003-10-28 2011-10-26 株式会社半導体エネルギー研究所 表示装置
US8884845B2 (en) * 2003-10-28 2014-11-11 Semiconductor Energy Laboratory Co., Ltd. Display device and telecommunication system
JP4206036B2 (ja) 2003-12-09 2009-01-07 株式会社ゼンリン 電子地図データを利用した風景画像の撮像位置の特定
JP2005269010A (ja) 2004-03-17 2005-09-29 Olympus Corp 画像生成装置、画像生成プログラム、及び画像生成方法
WO2005088970A1 (ja) 2004-03-11 2005-09-22 Olympus Corporation 画像生成装置、画像生成方法、および画像生成プログラム
JP4605152B2 (ja) * 2004-03-12 2011-01-05 株式会社ニコン 画像表示光学系及び画像表示装置
CN1922651A (zh) * 2004-06-10 2007-02-28 松下电器产业株式会社 穿戴型信息提示装置
JP2006050265A (ja) 2004-08-04 2006-02-16 Sony Corp アンテナモジュール用磁芯部材、アンテナモジュールおよびこれを備えた携帯情報端末
JP2006050285A (ja) 2004-08-05 2006-02-16 Matsushita Electric Ind Co Ltd 携帯端末
JP2006067139A (ja) 2004-08-25 2006-03-09 Matsushita Electric Ind Co Ltd 複数カメラ映像検索装置、複数カメラ映像検索方法、及び複数カメラ映像検索プログラム
JP4720167B2 (ja) 2004-12-03 2011-07-13 株式会社ニコン 電子カメラおよびプログラム
JP5040061B2 (ja) 2004-12-15 2012-10-03 コニカミノルタホールディングス株式会社 映像表示装置及び情報提供システム
US20080136916A1 (en) * 2005-01-26 2008-06-12 Robin Quincey Wolff Eye tracker/head tracker/camera tracker controlled camera/weapon positioner control system
JP2006208687A (ja) 2005-01-27 2006-08-10 Konica Minolta Photo Imaging Inc ヘッドマウントディスプレイの映像調節システム
JP2006208867A (ja) 2005-01-31 2006-08-10 Three Bond Co Ltd プロジェクター用液晶パネルの製造方法
US20080122931A1 (en) * 2006-06-17 2008-05-29 Walter Nicholas Simbirski Wireless Sports Training Device
JP5119636B2 (ja) 2006-09-27 2013-01-16 ソニー株式会社 表示装置、表示方法
JP5228307B2 (ja) * 2006-10-16 2013-07-03 ソニー株式会社 表示装置、表示方法
JP2008096868A (ja) 2006-10-16 2008-04-24 Sony Corp 撮像表示装置、撮像表示方法
JP4961984B2 (ja) 2006-12-07 2012-06-27 ソニー株式会社 画像表示システム、表示装置、表示方法

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08126031A (ja) 1994-10-20 1996-05-17 Minolta Co Ltd 方位検出機構付hmd
JPH0927970A (ja) 1995-07-13 1997-01-28 Minolta Co Ltd 映像表示装置
JPH09185009A (ja) 1995-12-28 1997-07-15 Fuji Xerox Co Ltd メガネディスプレイ
JP2002044683A (ja) * 2000-07-19 2002-02-08 Yasushi Haruta 立体画像用ビデオカメラおよび立体画像再生装置
WO2002073535A2 (en) * 2001-03-13 2002-09-19 John Riconda Enhanced display of environmental navigation features to vehicle operator
JP2005172851A (ja) * 2003-12-05 2005-06-30 Sony Corp 画像表示装置

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP2081182A4

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20170043252A1 (en) * 2008-10-24 2017-02-16 Excalibur Ip, Llc Reconfiguring reality using a reality overlay device
US12311261B2 (en) 2008-10-24 2025-05-27 Samsung Electronics Co., Ltd. Reconfiguring reality using a reality overlay device
WO2015149612A1 (en) * 2014-04-01 2015-10-08 Beijing Zhigu Rui Tuo Tech Co., Ltd Image presentation control methods and image presentation control apparatuses
US10643334B2 (en) 2014-04-01 2020-05-05 Beijing Zhigu Rui Tuo Tech Co., Ltd Image presentation control methods and image presentation control apparatuses
WO2016157677A1 (ja) * 2015-03-31 2016-10-06 ソニー株式会社 情報処理装置、情報処理方法及びプログラム
US10559065B2 (en) 2015-03-31 2020-02-11 Sony Corporation Information processing apparatus and information processing method
WO2023173323A1 (zh) * 2022-03-16 2023-09-21 深圳市大疆创新科技有限公司 无人机的功耗控制方法、装置、系统及存储介质

Also Published As

Publication number Publication date
CN101542584A (zh) 2009-09-23
TW200834536A (en) 2008-08-16
US20100085462A1 (en) 2010-04-08
EP2081182B1 (en) 2016-03-30
EP2081182A4 (en) 2011-08-24
EP2081182A1 (en) 2009-07-22
JP5228307B2 (ja) 2013-07-03
KR20090069335A (ko) 2009-06-30
JP2008096867A (ja) 2008-04-24
TWI428903B (zh) 2014-03-01
US8681256B2 (en) 2014-03-25
US20180067313A1 (en) 2018-03-08
US20150268471A1 (en) 2015-09-24
CN103399403B (zh) 2016-10-05
CA2666692C (en) 2018-01-02
US9182598B2 (en) 2015-11-10
CA2666692A1 (en) 2008-04-24
CN103399403A (zh) 2013-11-20
US9846304B2 (en) 2017-12-19
US20140152688A1 (en) 2014-06-05

Similar Documents

Publication Publication Date Title
JP5228307B2 (ja) 表示装置、表示方法
CN101520690B (zh) 图像获取和显示设备以及图像获取和显示方法
US9772686B2 (en) Imaging display apparatus and method
US7855743B2 (en) Image capturing and displaying apparatus and image capturing and displaying method
JP6137113B2 (ja) 表示装置、表示方法、プログラム
JP2008083289A (ja) 撮像表示装置、撮像表示方法
JP5664677B2 (ja) 撮像表示装置、撮像表示方法
JP2013174898A (ja) 撮像表示装置、撮像表示方法
JP5971298B2 (ja) 表示装置、表示方法
JP2013083994A (ja) 表示装置、表示方法

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 200780043163.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 07830131

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2666692

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 2007830131

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 1020097010005

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 12445477

Country of ref document: US