WO2010021373A1 - Image display device, control method, and computer program - Google Patents
Image display device, control method, and computer program
- Publication number
- WO2010021373A1 (PCT/JP2009/064625)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- image
- face
- face detection
- unit
- image display
- Prior art date
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42201—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] biosensors, e.g. heat sensor for presence detection, EEG sensors or any limb activity sensors worn by the user
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/63—Generation or supply of power specially adapted for television receivers
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3206—Monitoring of events, devices or parameters that trigger a change in power modality
- G06F1/3231—Monitoring the presence, absence or movement of users
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F1/00—Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
- G06F1/26—Power supply means, e.g. regulation thereof
- G06F1/32—Means for saving power
- G06F1/3203—Power management, i.e. event-based initiation of a power-saving mode
- G06F1/3234—Power saving characterised by the action undertaken
- G06F1/325—Power saving in peripheral device
- G06F1/3265—Power saving in display device
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/06—Adjustment of display parameters
- G09G2320/0626—Adjustment of display parameters for control of overall brightness
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2330/00—Aspects of power supply; Aspects of display protection and defect management
- G09G2330/02—Details of power systems and of start or stop of display operation
- G09G2330/021—Power management, e.g. power saving
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
- G09G3/34—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters by control of light from an independent source
- G09G3/3406—Control of illumination source
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/42202—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS] environmental sensors, e.g. for detecting temperature, luminosity, pressure, earthquakes
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/441—Acquiring end-user identification, e.g. using personal code sent by the remote control or by inserting a card
- H04N21/4415—Acquiring end-user identification, e.g. using personal code sent by the remote control or by inserting a card using biometric characteristics of the user, e.g. by voice recognition or fingerprint scanning
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/443—OS processes, e.g. booting an STB, implementing a Java virtual machine in an STB or power management in an STB
- H04N21/4436—Power management, e.g. shutting down unused components of the receiver
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/45—Management operations performed by the client for facilitating the reception of or the interaction with the content or administrating data related to the end-user or to the client device itself, e.g. learning user preferences for recommending movies, resolving scheduling conflicts
- H04N21/4508—Management of client data or end-user data
- H04N21/4532—Management of client data or end-user data involving end-user characteristics, e.g. viewer profile, preferences
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
- H04N5/57—Control of contrast or brightness
- H04N5/58—Control of contrast or brightness in dependence upon ambient light
-
- Y—GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
- Y02—TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
- Y02D—CLIMATE CHANGE MITIGATION TECHNOLOGIES IN INFORMATION AND COMMUNICATION TECHNOLOGIES [ICT], I.E. INFORMATION AND COMMUNICATION TECHNOLOGIES AIMING AT THE REDUCTION OF THEIR OWN ENERGY USE
- Y02D10/00—Energy efficient computing, e.g. low power processors, power management or thermal management
Definitions
- the present invention relates to an image display device, a control method, and a computer program.
- the present invention has been made in view of the above problems, and an object of the present invention is to provide a new and improved image display device, control method, and computer program capable of detecting a person's viewing situation from a captured image and automatically controlling the operation state inside the device according to the detected viewing situation.
- according to an aspect of the present invention, there is provided an image display device including: an image display unit that displays a still image or a moving image; an imaging unit that performs imaging in the direction in which the image display unit displays the still image or moving image; and a face detection unit that performs, at a predetermined interval, face detection processing for detecting a user's face included in an image captured by the imaging unit. The predetermined interval is variable depending on whether or not the user's face is included in the image captured by the imaging unit, and if the position of the whole or a part of a detected face does not change for a predetermined time during the face detection process, the face detection unit does not detect that face as the user's face.
- the image display unit displays a still image or a moving image
- the imaging unit performs shooting in a direction in which the image display unit displays a still image or a moving image.
- the face detection unit performs face detection processing for detecting a face included in the image captured by the imaging unit at a predetermined interval.
- the predetermined interval is variable depending on whether or not the user's face is included in the image captured by the imaging unit, and if the position of the detected face does not change for a predetermined time during the face detection process, the face detection unit does not detect the face as the user's face.
- the cost and power required for the face detection process can be reduced by changing the interval of the face detection process in the face detection unit depending on whether or not the user's face is included in the image.
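The variable-interval behavior described above can be sketched in code. The following is a purely illustrative Python sketch, not the patent's implementation: the interval values, the `static_limit` threshold, and the `detect_fn` callback are all made-up assumptions.

```python
class AdaptiveFaceDetector:
    """Illustrative sketch of the variable detection interval: poll often
    while a face is present, rarely when none has been seen, and ignore a
    face whose position never changes (e.g. a poster or photograph).
    All numeric values are made-up examples, not taken from the patent."""

    def __init__(self, detect_fn, interval_present=0.5, interval_absent=5.0,
                 static_limit=3):
        self.detect_fn = detect_fn            # frame -> list of face boxes
        self.interval_present = interval_present
        self.interval_absent = interval_absent
        self.static_limit = static_limit      # identical positions tolerated
        self.last_pos = None
        self.static_count = 0

    def step(self, frame):
        faces = self.detect_fn(frame)
        if faces:
            pos = faces[0]
            self.static_count = self.static_count + 1 if pos == self.last_pos else 0
            self.last_pos = pos
            # A face whose whole position stays fixed for a predetermined
            # time is not treated as the user's face.
            if self.static_count >= self.static_limit:
                faces = []
        else:
            self.last_pos = None
            self.static_count = 0
        interval = self.interval_present if faces else self.interval_absent
        return faces, interval
```

Feeding the same bounding box repeatedly eventually yields an empty face list and the long interval, modeling both the cost-saving interval change and the "unchanging face is not the user" rule.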
- the control unit may detect the brightness of the room in which the device is installed using the image captured by the imaging unit, and may not reduce the luminance of the image display unit if the brightness of the room does not reach a predetermined value.
- the control unit may increase the luminance of the image display unit when the user's face is included in the image captured by the imaging unit after the luminance of the image display unit is decreased.
- the control unit may omit part or all of the image processing for the image displayed on the image display unit when a predetermined time has elapsed since the luminance of the image display unit was decreased.
- when the control unit determines that the user's face is included in the image captured by the imaging unit but the user's eyes are closed, the control unit may reduce the luminance of the image display unit after a predetermined time has elapsed since the determination.
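The luminance rules in the preceding points can be condensed into one decision function. This is a minimal sketch under assumed values: the function name `backlight_action`, the `brightness_threshold`, and the `eyes_closed_delay` are hypothetical, not from the patent.

```python
def backlight_action(room_brightness, face_present, eyes_closed,
                     seconds_since_eyes_closed,
                     brightness_threshold=50, eyes_closed_delay=60):
    """Return 'full' or 'dimmed' for the display backlight.
    Threshold values are arbitrary examples."""
    if face_present and not eyes_closed:
        return "full"      # a viewer is watching: keep or restore luminance
    if room_brightness < brightness_threshold:
        # Dark room: the display may double as room lighting, so do not dim.
        return "full"
    if eyes_closed and seconds_since_eyes_closed < eyes_closed_delay:
        return "full"      # wait a predetermined time before dimming
    return "dimmed"

# A viewer with open eyes keeps full luminance; an absent viewer in a
# bright room triggers dimming; a dark room never dims.
action_viewing = backlight_action(200, True, False, 0)
action_absent = backlight_action(200, False, False, 0)
action_dark_room = backlight_action(10, False, False, 0)
```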
- a control unit that executes a control operation related to a still image or a moving image according to the result of the face detection process of the face detection unit may be provided, and the control unit may change the range of the image to be displayed on the image display unit according to the position of the face detected by the face detection unit and the distance of the face from the imaging unit.
- a control unit that executes a control operation related to a still image or a moving image according to the result of the face detection process of the face detection unit may be provided, and the control unit may select the image to be displayed on the image display unit according to the situation of the face detected by the face detection unit.
- a control unit that executes a control operation related to a still image or a moving image according to the result of the face detection process of the face detection unit may be provided, and the control unit may control whether or not image quality enhancement processing is executed on the image displayed on the image display unit according to the presence or absence of the face detected by the face detection unit.
- a control unit that executes a control operation related to a still image or a moving image according to the result of the face detection process of the face detection unit may be provided, and the control unit may automatically and sequentially switch the channel displayed on the image display unit according to the presence or absence of the face detected by the face detection unit.
- an audio output unit that outputs sound and a control unit that executes a control operation related to sound according to the result of the face detection process of the face detection unit may be provided, and the control unit may control the volume of the sound output from the audio output unit according to the situation of the face detected by the face detection unit.
- the control unit may calculate the area the face would have when facing the front direction from the face direction and area detected by the face detection unit, and may control the volume of the sound output from the audio output unit using the calculation result.
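The frontal-area calculation above can be illustrated numerically. The patent only states that the frontal-equivalent area is computed from the detected direction and area and used for volume control; the cosine correction and the specific volume mapping below are illustrative assumptions, as are all parameter names and values.

```python
import math

def frontal_face_area(detected_area, yaw_degrees):
    """Estimate the area the face would have if facing the camera squarely.
    A face turned sideways projects a smaller area, roughly by cos(yaw);
    the 0.1 floor avoids division blow-up near 90 degrees."""
    return detected_area / max(math.cos(math.radians(yaw_degrees)), 0.1)

def volume_for_face(detected_area, yaw_degrees, reference_area=10000.0,
                    base_volume=20.0):
    """Hypothetical mapping: a smaller frontal area suggests a more distant
    viewer, so raise the volume (clamped to 0-100)."""
    area = frontal_face_area(detected_area, yaw_degrees)
    volume = base_volume * math.sqrt(reference_area / area)
    return max(0.0, min(100.0, volume))

# A face at the reference size gets the base volume; a quarter-area face
# (roughly twice as far away) gets double; a turned face is corrected so
# its volume matches a frontal face of the same true size.
near = volume_for_face(10000, 0)
far = volume_for_face(2500, 0)
turned = volume_for_face(5000, 60)
```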
- a control unit that executes a control operation related to a still image or a moving image according to the result of the face detection process of the face detection unit may be provided, and when it is determined, as a result of the face detection process in the face detection unit, that the face detected by the face detection unit has reacted to the image displayed on the image display unit, the control unit may set a mark on the image.
- the control unit may set a chapter as a mark when the image displayed on the image display unit is a moving image.
- the control unit may set the chapter at a time point that is a predetermined time later than the time point when the face detected by the face detection unit has reacted.
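The reaction-based chapter marking above can be sketched as a small helper. This is an illustrative sketch; the class name `ReactionMarker` and the 5-second offset are made-up examples of the "predetermined time" the description mentions.

```python
class ReactionMarker:
    """Collect chapter marks while a moving image plays: when the viewer's
    face is judged to have reacted, set a chapter at a point a predetermined
    time after the reaction (offset_s = 5.0 is an arbitrary example)."""

    def __init__(self, offset_s=5.0):
        self.offset_s = offset_s
        self.chapters = []          # chapter time points, in seconds

    def on_reaction(self, playback_time_s):
        chapter_time = playback_time_s + self.offset_s
        self.chapters.append(chapter_time)
        return chapter_time

# A reaction detected at 120 s of playback yields a chapter at 125 s.
marker = ReactionMarker()
marker.on_reaction(120.0)
```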
- further, there is provided a control method including: an imaging step in which an image display unit that displays a still image or a moving image performs imaging in the direction in which the still image or moving image is displayed; and a face detection step of performing, at a predetermined interval, face detection processing for detecting a user's face included in the image captured in the imaging step. The predetermined interval is variable depending on whether or not the user's face is included in the image captured in the imaging step, and in the face detection step, if the position of the whole or a part of a detected face does not change for a predetermined time during the face detection process, the face is not detected as the user's face.
- as described above, according to the present invention, an image display device, a control method, and a computer program capable of detecting a person's viewing situation from a captured image and automatically controlling the internal operation state according to the detected viewing situation can be provided.
- FIG. 6 is an explanatory diagram illustrating, as a graph, an example of the transition of the luminance of the backlight in the image display device 100 according to the embodiment of the present invention.
- FIG. 7 is an explanatory diagram illustrating another example of the transition of the luminance of the backlight in the image display device 100 according to the embodiment of the present invention.
- FIG. 8 is an explanatory diagram showing, as a graph, an example of the transition of power consumption in the image display apparatus 100 according to an embodiment of the present invention.
- FIG. 9 is a flowchart for explaining a modification of the face detection process in the image display apparatus 100 according to the embodiment of the present invention.
- FIG. 10 is an explanatory diagram illustrating another modification of the face detection process in the image display apparatus 100 according to the embodiment of the present invention.
- the image display apparatus 100 includes an image input unit 104 that captures a moving image, provided at the upper center portion of a display panel 102 that displays a still image or a moving image.
- the image input unit 104 captures a moving image in a direction in which the image display apparatus 100 displays a still image or a moving image on the display panel 102.
- the image display apparatus 100 analyzes an image captured by the image input unit 104 and detects a user's face shown in the image.
- the image display device 100 is characterized in that the internal operation state is changed depending on whether or not the user's face is included in the moving image captured by the image input unit 104.
- the image display apparatus 100 includes the image input unit 104 that captures a moving image in the upper center portion of the image display panel 102.
- however, the position at which the image input unit that captures a moving image is provided is not limited to this example.
- a device different from the image display device 100 may be provided, the device may be connected to the image display device 100, and a moving image may be captured by the device.
- the number of image input units is not limited to one, and two or more image input units may be provided for imaging.
- the image display apparatus 100 includes a display panel 102, an image input unit 104, a face detection unit 106, a power control unit 108, an SDRAM (Synchronous Dynamic Random Access Memory) 110, a flash ROM 112, a CPU 114, a remote controller light receiving unit 116, a network I/F 118, a network terminal 120, a terrestrial tuner 122, a digital tuner 124, an audio A/D conversion circuit 126, a video decoder 128, an MPEG decoder 130, an audio signal processing circuit 132, a video signal processing circuit 134, an HDMI (High-Definition Multimedia Interface) receiver 136, an HDMI terminal 138, an audio amplification circuit 140, a speaker 142, a graphic generation circuit 144, and a panel drive circuit 146.
- the terrestrial tuner 122 receives a broadcast wave signal transmitted from a terrestrial antenna (not shown) that receives analog terrestrial waves, and demodulates the video signal and audio signal included in the broadcast wave signal into baseband signals. Of the baseband signals demodulated by the terrestrial tuner 122, the audio signal is sent to the audio A/D conversion circuit 126 and the video signal is sent to the video decoder 128.
- the digital tuner 124 receives a broadcast wave signal transmitted from a digital antenna (not shown) that receives a digital broadcast, and converts the received broadcast wave signal into MPEG2-TS (MPEG2 Transport Stream).
- the audio A / D conversion circuit 126 receives the audio signal demodulated by the terrestrial tuner 122 and converts it from an analog audio signal to a digital audio signal. When the audio A / D conversion circuit 126 converts an analog audio signal into a digital audio signal, the converted digital audio signal is sent to the audio signal processing circuit 132.
- the MPEG decoder 130 receives the MPEG2-TS sent from the digital tuner 124, converts the audio in the MPEG2-TS into a digital audio signal, and converts the video in the MPEG2-TS into a digital component signal.
- the converted digital audio signal is sent to the audio signal processing circuit 132
- the converted digital component signal is sent to the video signal processing circuit 134.
- the video signal processing circuit 134 receives a digital component signal sent from the video decoder 128 or the MPEG decoder 130 and performs signal processing on the digital component signal. When the video signal processing circuit 134 performs signal processing on the digital component signal, the digital component signal is sent to the graphic generation circuit 144.
- the audio amplifier circuit 140 receives the audio signal output from the audio signal processing circuit 132, amplifies it by a predetermined amount, and outputs it.
- the amount of amplification in the audio amplifier circuit 140 is in accordance with the volume instructed by the user of the image display device 100.
- the audio signal amplified by the audio amplifier circuit 140 is sent to the speaker 142.
- the speaker 142 outputs a sound based on the sound signal sent from the sound amplifier circuit 140.
- the panel drive circuit 146 generates, from the video signal sent from the graphic generation circuit 144, a panel drive signal necessary for displaying an image on the display panel 102.
- the panel drive signal generated by the panel drive circuit 146 is sent to the display panel 102, and the display panel 102 operates according to the panel drive signal, so that an image is displayed on the display panel 102.
- the display panel 102 displays a moving image based on the panel drive signal sent from the panel drive circuit 146.
- the display panel 102 displays a moving image using liquid crystal.
- the image input unit 104 is provided at the upper center portion of the display panel 102, and when a panel drive signal is supplied to the display panel 102 and a moving image is displayed on the display panel 102, the image input unit 104 captures a moving image in the direction in which the display panel 102 displays the moving image.
- the image input unit 104 may capture a moving image with a CCD (Charge Coupled Device) image sensor, or may capture a moving image with a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
- a moving image captured by the image input unit 104 is sent to the face detection unit 106.
- the face detection unit 106 inputs a moving image sent from the image input unit 104 and detects a user's face included in the moving image.
- the detection result of the user's face in the face detection unit 106 is sent to the power control unit 108, the CPU 114, and the like, and is used to control the operation of the image display apparatus 100.
- the operation of the image display apparatus 100 using the detection result of the user's face in the face detection unit 106 will be described in detail later.
- the SDRAM 110 is a temporary work area when the computer program for controlling each part of the image display device 100 is executed by the CPU 114.
- the flash ROM 112 is a ROM that stores a computer program for the CPU 114 to control each unit of the image display apparatus 100.
- the CPU 114 reads out a computer program stored in the flash ROM 112 for controlling each unit of the image display device 100 and sequentially executes the computer program.
- the remote controller light receiving unit 116 receives a signal transmitted from the remote controller 200.
- the signal received by the remote controller light receiving unit 116 is input to the CPU 114, and the control code included in the signal is decoded by the CPU 114.
- the CPU 114 controls each part of the image display device 100 so as to perform operations (volume adjustment, channel setting, menu screen display, etc.) according to the control code.
- the network I / F 118 receives a signal input from the network terminal 120 via the network, and transmits a signal to the network via the network terminal 120.
- when a signal input from the network terminal 120 via the network is received by the network I/F 118, the received signal is sent to the CPU 114.
- the CPU 114 analyzes a signal received by the network I / F 118 and controls each unit of the image display apparatus 100 so as to perform an operation according to the signal.
- next, an operation for determining whether or not the image captured by the image input unit 104 includes the user's face will be described. The determination is performed by the face detection unit 106, and the power control unit 108 controls the power of the image display device 100 depending on whether or not the user's face is included in the image captured by the image input unit 104.
- if the result of the determination by the face detection unit 106 in step S101 is that the user is closer than a predetermined distance, the user is excluded from the control by the power control unit 108, the process returns to step S101, and the determination processing of the face detection unit 106 is repeated.
- if the result of the determination by the face detection unit 106 in step S101 is that the user is farther than the predetermined distance, it is detected whether the brightness of the captured image has reached a predetermined brightness, that is, whether the brightness of the room in which the image display apparatus 100 is installed has reached a predetermined brightness (step S102). For example, when the image input unit 104 is configured with a CCD image sensor, the average value of the image is acquired by the CCD image sensor, and the CPU 114 may determine whether the acquired average value is equal to or greater than a predetermined value.
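The step S102 check reduces to averaging the captured luminance and comparing it with a predetermined value. A minimal sketch, with a made-up threshold and a simple list of luminance samples standing in for the sensor output:

```python
def room_is_bright(frame_pixels, threshold=64):
    """Sketch of the step S102 brightness check: average the captured
    luminance values (0-255 samples) and compare with a predetermined
    value. The threshold of 64 is an arbitrary example."""
    pixels = list(frame_pixels)
    average = sum(pixels) / len(pixels)
    return average >= threshold

# A well-lit room passes the check; a dark room does not, so the
# backlight-dimming path is skipped.
bright = room_is_bright([200] * 100)
dark = room_is_bright([10] * 100)
```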
- if the brightness of the room in which the image display apparatus 100 is installed has not reached the predetermined brightness, processing to reduce the brightness of the backlight of the display panel 102 is not performed. This is because, when the room is dark, the light of the image display apparatus 100 may be being used as room lighting, and reducing the brightness of the backlight would darken the entire room and could inconvenience the user.
- the present invention is not limited to this example.
- when the processing for detecting whether or not the brightness of the room has reached the predetermined brightness has been performed in step S102, an image is subsequently input from the image input unit 104 to the face detection unit 106 (step S103), and the face detection unit 106 executes the user's face detection process.
- for the detection process of the user's face (hereinafter also simply referred to as the "face detection process") in the face detection unit 106, for example, the techniques disclosed in Japanese Patent Application Laid-Open Nos. 2007-65766 and 2005-44330 can be used. The face detection process is briefly described below.
- the position of the face, the size of the face, and the direction of the face in the supplied image are detected.
- the face image portion can be cut out from the image.
- a characteristic part of the face for example, a characteristic part such as eyebrows, eyes, nose, or mouth is detected from the cut face image and face direction information.
- AAM Active Appearance Models
- a local feature amount is calculated for each detected face feature position.
- the face can be identified from the image input from the image input unit 104.
- for these processes, the techniques disclosed in Japanese Patent Application Laid-Open No. 2007-65766 and Japanese Patent Application Laid-Open No. 2005-44330 can be used, and detailed description thereof is omitted here. It is also possible to determine, based on the face image and the face feature positions, whether the face shown in the supplied image is that of a man, a woman, an adult, or a child.
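The detection flow just described (locate the face, cut out the face region, find characteristic parts, compute local feature amounts) can be sketched as a pipeline skeleton. The concrete algorithms are those of the cited publications; the `Face` dataclass and the three callables below are hypothetical stand-ins for them.

```python
from dataclasses import dataclass
from typing import Callable, List, Optional

@dataclass
class Face:
    x: int
    y: int
    width: int
    height: int
    yaw_degrees: float = 0.0   # face direction

def detect_user_face(image,
                     find_faces: Callable,
                     find_feature_parts: Callable,
                     compute_local_features: Callable) -> Optional[dict]:
    """Skeleton of the described flow, with pluggable detector stages."""
    faces: List[Face] = find_faces(image)      # position, size, direction
    if not faces:
        return None                            # face detection error
    face = faces[0]
    # Cut out the face image and locate characteristic parts
    # (eyebrows, eyes, nose, mouth) using the direction information.
    parts = find_feature_parts(image, face)
    # Calculate a local feature amount at each detected feature position,
    # which allows the face to be identified.
    features = compute_local_features(image, parts)
    return {"face": face, "parts": parts, "features": features}
```

With stub stages, a detected face yields a result dictionary and an empty detection yields `None`, matching the error branch of step S104.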
- in this way, the face detection unit 106 can perform the face detection process on the image captured by and input from the image input unit 104.
- next, it is determined whether a face detection error has occurred, that is, whether any user's face exists in the image input from the image input unit 104 (step S104).
- if a face detection error has occurred, it is subsequently determined whether or not the error has continued for a predetermined time (for example, 1 minute) (step S105).
- if the result of the determination in step S105 is that the face detection error has continued in the face detection unit 106 for the predetermined time, the power control unit 108 shifts the operation mode of the image display device 100 to the power saving mode (step S106).
- when the power control unit 108 has shifted the operation mode of the image display apparatus 100 to the power saving mode in step S106, the process returns to step S101 and the distance determination is performed.
- the power saving mode in this embodiment means an operation mode in which, in order to reduce the power consumption of the image display device 100, the number of operating parts of the image display device 100 is reduced or, when the display panel 102 is equipped with a backlight, the luminance of the backlight is lowered.
- a plurality of power saving modes may be provided.
- for example, as a first step, processing for reducing the luminance of the backlight of the display panel 102 may be performed, and as a next step, processing for stopping a circuit for enhancing the image quality of a moving image or a circuit for increasing the frame rate may be performed.
- if the result of the determination in step S105 is that the face detection error has not continued in the face detection unit 106 for the predetermined time, the process returns to step S101 and the distance determination is performed.
- the display panel 102 has a backlight
- the luminance of the image displayed on the display panel 102 is lowered by reducing the luminance of the backlight of the display panel 102, and the power consumption of the image display device 100 is suppressed.
- even when no face is detected, the sound emitted from the speaker 142 may still be being listened to, so the audio output may be continued without being stopped. If the power saving mode is maintained for a long time (for example, 30 minutes to 1 hour) after the shift to the power saving mode, it is determined that the sound emitted from the speaker 142 is no longer being listened to; the operation of the audio-related circuits may then be stopped in order to stop the output from the speaker 142, or the power supply of the image display device 100 itself may be turned off.
- further, when the face detection process by the face detection unit 106 indicates that a person's eyes remain closed, it may be determined that the person is asleep and is not viewing the image displayed on the display panel 102, and the power control unit 108 may shift to the power saving mode.
- on the other hand, when one or more users' faces exist in the image input from the image input unit 104 and no face detection error has occurred in the face detection unit 106, that is, when the image displayed on the display panel 102 is being viewed, the power control unit 108 determines whether or not the operation mode of the image display device 100 at that time is the power saving mode (step S107).
- as a result of the determination in step S107, when it is determined that the operation mode of the image display apparatus 100 is not the power saving mode, the process returns to step S101 and the distance determination is performed again.
- on the other hand, when it is determined that the operation mode is the power saving mode, a process for recovering from the power saving mode is executed (step S108).
- the process of recovering from the power saving mode refers to, for example, raising again the luminance of the backlight of the display panel 102 that had been reduced, or increasing again the number of operating parts of the image display apparatus 100 that had been decreased. When the process for recovering from the power saving mode has been performed, the flow returns to step S101 and the distance determination is performed again.
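- the loop of steps S104 to S108 described above can be sketched as follows; this is a minimal illustration in Python, and the function name, mode labels, and the way the error duration is passed in are assumptions for illustration only, not part of the embodiment:

```python
NORMAL, POWER_SAVE = "normal", "power_save"

def update_mode(mode, face_detected, error_duration_s, error_threshold_s=60.0):
    """One pass of the steps S104-S108 loop: shift to the power saving
    mode once a face detection error has persisted for error_threshold_s
    (1 minute in the text), and recover as soon as a face is found again."""
    if not face_detected:                         # face detection error (S104)
        if error_duration_s >= error_threshold_s:  # persisted long enough? (S105)
            return POWER_SAVE                      # shift to power saving (S106)
        return mode                                # keep waiting
    if mode == POWER_SAVE:                         # face found while saving (S107)
        return NORMAL                              # recover (S108)
    return mode
```

In the embodiment this decision is driven by the face detection unit 106 and acted on by the power control unit 108; here both are collapsed into one pure function for clarity.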
- the operation of the image display device 100 according to the embodiment of the present invention has been described above with reference to FIG. 3. Next, the transition process to the power saving mode in step S106 of FIG. 3 will be described in more detail.
- FIG. 4 is a flowchart for explaining the transition process to the power saving mode in the image display apparatus 100 according to the embodiment of the present invention.
- the transition process to the power saving mode in the image display apparatus 100 according to the embodiment of the present invention will be described in more detail with reference to FIG.
- the power control unit 108 performs control so as to start reducing the luminance of the backlight of the display panel 102 (step S111).
- subsequently, the power control unit 108 determines whether or not a predetermined time T1 has elapsed since it started reducing the luminance of the backlight of the display panel 102 (step S112).
- as a result of the determination in step S112, if the predetermined time T1 has not yet elapsed since the start of reducing the luminance of the backlight of the display panel 102, the process returns to step S111 and the control to reduce the luminance of the backlight continues.
- on the other hand, if the predetermined time T1 has elapsed since the start of reducing the luminance of the backlight of the display panel 102, the power control unit 108 shifts the operation mode of the image display apparatus 100 to the power saving mode (1) (step S113).
- when the power control unit 108 shifts the operation mode of the image display apparatus 100 to the power saving mode (1) in step S113, it next determines whether or not a predetermined time T2 has elapsed since the shift to the power saving mode (1) (step S114).
- as a result of the determination in step S114, if the predetermined time T2 has not yet elapsed since the operation mode of the image display apparatus 100 shifted to the power saving mode (1), the power control unit 108 continues the determination in step S114 until the time T2 elapses. On the other hand, if the predetermined time T2 has elapsed since the shift to the power saving mode (1), the power control unit 108 shifts the operation mode of the image display device 100 to the power saving mode (2) (step S115).
- the operation of the video signal processing circuit 134 for converting the frame rate of a moving image displayed on the display panel 102 is partially or completely stopped by the power control unit 108.
- the power consumption of the image display device 100 can be changed in stages by controlling the internal operation of the image display device 100 according to the passage of time.
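- this staged control (dim the backlight, enter power saving mode (1) after T1, then power saving mode (2), where the frame rate conversion circuit stops, after a further T2) can be sketched as follows; the Python function and the concrete T1/T2 values are illustrative assumptions:

```python
def power_stage(t_since_dimming_s, t1=5.0, t2=30.0):
    """Map the time elapsed since the backlight began dimming to a stage:
    0 = dimming in progress (step S111),
    1 = power saving mode (1), reached after T1 (step S113),
    2 = power saving mode (2), reached after a further T2 (step S115),
        where e.g. frame rate conversion is stopped.
    The values of t1 and t2 are placeholders, not figures from the text."""
    if t_since_dimming_s < t1:
        return 0
    if t_since_dimming_s < t1 + t2:
        return 1
    return 2
```

Staging the reductions this way lets power consumption fall gradually with elapsed time, as the text describes, instead of in one step.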
- the transition process to the power saving mode in the image display apparatus 100 according to the embodiment of the present invention has been described.
- the recovery process from the power saving mode in the image display apparatus 100 according to the embodiment of the present invention will be described in more detail.
- FIG. 5 is a flowchart for explaining recovery processing from the power saving mode in the image display apparatus 100 according to the embodiment of the present invention.
- the recovery process from the power saving mode in the image display apparatus 100 according to the embodiment of the present invention will be described in more detail with reference to FIG.
- first, the power control unit 108 performs control so that the luminance of the backlight of the display panel 102, which was lowered when shifting to the power saving mode, starts to increase again (step S121).
- when the power control unit 108 has performed control in step S121 so that the luminance of the backlight of the display panel 102 starts to increase again, and if, in the power saving mode, the operation of circuits such as the video signal processing circuit 134 that performs signal processing on the moving image to be displayed on the display panel 102 had been stopped, the power control unit 108 subsequently performs control so that the operation of the stopped image-quality circuits is restarted (step S122).
- when the power control unit 108 has performed control in step S122 so that the operation of the image-quality circuits stopped in the power saving mode is resumed, and if the frame rate conversion (for example, by the video signal processing circuit 134) had been stopped in the power saving mode, the power control unit 108 subsequently performs control so that the frame rate is restored (step S123).
- FIG. 6 to FIG. 8 illustrate some examples of the transition process to the power saving mode and the recovery process from the power saving mode in the image display apparatus 100 according to the embodiment of the present invention. FIGS. 6 to 8 are explanatory diagrams each showing an example of the transition of the backlight luminance of the display panel 102 and of the power consumption when the image display device 100 according to the embodiment of the present invention performs the transition process to the power saving mode and the recovery process from the power saving mode.
- as a result of the face detection process in the face detection unit 106, while one or more users' faces are included in the image input by the image input unit 104, the luminance of the backlight of the display panel 102 is not reduced. When the users' faces are no longer detected, the power control unit 108 gradually reduces the luminance of the backlight of the display panel 102.
- then, when a user's face is again included in the image, the power control unit 108 returns the luminance of the backlight of the display panel 102 to the original luminance in an instant.
- in this way, when a predetermined time has elapsed with the user's face no longer included in the image input by the image input unit 104 as a result of the face detection process in the face detection unit 106, it is determined that no one is viewing the displayed image, and the luminance of the backlight of the display panel 102 can be reduced to cut power consumption.
- in the example of FIG. 6, when the power control unit 108 returns the backlight luminance of the display panel 102 to the original luminance, it does so all at once.
- FIG. 7 shows the transition of the luminance of the backlight of the display panel 102 when the transition processing to the power saving mode and the recovery processing from the power saving mode are performed in the image display device 100 according to the embodiment of the present invention. Another example is shown graphically.
- then, when a user's face is again included in the image, the power control unit 108 returns the luminance of the backlight of the display panel 102 to the original luminance.
- the luminance of the backlight of the display panel 102 may be instantaneously restored to the original luminance, or may be gradually restored to the original luminance. By gradually returning the luminance of the backlight of the display panel 102 to the original luminance, the brightness can be adjusted naturally when someone starts watching the video.
- FIG. 8 shows, in a graph, an example of the transition of the power consumption of the image display device 100 when the transition process to the power saving mode and the recovery process from the power saving mode are performed in the image display device 100 according to the embodiment of the present invention. In the graph shown in FIG. 8, the vertical axis shows the power consumption of the image display device 100, and the horizontal axis shows time.
- as a result of the face detection process in the face detection unit 106, while one or more users' faces are included in the image input by the image input unit 104, the luminance of the backlight of the display panel 102 is not reduced. When the users' faces are no longer detected, the power control unit 108 gradually reduces the luminance of the backlight of the display panel 102.
- then, when the face detection process performed by the face detection unit 106 again finds the user's face in the image input by the image input unit 104, the case is shown in which the power control unit 108 raises the luminance of the backlight of the display panel 102, restarts the operation of the image-quality circuits, and returns the frame rate to its original value, so that the power consumption returns to its original level.
- when an image captured by the image input unit 104 includes a still picture showing a person's face, such as a poster, a face may be detected from that poster. Therefore, the face detection process in the face detection unit 106 may be provided with a function for excluding still images.
- for example, a face in which the entire face, or the position of parts of the face such as the eyes, nose, mouth, or ears, does not move for a predetermined time or more may be treated as a still image, and processing may be performed so that it is not recognized as a user's face.
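- one way to realize such a still-image exclusion is to track the detected face position over recent frames and ignore faces that do not move; the following Python sketch assumes per-frame (x, y) face positions as input, and the jitter and duration thresholds are illustrative assumptions:

```python
def is_still_face(history, max_jitter=2.0, min_frames=90):
    """history: recent (x, y) positions of one tracked face, one per frame.
    If the face has not moved by more than max_jitter pixels over at least
    min_frames consecutive frames, treat it as a still picture (e.g. a
    poster) and do not count it as a viewer. Thresholds are placeholders."""
    if len(history) < min_frames:
        return False
    xs = [p[0] for p in history]
    ys = [p[1] for p in history]
    return (max(xs) - min(xs) <= max_jitter and
            max(ys) - min(ys) <= max_jitter)
```

The same idea extends to tracking individual facial parts (eyes, nose, mouth, ears) instead of the whole-face position, as the text suggests.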
- further, when the face detection process is performed by the face detection unit 106, the angle of the line of sight in the pitch direction may be used to determine in what posture the user is viewing the moving image displayed on the display panel 102; for example, a message prompting the viewer to change the viewing position may be displayed to prevent dry eye or the like.
- in addition, the face detection process in the face detection unit 106 may be combined with a human sensor. As long as the human sensor indicates that a person is in the room, the moving image may be displayed on the display panel 102 as usual, and playback may be continued, even while the person is outside the viewing angle of the camera. Further, when returning from the power saving mode, if the human sensor can detect the person before that person enters the viewing angle of the image input unit 104, the return from the power saving mode can be triggered by the human sensor's detection, prior to any input from the image input unit 104. Since the return operation can thus be started before the person enters the viewing angle of the image input unit 104, the return from the power saving mode can be sped up.
- as described above, according to the present embodiment, the image input unit 104 captures an image and the face detection unit 106 executes the face detection process on the faces of users included in that image. Then, using the result of the face detection process, the power consumption of the image display device 100 can be reduced by lowering the luminance of the image displayed on the display panel 102 or by omitting part or all of the image processing. Conventionally, whether to shift to the power saving mode has been decided based on the elapsed time since the user last performed any operation, or by using a sensor such as an infrared or ultrasonic sensor.
- in such conventional methods, the user had to perform some operation in order to recover from the power saving mode. In the present embodiment, by contrast, whether or not the user is in front of the image display apparatus 100 serves as the determination criterion, so the user need not perform any operation either after entering the power saving mode or in order to recover from it.
- FIG. 9 is a flowchart for explaining a modification of the face detection process in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention.
- a modification of the face detection process in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention will be described with reference to FIG.
- the face detection unit 106 determines whether an image input by the image input unit 104 includes a user's face and at least one person is viewing an image displayed on the display panel 102 (step S131). ).
- if the face detection unit 106 determines that the image input by the image input unit 104 includes a user's face and at least one person is viewing the image displayed on the display panel 102, the face detection unit 106 executes the face detection process once every 15 frames (step S132). On the other hand, when the face detection unit 106 determines that no one is viewing the image displayed on the display panel 102, it executes the face detection process once every 2 frames (step S133).
- the frequency of the face detection process is not limited to this example.
- in this modification, the interval between face detection processes is made longer when it is determined that at least one person is viewing the image displayed on the display panel 102; however, the interval in that case may instead be made shorter.
- for example, while a face is being detected, the face detection process may be executed at a higher frequency (for example, once every 2 to 3 frames), and when no face is detected at a certain point in time, the face detection process may be executed over the entire region of the image captured by the image input unit 104 at a frequency lower than when a face is detected (for example, once every 10 to 15 frames).
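- the adaptive detection frequency of this modification can be sketched as follows; the helper names are assumptions, while the per-state intervals (once every 15 frames while a viewer is present, once every 2 frames while absent) are the figures given in the text for steps S132 and S133:

```python
def detection_interval_frames(viewer_present):
    """Run the detector less often while someone is confirmed to be
    watching (once per 15 frames) and more often when no face is found
    (once per 2 frames), so a returning viewer is noticed quickly."""
    return 15 if viewer_present else 2

def should_scan(frame_index, viewer_present):
    """True on the frames where face detection should actually run."""
    return frame_index % detection_interval_frames(viewer_present) == 0
```

Lowering the detection frequency while a viewer is present reduces the processing load; the opposite policy mentioned in the text is obtained simply by swapping the two interval values.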
- FIG. 10 is an explanatory diagram for explaining another modification of the face detection process in the face detection unit 106 in the image display device 100 according to the embodiment of the present invention.
- the explanatory diagram shown in FIG. 10 shows a case where a pseudo three-dimensional image is displayed using the image display device 100 according to the embodiment of the present invention.
- conventionally, in order to view a moving image (content) created as a three-dimensional image as such, a dedicated image display device has been necessary. Preparing such content and image display devices raises the threshold for enjoying three-dimensional images, so a three-dimensional image cannot be enjoyed easily.
- when the display panel 102 is regarded as a window, the range that can be seen through it differs between the case where the face is located at the position indicated by reference numeral 160a and the case where it is located at the position indicated by reference numeral 160b. Therefore, the face detection unit 106 performs the face detection process and the distance measurement process, and by changing the visible region of the image displayed on the display panel 102 according to the position of the face and the distance between the display panel and the viewer, a pseudo three-dimensional image can be displayed on the display panel 102.
- in order to display a pseudo three-dimensional image on the display panel 102, only the portion of the image produced by the panel drive signal supplied to the display panel 102 that excludes its surrounding region may be displayed on the display panel 102. For example, only the central range of about 80% in the horizontal and vertical directions of the image produced by the panel drive signal may be displayed. The visible region of the image can then be changed by moving this display region in accordance with changes in the position of the viewer's face, and as a result a pseudo three-dimensional image can be displayed on the display panel 102.
- for example, when the viewer raises his or her face from a certain position, the image displayed on the display panel 102 is shifted upward; when the viewer lowers the face, the image is shifted downward; and when the viewer shifts the face to the left from a certain position, the image displayed on the display panel 102 is shifted to the left, so that a pseudo three-dimensional image can be shown.
- in this way, the face detection unit 106 performs the face detection process and the distance measurement process, and their results are used to control the range of the image displayed on the display panel 102, changing that range according to changes in the position of the face and in the distance to the display panel 102.
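- the windowed display described above (showing roughly the central 80% of the driven image and sliding that region with the viewer's face) can be sketched as follows; the crop geometry, the gain factor, and the sign convention for the shift are illustrative assumptions:

```python
def viewport(face_dx, face_dy, img_w=1920, img_h=1080, margin=0.10, gain=0.5):
    """Crop window (left, top, right, bottom) for the pseudo 3D effect:
    only ~80% of the decoded frame is shown (10% margin on each side),
    and the window slides with the viewer's face offset (face_dx, face_dy,
    in pixels from the centred viewing position). The offset is clamped so
    the crop never leaves the frame. gain and margin are placeholders."""
    mx, my = int(img_w * margin), int(img_h * margin)
    ox = max(-mx, min(mx, int(face_dx * gain)))  # clamp horizontal slide
    oy = max(-my, min(my, int(face_dy * gain)))  # clamp vertical slide
    return (mx + ox, my + oy, img_w - mx + ox, img_h - my + oy)
```

Sliding the crop as the face moves changes which part of the image is visible, which is what produces the window-like, pseudo three-dimensional impression.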
- FIG. 11 is a flowchart for explaining another modification of the face detection process in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention.
- another modification of the face detection process in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention will be described with reference to FIG.
- the flowchart shown in FIG. 11 illustrates determining, during the face detection process in the face detection unit 106, whether or not to automatically switch the channel of the moving image displayed on the display panel 102 depending on whether the image input by the image input unit 104 includes a user's face. Conventionally, channel switching (zapping) has been performed by the user operating the main body of the image display device or a remote controller. In the modification shown here, if the face detection unit 106 finds a user's face in the image input by the image input unit 104, the channel is not changed, whereas if no user's face is included, that is, if it is determined that no one is viewing, the channels are switched sequentially.
- the face detection unit 106 determines whether the user's face is included in the image input by the image input unit 104 and someone is viewing the moving image displayed on the display panel 102 (step S141). ).
- the face detection unit 106 determines that the user's face is included in the image input by the image input unit 104 and someone is viewing the moving image displayed on the display panel 102. If so, the channel is left as it is (step S142). On the other hand, if the face detection unit 106 determines that no one is watching the moving image displayed on the display panel 102, the channel is sequentially changed after a predetermined time has elapsed (step S143).
- the method of changing the channel may be, for example, ascending or descending order of channel numbers. Further, when the channels have made one full round, the device may switch to another broadcast (for example, in the order analog broadcast → terrestrial digital broadcast → BS digital broadcast → CS digital broadcast) and continue changing channels sequentially.
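- the sequential channel change, including the move to the next broadcast system after one full round, can be sketched as follows; the channel line-up in BROADCASTS is a made-up example, and only the ordering rule (ascending channels, then analog → terrestrial digital → BS digital → CS digital, wrapping around) comes from the text:

```python
BROADCASTS = {  # hypothetical line-up for illustration only
    "analog": [1, 3, 4],
    "terrestrial_digital": [11, 21, 31],
    "bs_digital": [101, 103],
    "cs_digital": [201],
}
ORDER = ["analog", "terrestrial_digital", "bs_digital", "cs_digital"]

def next_channel(broadcast, channel):
    """Ascending zapping: step to the next channel of the current
    broadcast; after one full round, move to the first channel of the
    next broadcast system in ORDER (wrapping back to analog)."""
    chans = BROADCASTS[broadcast]
    i = chans.index(channel)
    if i + 1 < len(chans):
        return broadcast, chans[i + 1]
    nb = ORDER[(ORDER.index(broadcast) + 1) % len(ORDER)]
    return nb, BROADCASTS[nb][0]
```

In the modification this step runs only after the face detection unit 106 has reported that no one is viewing for the predetermined time (step S143).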
- the flowchart shown in FIG. 12 shows a method of estimating the distance between the user and the image display apparatus 100 during the face detection process in the face detection unit 106 and adjusting the sound volume according to the estimated distance. Conventionally, when the user moved away from or closer to the screen, the remote controller had to be operated in order to obtain an appropriate volume. In addition, since the appropriate volume differs from user to user, there was also the annoyance of having to adjust the volume for each user. Therefore, the face detection process is performed in the face detection unit 106, the user is detected and the distance between the user and the image display device 100 is measured from the detected face and the area of the face in the image, and appropriate volume control is performed based on the detected user and the measured distance.
- volume control is performed by the user of the image display apparatus 100 (step S151).
- the image display apparatus 100 captures an image of the face of the user who performed the volume control with the image input unit 104, recognizes that user's face with the face detection unit 106, and stores the volume information set by the user (step S152).
- the image input unit 104 captures the face of the user of the image display device 100.
- the face detection unit 106 then performs the face detection process using the captured image to recognize the user, and the distance between the user and the image display device 100 is measured (step S154).
- when the user recognition and the distance measurement in step S154 are completed, the volume information of that user is read out and the volume of the sound output from the speaker 142 is set (step S155).
- the CPU 114 may read out the volume information of the user and set the volume of the sound output from the speaker 142, for example.
- in this way, the face of the user of the image display device 100 is captured by the image input unit 104, the face detection unit 106 performs the face detection process using the captured image to recognize the user and to measure the distance between the user and the image display device 100, and a volume suited to that user can be set automatically. The volume information can be stored inside the image display device 100 without requiring any cumbersome operation.
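- the per-user volume memory of steps S151 to S155 can be sketched as follows; face recognition itself is assumed to be done elsewhere (users appear here as plain identifiers), and the class name and default volume are illustrative assumptions:

```python
class VolumeMemory:
    """Remember the volume each recognised user last set (steps S151-S152)
    and restore it when that user's face is recognised again (S153-S155)."""
    def __init__(self, default=10):
        self.default = default   # fallback for users never seen before
        self.by_user = {}

    def on_user_adjusts(self, user_id, volume):
        """S151-S152: the user changed the volume; store it for them."""
        self.by_user[user_id] = volume

    def on_user_recognised(self, user_id):
        """S153-S155: return the volume to apply for this user."""
        return self.by_user.get(user_id, self.default)
```

Distance-dependent scaling of the restored volume, as the text also proposes, could be layered on top of this lookup.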
- FIG. 13 is a flowchart for explaining another modification of the face detection process in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention.
- another modification of the face detection process in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention will be described with reference to FIG.
- the flowchart shown in FIG. 13 shows a method of estimating, during the face detection process in the face detection unit 106, the distance between the user and the image display device 100 and adjusting the sound volume according to the estimated distance.
- in this modification, the face detection unit 106 performs the face detection process and, using the detected face area and line-of-sight direction, approximates the face area the user would present when directly facing the display panel 102. The calculated face area information and the volume information are stored, and the stored face area and volume information are then used for volume control.
- volume control is performed by the user of the image display apparatus 100 (step S161).
- the image display apparatus 100 captures an image of the face of the user who performed the volume control with the image input unit 104, recognizes that user's face with the face detection unit 106, and stores the volume information set by the user (step S162).
- at this time, the face detection unit 106 approximates, from the face area and the line-of-sight direction in the image input by the image input unit 104, the face area the user would present when directly facing the display panel 102. Note that the user recognition, face area measurement, and volume information storage may be performed at least twice for the same user, so that the relationship between face area and volume can be treated as linear.
- the image input unit 104 captures the face of the user of the image display apparatus 100.
- the face detection unit 106 performs the face detection process using the captured image to recognize the user and, from the face area occupied in the image input by the image input unit 104 and the direction of the line of sight, approximately calculates the face area the user would present when facing the front of the display panel 102 (step S164).
- when the user recognition, face area measurement, and approximate calculation by the face detection unit 106 in step S164 are completed, the volume information of the user is read out and the volume is set (step S165).
- when a plurality of users are detected by the face detection unit 106 in step S164, the volume information of the user with the largest face area may be read out to set the volume, or the volume may be set using the stored face area closest to the measured one.
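- the face-area based volume setting of FIG. 13 can be sketched as follows; the idea of storing at least two (area, volume) samples so the relation can be treated as linear comes from the text, while the cosine foreshortening model for the frontal-area approximation and the two-point linear fit are illustrative assumptions:

```python
import math

def frontal_area(detected_area, yaw_deg):
    """Approximate the face area the user would present when facing the
    panel straight on: a turned face appears foreshortened by roughly
    cos(yaw), so divide that factor back out (a crude assumed model;
    clamped to avoid blow-up near profile views)."""
    return detected_area / max(math.cos(math.radians(yaw_deg)), 0.1)

def volume_for_area(samples, area):
    """samples: two or more stored (frontal_area, volume) pairs for one
    user. Fit a line through the first and last sample and interpolate
    (or extrapolate) the volume for the currently measured area."""
    (a0, v0), (a1, v1) = samples[0], samples[-1]
    slope = (v1 - v0) / (a1 - a0)
    return v0 + slope * (area - a0)
```

A larger measured face area means the user is closer, so the fitted line naturally maps proximity to a quieter setting if that is how the stored samples were taken.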
- FIG. 14 is a flowchart for explaining another modification of the face detection processing in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention.
- another modification of the face detection process in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention will be described with reference to FIG.
- in the modification shown in FIG. 14, the face detection unit 106 performs the face detection process on the image captured by the image input unit 104 and, as a result of the face detection process, determines whether there is a person viewing the video displayed on the display panel 102.
- conventionally, when the user temporarily left the television in the middle of watching content such as a drama, for example to go to the bathroom, the user had to stop playback of the content by pressing the stop button or the like on the remote controller. Then, when the user returned to the television and resumed viewing, the user had to press the playback button on the remote controller to resume playback of the content. Therefore, in the modification shown in FIG. 14, whether there is a person watching the video displayed on the display panel 102 is determined by the face detection process, without any user operation, and the content playback is controlled using the determination result.
- first, the moving image captured by the image input unit 104 is analyzed by the face detection unit 106, and the area of the user's face in that image is calculated; from this area, the distance between the image display device 100 and the person is measured. The face detection unit 106 then determines whether or not the person is closer than a predetermined distance (step S171). Note that the distance between the image display device 100 and the viewer need not be measured strictly, and may be determined within a rough range (for example, whether the face is at a distance of 2 m 50 cm to 3 m from the image input unit 104).
- if the result of the determination by the face detection unit 106 in step S171 is that the person is closer than the predetermined distance, the power control unit 108 performs no control, and the process returns to step S171 to repeat the determination by the face detection unit 106.
- if the result of the determination by the face detection unit 106 in step S171 is that the person is farther than the predetermined distance, it is then detected whether the brightness of the captured image has reached a predetermined brightness, that is, whether the brightness of the room in which the image display device 100 is installed has reached a predetermined brightness (step S172).
- for example, when the image input unit 104 is configured with a CCD image sensor, the average value of the image is acquired from the CCD image sensor, and the CPU 114 may determine whether the acquired average value is equal to or greater than a predetermined value.
- when the process of detecting whether the brightness of the room has reached the predetermined brightness has been performed in step S172, an image is subsequently input from the image input unit 104 to the face detection unit 106 (step S173), and the face detection unit 106 executes the process of detecting the user's face.
- when the face detection unit 106 performs the process of detecting the user's face, it is determined whether a face detection error has occurred, that is, whether any user's face exists in the image input from the image input unit 104 (step S174). As a result of the determination in step S174, when no user's face exists in the image input from the image input unit 104 and a face detection error occurs in the face detection unit 106, it is subsequently determined whether the face detection error has continued for a predetermined time (for example, 1 minute) (step S175).
- as a result of the determination in step S175, when the face detection error has continued in the face detection unit 106 for the predetermined time, the power control unit 108 shifts the operation mode of the image display device 100 to the time shift mode (step S176).
- when the power control unit 108 shifts the operation mode of the image display apparatus 100 to the time shift mode in step S176, the process returns to step S171 and the distance determination is performed again.
- the time shift mode in this embodiment means a mode in which the playback of the content the user was watching is stopped and the luminance of the backlight of the display panel 102 is lowered to darken the display on the display panel 102.
- as a result of the determination in step S177, when it is determined that the operation mode of the image display apparatus 100 is the time shift mode, a process is executed to recover from the time shift mode and resume playback of the content whose playback was stopped in step S176 (step S178). When recovering from the time shift mode, a process for restoring the lowered luminance of the backlight of the display panel 102 is also executed.
- in this way, when the user leaves the front of the screen, playback of the content can be stopped without any operation, and when the user returns, playback of the content can be resumed from the position where it stopped, again without any operation.
- moreover, since the luminance of the backlight of the display panel 102 is lowered while the user is away from the front of the screen, this also contributes to reducing power consumption.
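- the time shift behaviour of steps S171 to S178 can be sketched as follows; the player interface (absence time and playback position passed in from outside) is an assumption made for illustration:

```python
class TimeShift:
    """When the viewer has been absent past a threshold, remember the
    playback position and stop playback (step S176); when a face is
    detected again, hand back that position so playback resumes from
    where it stopped (step S178)."""
    def __init__(self, absence_threshold_s=60.0):
        self.threshold = absence_threshold_s  # 1 minute in the text
        self.paused_at = None                 # position where we paused

    def on_absence(self, absence_s, position_s):
        """Called while no face is detected; returns the paused position
        once the absence has lasted long enough, else None."""
        if self.paused_at is None and absence_s >= self.threshold:
            self.paused_at = position_s
        return self.paused_at

    def on_face_detected(self):
        """Called when a face reappears; returns the position to resume
        from (or None if playback was never paused) and clears the state."""
        pos, self.paused_at = self.paused_at, None
        return pos
```

Restoring the backlight luminance on recovery, which the text also describes, would be triggered at the same point as `on_face_detected`.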
- FIG. 15 is a flowchart for explaining another modification of the face detection process in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention.
- another modification of the face detection process in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention will be described with reference to FIG.
- In step S181, while content is being played back and displayed on the display panel 102, it is detected whether the brightness of the captured image has reached a predetermined level, that is, whether the brightness of the room in which the image display device 100 is installed has reached a predetermined level (step S181). As described above, when the image input unit 104 is configured with a CCD image sensor, for example, the average value of the captured image can be obtained from the CCD image sensor, and the CPU 114 may determine whether this average value is equal to or greater than a predetermined value.
- When the face detection unit 106 determines in step S183 that the detected face is a smile, it then determines whether the smile is of a predetermined intensity or more (step S184). If, as a result of the determination in step S184, the face detection unit 106 determines that the smile is equal to or greater than the predetermined intensity, auto-chapter processing that sets a chapter in the content being played back is executed (step S185). The chapter may be set at the moment the person smiles, or slightly before the smile (for example, 15 seconds before). Setting a chapter slightly before the smile lets the user watch the content from just before the laugh. On the other hand, if the face detection unit 106 determines in step S184 that the smile is not of the predetermined intensity, the process returns to step S181 and the room brightness detection process is executed again.
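The auto-chapter rule above — a chapter per sufficiently intense smile, offset slightly before the smile — can be sketched as follows. The function name, the intensity scale, and the threshold value are illustrative assumptions.

```python
def auto_chapter(events, intensity_threshold=0.8, lead_s=15.0):
    """Return chapter positions for smile events above a threshold.

    events: list of (timestamp_s, smile_intensity) pairs produced by face
    detection during playback. A chapter is placed lead_s seconds before
    each qualifying smile so playback can start just before the laugh.
    """
    chapters = []
    for t, intensity in events:
        if intensity >= intensity_threshold:
            # Clamp at 0 so a smile near the start still gets a chapter.
            chapters.append(max(0.0, t - lead_s))
    return chapters
```

With this sketch, a weak smile produces no chapter, while a strong smile at 200 s yields a chapter at 185 s.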
- With this face detection process, a desired scene can easily be found by attaching a mark (chapter) to the content being viewed depending on whether the detected face is a smile.
- By measuring the cumulative time during which the viewer is not laughing, the ratio of laughing to non-laughing time can be obtained per content item, making it possible to check after viewing whether the content was interesting.
- Here the cumulative non-laughing time is measured, but the cumulative laughing time may be measured as well.
- In this example, the content is marked depending on whether the detected face is a smile, but the type of facial expression used to mark the content is not limited to this example.
- The content on which a mark is set is not limited to a moving image. For example, when still images are displayed sequentially on the display panel 102 (when a so-called slide-show process is executed), a mark may be set on an image to which the user reacted.
- FIG. 16 is a flowchart for explaining another modified example of the face detection process in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention.
- another modification of the face detection process in the face detection unit 106 in the image display apparatus 100 according to the embodiment of the present invention will be described with reference to FIG.
- Face detection processing is performed by the face detection unit 106 on the image captured by the image input unit 104.
- Depending on the number of viewers and the state of their reactions, the content viewing state is grasped and viewing control is performed.
- Conventionally, the user's viewing situation could be grasped only from the channel selection status.
- This modification is characterized in that the content viewing state is grasped and viewing control is performed according to the number of viewers and the behavior of those viewers.
- The moving image captured by the image input unit 104 is analyzed by the face detection unit 106, and the area of the user's face in the captured moving image is calculated.
- From that area, the distance between the image display device 100 and the person is estimated.
- The face detection unit 106 determines whether the viewer is closer than a predetermined distance (step S191). Note that the distance between the image display device 100 and the viewer need not be measured precisely; a rough range is sufficient (for example, that the face is at a distance of 2 m 50 cm to 3 m from the image input unit 104).
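Estimating a rough distance from the detected face's size, as described above, can be done with a simple pinhole-camera proportionality: the face's apparent width shrinks inversely with distance. The calibration constants below (a face appearing 100 px wide at 2 m) and the 0.5 m bucketing are illustrative assumptions chosen to match the patent's "rough range" requirement.

```python
def rough_distance_m(face_width_px, ref_width_px=100.0, ref_distance_m=2.0):
    """Estimate viewer distance from the detected face width (pinhole model).

    Assumes a face appears ref_width_px wide at ref_distance_m; the result
    is bucketed to 0.5 m steps since only a coarse range is needed.
    """
    if face_width_px <= 0:
        return None
    d = ref_distance_m * ref_width_px / face_width_px
    return round(d * 2) / 2  # nearest 0.5 m


def is_too_close(face_width_px, threshold_m=1.0):
    """True when the estimated distance is under the predetermined distance."""
    d = rough_distance_m(face_width_px)
    return d is not None and d < threshold_m
```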
- If the face detection unit 106 determines in step S191 that the viewer is closer than the predetermined distance, the viewer is excluded from control by the power control unit 108, and the process returns to step S191 to repeat the determination by the face detection unit 106.
- If the face detection unit 106 determines in step S191 that the viewer is farther than the predetermined distance, it is then detected whether the brightness of the room in which the image display device 100 is installed has reached a predetermined level (step S192).
- As described above, when the image input unit 104 is configured with a CCD image sensor, for example, the average value of the captured image is acquired from the CCD image sensor, and the CPU 114 may determine whether this average value is equal to or greater than a predetermined value.
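The room-brightness check described here reduces to comparing the mean pixel value of the captured frame against a predetermined value. A minimal sketch, assuming 8-bit luminance samples and an illustrative threshold:

```python
def room_is_bright(pixels, threshold=64):
    """Decide whether the room has reached a predetermined brightness.

    pixels: iterable of luminance values (0-255) from the image sensor.
    Mirrors the described check: take the average value of the captured
    image and compare it with a predetermined value.
    """
    pixels = list(pixels)
    if not pixels:
        return False  # no frame available: treat as not bright
    return sum(pixels) / len(pixels) >= threshold
```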
- When the process of detecting whether the room brightness has reached the predetermined level is performed in step S192, an image is then input from the image input unit 104 to the face detection unit 106 (step S193), and the face detection unit 106 executes the user's face detection process.
- The face detection unit 106 executes the face detection process and determines whether the user's face has been detected in the image input from the image input unit 104 (step S194). If, as a result of the determination in step S194, the user's face can be detected, it is then determined whether the detected face is within a predetermined angle range with respect to the image input unit 104 (step S195).
- If it is determined in step S195 that the face is not within the predetermined angle range with respect to the image input unit 104, the process returns to step S191 and the distance measurement is performed again. On the other hand, if it is determined in step S195 that the face is within the predetermined angle range, processing for analyzing the degree of concentration on the content is executed (step S196).
- The analysis of the degree of concentration on the content determines whether the content is being viewed attentively, for example by analyzing the gaze direction of the user's face detected by the face detection unit 106.
- For example, the time during which the user's gaze faces the image input unit 104 may be divided by the content playback time.
- The degree of concentration is considered higher the longer the gaze faces the direction of the display panel 102, and lower the shorter that time is.
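The ratio just described — gaze-on-screen time divided by playback time — can be sketched directly. The sampling representation (booleans at a fixed interval) is an illustrative assumption; the patent only specifies the division itself.

```python
def concentration_degree(gaze_samples, sample_interval_s, playback_time_s):
    """Degree of concentration: time the gaze faces the display panel
    divided by the content playback time.

    gaze_samples: booleans, True when the detected gaze faces the display,
    sampled every sample_interval_s seconds.
    """
    if playback_time_s <= 0:
        return 0.0
    facing_time = sum(1 for g in gaze_samples if g) * sample_interval_s
    return min(1.0, facing_time / playback_time_s)
```

A low value could then trigger the recommendation or channel-switching behavior described below in this modification.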
- As another analysis of the degree of concentration on the content, the vertical (pitch-direction) motion of the user's face detected by the face detection unit 106 may be analyzed to detect whether the viewer nods in response to the content, and the degree of concentration may be considered higher the more often the user nods.
- If the concentration on the currently displayed content is low, that is, if the viewer is not very interested in it, other content may be recommended or the displayed content may be switched.
- The recommended content may be selected based on information such as the number of viewers and their gender, obtained as a result of the face detection processing performed by the face detection unit 106 on the image input from the image input unit 104.
- If the user's face cannot be detected in step S194 from the image input from the image input unit 104, it is determined that the content is being displayed but not viewed, and, for example, the CPU 114 measures the content non-viewing time (step S197). After the non-viewing time measurement, the process returns to step S191 and the distance measurement is performed again.
- In this way, the face detection unit 106 performs the user's face detection process, uses the detection result to analyze the degree of concentration on the content, grasps the detailed viewing status of the currently displayed content, and can control content playback based on the degree of concentration obtained by the analysis.
- As described above, the image display device 100 according to the embodiment of the present invention includes the image input unit 104, which captures images in the same direction as the display panel 102 displays moving images, and the face detection unit 106, which performs face detection processing on the images captured by the image input unit 104. The internal operation of the image display device 100 can then be controlled depending on whether the face detection unit 106 detects the user's face, which contributes to reducing power consumption.
- In addition, the viewing status of the content displayed on the display panel 102 can be analyzed and its playback controlled.
- For example, when a child appears in the image supplied from the image input unit 104 and the distance between the image input unit 104 and the face is within a predetermined distance, an effect such as blurring, a message, or a sound may be applied to the moving image displayed on the display panel 102 to alert the viewer to move away from the front of the display panel 102.
- REFERENCE SIGNS: 100 image display device, 102 display panel, 104 image input unit, 106 face detection unit, 108 power control unit, 110 SDRAM, 112 flash ROM, 114 CPU, 116 remote controller light receiving unit, 118 network I/F, 120 network terminal, 122 terrestrial tuner, 124 digital tuner, 126 audio
Abstract
Description
[1] Overview of the image display device according to an embodiment of the present invention
[2] Operation of the image display device according to an embodiment of the present invention
[2-1] Transition to the power saving mode
[2-2] Recovery from the power saving mode
[3] Modifications of the operation of the image display device according to an embodiment of the present invention
[3-1] Changing the detection frequency
[3-2] Pseudo-stereoscopic image display
[3-3] Automatic channel switching
[3-4] Distance-dependent volume control
[3-5] Content playback control
[3-6] Setting marks on content
[3-7] Grasping the content viewing state
[4] Summary
First, an overview of the image display device according to an embodiment of the present invention will be described. FIG. 1 is an explanatory diagram outlining the image display device 100 according to an embodiment of the present invention. The overview of the image display device according to an embodiment of the present invention will be described below with reference to FIG. 1.
[2-1] Transition to the power saving mode
FIG. 3 is a flowchart describing the operation of the image display device 100 according to an embodiment of the present invention. The operation of the image display device 100 according to an embodiment of the present invention will be described below with reference to FIG. 3.
T.F. Cootes, G.J. Edwards, and C.J. Taylor, "Active Appearance Models", Proc. Fifth European Conf. Computer Vision, H. Burkhardt and B. Neumann, eds., vol. 2, pp. 484-498, 1998
FIG. 5 is a flowchart describing the recovery process from the power saving mode in the image display device 100 according to an embodiment of the present invention. The recovery process from the power saving mode in the image display device 100 according to an embodiment of the present invention will be described in more detail below with reference to FIG. 5.
Various modifications can be applied to the face detection process in the face detection unit 106. Various modifications applied to the face detection process in the face detection unit 106 will be described below.
FIG. 9 is a flowchart describing a modification of the face detection process in the face detection unit 106 of the image display device 100 according to an embodiment of the present invention. The modification of the face detection process in the face detection unit 106 will be described below with reference to FIG. 9.
FIG. 10 is an explanatory diagram describing another modification of the face detection process in the face detection unit 106 of the image display device 100 according to an embodiment of the present invention. The diagram in FIG. 10 shows a case where a pseudo-stereoscopic image is displayed using the image display device 100 according to an embodiment of the present invention. Conventionally, displaying a stereoscopic image required a moving image (content) created as a stereoscopic image and an image display device capable of displaying it as such. However, preparing such content and display devices raised the barrier to enjoying stereoscopic images, so they could not be enjoyed casually. Therefore, the face detection unit 106 detects the user's face contained in the image input by the image input unit 104, and the visible region of the image displayed on the display panel 102 is changed according to the position of the detected face, thereby displaying a pseudo-stereoscopic image.
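The pseudo-stereoscopic technique just described — shifting the visible region of a larger image according to the detected face position, so the display behaves like a window — can be sketched as a crop offset calculation. The function name, the `gain` value, and the [0, 1] face-coordinate convention are illustrative assumptions.

```python
def visible_window(face_x_norm, face_y_norm,
                   image_w, image_h, window_w, window_h, gain=0.5):
    """Compute the crop (x, y, w, h) of a larger image to display,
    shifted opposite to the viewer's head position for motion parallax.

    face_x_norm/face_y_norm: detected face center in [0, 1] within the
    camera frame. gain controls how strongly the window follows the head.
    """
    max_dx = image_w - window_w
    max_dy = image_h - window_h
    # Moving right in front of the camera reveals more of the image's
    # left side, as looking "around" a window frame would.
    cx = (1.0 - face_x_norm) * gain + (1.0 - gain) / 2
    cy = (1.0 - face_y_norm) * gain + (1.0 - gain) / 2
    x = int(min(max(cx, 0.0), 1.0) * max_dx)
    y = int(min(max(cy, 0.0), 1.0) * max_dy)
    return x, y, window_w, window_h
```

With the face centered, the crop is centered; as the face moves to one side, the crop slides toward the opposite side of the source image.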
FIG. 11 is a flowchart describing another modification of the face detection process in the face detection unit 106 of the image display device 100 according to an embodiment of the present invention. That modification will be described below with reference to FIG. 11.
FIG. 12 is a flowchart describing another modification of the face detection process in the face detection unit 106 of the image display device 100 according to an embodiment of the present invention. That modification will be described below with reference to FIG. 12.
FIG. 14 is a flowchart describing another modification of the face detection process in the face detection unit 106 of the image display device 100 according to an embodiment of the present invention. That modification will be described below with reference to FIG. 14.
FIG. 15 is a flowchart describing another modification of the face detection process in the face detection unit 106 of the image display device 100 according to an embodiment of the present invention. That modification will be described below with reference to FIG. 15.
FIG. 16 is a flowchart describing another modification of the face detection process in the face detection unit 106 of the image display device 100 according to an embodiment of the present invention. That modification will be described below with reference to FIG. 16.
Various modifications applied to the face detection process in the face detection unit 106 have been described above. As explained, the image display device 100 according to an embodiment of the present invention includes the image input unit 104, which captures images in the same direction as the display panel 102 displays moving images, and the face detection unit 106, which performs face detection processing on the images captured by the image input unit 104. The internal operation of the image display device 100 can be controlled depending on whether the face detection unit 106 detects the user's face, contributing to reduced power consumption. In addition, information such as the size and orientation of the face and the gaze direction detected by the face detection unit 106 can be used to analyze the viewing status of the content displayed on the display panel 102 and to control its playback.
such effects, messages, or sounds may be output to alert the user to move away from the front of the display panel 102.
102 display panel
104 image input unit
106 face detection unit
108 power control unit
110 SDRAM
112 flash ROM
114 CPU
116 remote controller light receiving unit
118 network I/F
120 network terminal
122 terrestrial tuner
124 digital tuner
126 audio A/D conversion circuit
128 video decoder
130 MPEG decoder
132 audio signal processing circuit
134 video signal processing circuit
136 HDMI receiver
138 HDMI terminal
140 audio amplifier circuit
142 speaker
144 graphics generation circuit
146 panel drive circuit
200 remote controller
Claims (23)
- An image display device comprising: an image display unit that displays a still image or a moving image; an imaging unit that captures images in the direction in which the image display unit displays the still image or moving image; and a face detection unit that performs, at a predetermined interval, a face detection process for detecting a user's face contained in the image captured by the imaging unit, wherein the predetermined interval is variable depending on whether a user's face is contained in the image captured by the imaging unit, and wherein, in the face detection process, the face detection unit does not detect a face as a user's face when the position of the whole or part of the detected face does not change for a predetermined time.
- The image display device according to claim 1, further comprising a control unit that executes a control operation on the image display unit according to the result of the face detection process of the face detection unit, wherein the control unit lowers the luminance of the image display unit when, as a result of the face detection process, no user's face is contained in the image captured by the imaging unit.
- The image display device according to claim 2, wherein, when the control unit determines that no user's face is contained in the image captured by the imaging unit, the control unit makes the predetermined interval longer than when it determines that a user's face is contained.
- The image display device according to claim 2, wherein the control unit lowers the luminance of the image display unit after a predetermined time has elapsed since determining that no user's face is contained in the image captured by the imaging unit.
- The image display device according to claim 2, wherein the control unit detects the brightness of the room in which the device is installed using the image captured by the imaging unit, and does not lower the luminance of the image display unit when the room brightness has not reached a predetermined value.
- The image display device according to claim 2, wherein, after lowering the luminance of the image display unit, the control unit raises the luminance when a user's face comes to be contained in the image captured by the imaging unit.
- The image display device according to claim 2, wherein the control unit omits part or all of the image processing applied to images displayed on the image display unit when a predetermined time has elapsed since lowering the luminance of the image display unit.
- The image display device according to claim 2, wherein, when the control unit determines that a user's face is contained in the image captured by the imaging unit but the user's eyes are closed, the control unit lowers the luminance of the image display unit after a predetermined time has elapsed since that determination.
- The image display device according to claim 1, further comprising a control unit that executes a control operation relating to still or moving images according to the result of the face detection process, wherein the control unit changes the range of the image displayed on the image display unit according to the position of the face detected by the face detection unit and the distance of the face from the imaging unit.
- The image display device according to claim 1, further comprising a control unit that executes a control operation relating to still or moving images according to the result of the face detection process, wherein the control unit selects the image to be displayed on the image display unit according to the state of the face detected by the face detection unit.
- The image display device according to claim 1, further comprising a control unit that executes a control operation relating to still or moving images according to the result of the face detection process, wherein the control unit controls whether image quality enhancement processing is applied to the image displayed on the image display unit according to the presence or absence of a face detected by the face detection unit.
- The image display device according to claim 1, further comprising a control unit that executes a control operation relating to still or moving images according to the result of the face detection process, wherein the control unit automatically and sequentially switches the channel displayed on the image display unit according to the presence or absence of a face detected by the face detection unit.
- The image display device according to claim 1, further comprising: an audio output unit that outputs audio; and a control unit that executes a control operation relating to audio according to the result of the face detection process, wherein the control unit controls the volume of the audio output from the audio output unit according to the state of the face detected by the face detection unit.
- The image display device according to claim 13, wherein the control unit controls the volume of the audio output from the audio output unit according to the direction and area of the face detected by the face detection unit.
- The image display device according to claim 14, wherein the control unit calculates, from the direction and area of the detected face, the area the face would have when facing forward, and controls the volume of the audio output from the audio output unit using the calculation result.
- The image display device according to claim 1, further comprising a control unit that executes a control operation relating to still or moving images according to the result of the face detection process, wherein the control unit pauses playback of the moving image displayed on the image display unit when it determines, as a result of the face detection process, that no user's face is contained in the image captured by the imaging unit.
- The image display device according to claim 1, further comprising a control unit that executes a control operation relating to still or moving images according to the result of the face detection process, wherein the control unit sets a mark on the image displayed on the image display unit when it determines, as a result of the face detection process, that the face detected by the face detection unit has reacted to that image.
- The image display device according to claim 17, wherein the control unit sets a chapter as the mark when the image displayed on the image display unit is a moving image.
- The image display device according to claim 18, wherein the control unit sets the chapter at a point a predetermined time before the point at which the face detected by the face detection unit reacted.
- The image display device according to claim 1, further comprising a control unit that executes a control operation relating to still or moving images according to the result of the face detection process, wherein the control unit acquires the viewing status of the image displayed on the image display unit using the result of the face detection process in the face detection unit.
- The image display device according to claim 20, wherein the control unit selects the image to be displayed on the image display unit according to the viewing status.
- A control method comprising: an imaging step in which an image display unit that displays a still image or a moving image captures images in the direction in which the still image or moving image is displayed; and a face detection step of performing, at a predetermined interval, a face detection process for detecting a user's face contained in the image captured in the imaging step, wherein the predetermined interval is variable depending on whether a user's face is contained in the image captured in the imaging step, and wherein, in the face detection process, a face whose whole or partial position does not change for a predetermined time is not detected as a user's face.
- A computer program causing a computer to execute: an imaging step in which an image display unit that displays a still image or a moving image captures images in the direction in which the still image or moving image is displayed; and a face detection step of performing, at a predetermined interval, a face detection process for detecting a user's face contained in the image captured in the imaging step, wherein the predetermined interval is variable depending on whether a user's face is contained in the image captured in the imaging step, and wherein, in the face detection process, a face whose whole or partial position does not change for a predetermined time is not detected as a user's face.
Priority Applications (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN2009801305540A CN102119530B (zh) | 2008-08-22 | 2009-08-21 | 图像显示设备、控制方法 |
RU2011104474/07A RU2493613C2 (ru) | 2008-08-22 | 2009-08-21 | Устройство воспроизведения изображений и способ управления |
EP09808309.0A EP2315439B1 (en) | 2008-08-22 | 2009-08-21 | Image display device, control method and computer program |
JP2010525710A JP5553022B2 (ja) | 2008-08-22 | 2009-08-21 | 画像表示装置、制御方法およびコンピュータプログラム |
US12/737,697 US9104408B2 (en) | 2008-08-22 | 2009-08-21 | Image display device, control method and computer program |
BRPI0917133-9A BRPI0917133B1 (pt) | 2008-08-22 | 2009-08-21 | dispositivo de exibição de imagem, método de controle, e, meio de armazenamento legível por computador |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2008-214618 | 2008-08-22 | ||
JP2008214618 | 2008-08-22 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2010021373A1 true WO2010021373A1 (ja) | 2010-02-25 |
Family
ID=41707247
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2009/064625 WO2010021373A1 (ja) | 2008-08-22 | 2009-08-21 | 画像表示装置、制御方法およびコンピュータプログラム |
Country Status (7)
Country | Link |
---|---|
US (1) | US9104408B2 (ja) |
EP (1) | EP2315439B1 (ja) |
JP (2) | JP5553022B2 (ja) |
CN (2) | CN103379300B (ja) |
BR (1) | BRPI0917133B1 (ja) |
RU (1) | RU2493613C2 (ja) |
WO (1) | WO2010021373A1 (ja) |
Cited By (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011030624A1 (ja) * | 2009-09-11 | 2011-03-17 | ソニー株式会社 | 表示装置および制御方法 |
JP2011259384A (ja) * | 2010-06-11 | 2011-12-22 | Sharp Corp | 撮像装置、表示装置、プログラム及び記録媒体 |
JP2012089909A (ja) * | 2010-10-15 | 2012-05-10 | Sony Corp | 情報処理装置、同期方法およびプログラム |
WO2012066705A1 (ja) * | 2010-11-17 | 2012-05-24 | パナソニック株式会社 | 表示装置、表示制御方法、携帯電話及び半導体装置 |
CN102655576A (zh) * | 2011-03-04 | 2012-09-05 | 索尼公司 | 信息处理设备、信息处理方法和程序 |
JP2012181054A (ja) * | 2011-02-28 | 2012-09-20 | Toshiba Corp | 情報出力装置及び情報出力方法 |
JP2012249867A (ja) * | 2011-06-03 | 2012-12-20 | Nintendo Co Ltd | 情報処理プログラム、情報処理装置、情報処理システム、および情報処理方法 |
JP2013070296A (ja) * | 2011-09-26 | 2013-04-18 | Hitachi Consumer Electronics Co Ltd | 立体映像処理装置、立体表示装置、立体映像処理方法、および受信装置 |
WO2013108438A1 (ja) * | 2012-01-16 | 2013-07-25 | シャープ株式会社 | 表示装置、表示方法及び表示プログラム |
JP2013182217A (ja) * | 2012-03-02 | 2013-09-12 | Toshiba Corp | 電子機器、電子機器の制御方法、制御プログラム及び記録媒体 |
WO2013175735A1 (ja) * | 2012-05-22 | 2013-11-28 | パナソニック株式会社 | 表示制御装置及び表示制御方法 |
JP2014519091A (ja) * | 2011-05-12 | 2014-08-07 | アップル インコーポレイテッド | 存在センサ |
WO2014199666A1 (ja) * | 2013-06-13 | 2014-12-18 | シャープ株式会社 | 表示装置 |
JP2015036925A (ja) * | 2013-08-15 | 2015-02-23 | 富士ゼロックス株式会社 | 情報処理装置及び情報処理プログラム |
JP2015039066A (ja) * | 2010-12-24 | 2015-02-26 | 株式会社東芝 | 立体視映像表示システム、立体視映像表示装置および出力制御方法 |
JP6055535B1 (ja) * | 2015-12-04 | 2016-12-27 | 株式会社ガイア・システム・ソリューション | 集中度処理システム |
JP2019168687A (ja) * | 2018-03-20 | 2019-10-03 | ジョンソン・アンド・ジョンソン・ビジョン・ケア・インコーポレイテッドJohnson & Johnson Vision Care, Inc. | 近視開始及び/又は近視進行に対する近距離視認の影響を低減するためのシステムを有するデバイス |
US11418565B2 (en) | 2018-04-13 | 2022-08-16 | Sony Corporation | Space information sharing apparatus, space information sharing method, and program |
Families Citing this family (55)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5556098B2 (ja) | 2009-09-11 | 2014-07-23 | ソニー株式会社 | 表示方法及び表示装置 |
JP5568929B2 (ja) | 2009-09-15 | 2014-08-13 | ソニー株式会社 | 表示装置および制御方法 |
US8760517B2 (en) | 2010-09-27 | 2014-06-24 | Apple Inc. | Polarized images for security |
US9241195B2 (en) * | 2010-11-05 | 2016-01-19 | Verizon Patent And Licensing Inc. | Searching recorded or viewed content |
US20120154351A1 (en) * | 2010-12-21 | 2012-06-21 | Hicks Michael A | Methods and apparatus to detect an operating state of a display based on visible light |
CN102761645B (zh) * | 2011-04-26 | 2016-05-18 | 富泰华工业(深圳)有限公司 | 电子设备及其控制方法 |
US20120287031A1 (en) | 2011-05-12 | 2012-11-15 | Apple Inc. | Presence sensing |
FR2978267A1 (fr) * | 2011-07-18 | 2013-01-25 | St Microelectronics Rousset | Procede et dispositif de controle d'un appareil en fonction de la detection de personnes a proximite de l'appareil |
JP2013055424A (ja) * | 2011-09-01 | 2013-03-21 | Sony Corp | 撮影装置、パターン検出装置、および電子機器 |
CN103037153A (zh) * | 2011-09-30 | 2013-04-10 | 联想(北京)有限公司 | 一种基于摄像头的监测方法及具有摄像头的电子设备 |
CN103123537B (zh) * | 2011-11-21 | 2016-04-20 | 国基电子(上海)有限公司 | 电子显示设备及其省电方法 |
US20120092248A1 (en) * | 2011-12-23 | 2012-04-19 | Sasanka Prabhala | method, apparatus, and system for energy efficiency and energy conservation including dynamic user interface based on viewing conditions |
KR20130078676A (ko) * | 2011-12-30 | 2013-07-10 | 삼성전자주식회사 | 디스플레이장치 및 그 제어방법 |
JP5957893B2 (ja) * | 2012-01-13 | 2016-07-27 | ソニー株式会社 | 情報処理装置及び情報処理方法、並びにコンピューター・プログラム |
US20130194172A1 (en) * | 2012-01-30 | 2013-08-01 | Cellco Partnership D/B/A Verizon Wireless | Disabling automatic display shutoff function using face detection |
JP5204323B1 (ja) * | 2012-03-02 | 2013-06-05 | 株式会社東芝 | 電子機器、電子機器の制御方法、制御プログラム及び記録媒体 |
JP2015121567A (ja) * | 2012-04-11 | 2015-07-02 | シャープ株式会社 | 表示制御装置及び表示装置 |
US9633186B2 (en) | 2012-04-23 | 2017-04-25 | Apple Inc. | Systems and methods for controlling output of content based on human recognition data detection |
JP6101438B2 (ja) * | 2012-06-15 | 2017-03-22 | サターン ライセンシング エルエルシーSaturn Licensing LLC | 情報処理装置及び情報処理方法、コンピューター・プログラム、並びに情報通信システム |
US20140096169A1 (en) * | 2012-09-28 | 2014-04-03 | Joseph Dodson | Playback synchronization in a group viewing a media title |
US9769512B2 (en) | 2012-11-08 | 2017-09-19 | Time Warner Cable Enterprises Llc | System and method for delivering media based on viewer behavior |
DE102013224590B4 (de) | 2012-12-03 | 2019-07-18 | Canon Kabushiki Kaisha | Anzeigevorrichtung und deren steuerverfahren |
KR102062310B1 (ko) * | 2013-01-04 | 2020-02-11 | 삼성전자주식회사 | 전자 장치에서 헤드 트래킹 기술을 이용하여 제어 서비스를 제공하기 위한 장치 및 방법 |
KR20140093513A (ko) * | 2013-01-18 | 2014-07-28 | 삼성전자주식회사 | 휴대 단말기의 디스플레이 제어 장치 및 방법 |
CN104375628B (zh) * | 2013-08-16 | 2018-08-07 | 联想(北京)有限公司 | 一种信息处理方法及电子设备 |
CN103780970B (zh) | 2013-08-20 | 2018-03-16 | 华为终端(东莞)有限公司 | 一种媒体播放的方法、装置和系统 |
CN103686286A (zh) * | 2013-12-19 | 2014-03-26 | 小米科技有限责任公司 | 一种进行开关控制的方法和装置 |
US10667007B2 (en) * | 2014-01-22 | 2020-05-26 | Lenovo (Singapore) Pte. Ltd. | Automated video content display control using eye detection |
KR102163850B1 (ko) * | 2014-01-29 | 2020-10-12 | 삼성전자 주식회사 | 디스플레이장치 및 그 제어방법 |
WO2015117648A1 (en) * | 2014-02-05 | 2015-08-13 | Fujitsu Technology Solutions Intellectual Property Gmbh | Display device, computer system and method for managing the operating states of a computer system |
CN105094261B (zh) * | 2014-05-12 | 2018-06-29 | 联想(北京)有限公司 | 电源管理方法、系统及电子设备 |
KR101598771B1 (ko) * | 2014-06-11 | 2016-03-02 | 주식회사 슈프리마에이치큐 | 얼굴 인식 생체 인증 방법 및 장치 |
JP6586956B2 (ja) * | 2014-09-03 | 2019-10-09 | ソニー株式会社 | 検出機能付き投射型表示装置 |
CN104394461A (zh) * | 2014-11-12 | 2015-03-04 | 无锡科思电子科技有限公司 | 一种电视自适应关机控制方法 |
KR102585842B1 (ko) * | 2015-09-15 | 2023-10-11 | 인터디지털 매디슨 페턴트 홀딩스 에스에이에스 | 절전 미디어 콘텐츠를 제공하기 위한 방법 및 장치 |
KR102485453B1 (ko) * | 2015-11-24 | 2023-01-06 | 엘지디스플레이 주식회사 | 표시장치와 이의 구동방법 |
CN106817580B (zh) * | 2015-11-30 | 2019-05-21 | 深圳超多维科技有限公司 | 一种设备控制方法、装置及系统 |
CN106937135A (zh) * | 2015-12-31 | 2017-07-07 | 幸福在线(北京)网络技术有限公司 | 一种游戏场景的实时播放方法及相关装置和系统 |
NZ745738A (en) * | 2016-03-04 | 2020-01-31 | Magic Leap Inc | Current drain reduction in ar/vr display systems |
CN107221303A (zh) * | 2016-03-22 | 2017-09-29 | 中兴通讯股份有限公司 | 一种调节屏幕亮度的方法、装置及智能终端 |
CN105827872A (zh) * | 2016-06-07 | 2016-08-03 | 维沃移动通信有限公司 | 一种移动终端的控制方法及移动终端 |
WO2018076172A1 (zh) * | 2016-10-25 | 2018-05-03 | 华为技术有限公司 | 一种图像显示方法及终端 |
KR102674490B1 (ko) * | 2016-11-04 | 2024-06-13 | 삼성전자주식회사 | 디스플레이 장치 및 그 제어 방법 |
US10565671B2 (en) * | 2017-04-24 | 2020-02-18 | Intel Corporation | Reduce power by frame skipping |
CN107632708B (zh) * | 2017-09-22 | 2021-08-17 | 京东方科技集团股份有限公司 | 一种屏幕可视角度的控制方法、控制装置及柔性显示装置 |
US10817594B2 (en) | 2017-09-28 | 2020-10-27 | Apple Inc. | Wearable electronic device having a light field camera usable to perform bioauthentication from a dorsal side of a forearm near a wrist |
CN108882032A (zh) * | 2018-06-08 | 2018-11-23 | 百度在线网络技术(北京)有限公司 | 用于输出信息的方法和装置 |
US10923045B1 (en) | 2019-11-26 | 2021-02-16 | Himax Technologies Limited | Backlight control device and method |
TWI717928B (zh) * | 2019-12-09 | 2021-02-01 | 奇景光電股份有限公司 | 背光控制裝置 |
JP2021107886A (ja) * | 2019-12-27 | 2021-07-29 | 富士フイルムビジネスイノベーション株式会社 | 制御装置およびプログラム |
CN113395389B (zh) * | 2020-03-13 | 2022-12-02 | 北京小米移动软件有限公司 | 一种防止屏幕误触的方法、装置及存储介质 |
US11244654B2 (en) * | 2020-06-19 | 2022-02-08 | Intel Corporation | Display control apparatus and method for a display based on information indicating presence or engagement of the user of the display |
CN113408347B (zh) * | 2021-05-14 | 2022-03-15 | 桂林电子科技大学 | 监控摄像头远距离建筑物变化检测的方法 |
JP7488383B1 (ja) | 2023-02-15 | 2024-05-21 | レノボ・シンガポール・プライベート・リミテッド | 情報処理装置、及び制御方法 |
JP7540030B1 (ja) * | 2023-03-03 | 2024-08-26 | レノボ・シンガポール・プライベート・リミテッド | 情報処理装置、及び制御方法 |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0784062A (ja) * | 1993-09-14 | 1995-03-31 | Hitachi Commun Syst Inc | テレビジョン受像機 |
JP2004171490A (ja) * | 2002-11-22 | 2004-06-17 | Sony Corp | 画像検出装置及び画像検出方法 |
JP2005044330A (ja) | 2003-07-24 | 2005-02-17 | Univ Of California San Diego | 弱仮説生成装置及び方法、学習装置及び方法、検出装置及び方法、表情学習装置及び方法、表情認識装置及び方法、並びにロボット装置 |
JP2005267611A (ja) * | 2004-01-23 | 2005-09-29 | Sony United Kingdom Ltd | 表示装置 |
JP2005266061A (ja) * | 2004-03-17 | 2005-09-29 | Olympus Corp | 再生装置 |
JP2007065766A (ja) | 2005-08-29 | 2007-03-15 | Sony Corp | 画像処理装置および方法、並びにプログラム |
JP2007328675A (ja) * | 2006-06-09 | 2007-12-20 | Sony Corp | 情報処理装置、および情報処理方法、並びにコンピュータ・プログラム |
JP2008099246A (ja) * | 2006-09-13 | 2008-04-24 | Ricoh Co Ltd | 撮像装置および被写体検出方法 |
JP2008111886A (ja) * | 2006-10-27 | 2008-05-15 | Digital Electronics Corp | 自動ドア、画面表示装置、画面表示制御プログラムおよびそのプログラムを記録したコンピュータ読み取り可能な記録媒体 |
Family Cites Families (33)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH07325639A (ja) * | 1994-05-31 | 1995-12-12 | Sharp Corp | カメラ内蔵コンピュータシステム |
JPH0944772A (ja) * | 1995-05-22 | 1997-02-14 | Mk Seiko Co Ltd | テレビ画面接近防止装置 |
JPH0916296A (ja) * | 1995-07-04 | 1997-01-17 | Fujitsu Ltd | 情報処理機器 |
JPH09120323A (ja) * | 1995-10-25 | 1997-05-06 | Canon Inc | 電源制御方法及び装置及び電子機器 |
JPH09190245A (ja) * | 1996-01-12 | 1997-07-22 | Canon Inc | 情報処理装置及び該装置における省電力制御方法 |
JPH11288259A (ja) | 1998-02-06 | 1999-10-19 | Sanyo Electric Co Ltd | 省電力制御方法及びその装置 |
JPH11242733A (ja) * | 1998-02-24 | 1999-09-07 | Sanyo Electric Co Ltd | 省電力制御装置 |
EP1580684B1 (en) * | 1998-04-13 | 2008-12-31 | Google Inc. | Face recognition from video images |
JP2000221953A (ja) | 1999-01-29 | 2000-08-11 | Sony Corp | 映像表示装置、映像処理方法及びこれらを応用した映像表示システム |
JP3603674B2 (ja) | 1999-06-22 | 2004-12-22 | 日本電気株式会社 | コンピュータシステムの電源制御システム |
US6754389B1 (en) * | 1999-12-01 | 2004-06-22 | Koninklijke Philips Electronics N.V. | Program classification using object tracking |
US6665805B1 (en) * | 1999-12-27 | 2003-12-16 | Intel Corporation | Method and apparatus for real time monitoring of user presence to prolong a portable computer battery operation time |
EP1426919A1 (en) * | 2002-12-02 | 2004-06-09 | Sony International (Europe) GmbH | Method for operating a display device |
JP2004213486A (ja) | 2003-01-07 | 2004-07-29 | Sony Corp | 画像処理装置および方法、記録媒体、並びにプログラム |
JP2005115521A (ja) | 2003-10-06 | 2005-04-28 | Toshiba Corp | 機器制御システムおよび機器制御方法 |
JP2007520110A (ja) * | 2003-12-22 | 2007-07-19 | コーニンクレッカ フィリップス エレクトロニクス エヌ ヴィ | 視聴者のムードを監視するコンテンツ処理システム、方法及びコンピュータプログラムプロダクト |
EP1566788A3 (en) * | 2004-01-23 | 2017-11-22 | Sony United Kingdom Limited | Display |
JP2005221907A (ja) * | 2004-02-09 | 2005-08-18 | Sanyo Electric Co Ltd | 表示装置 |
JP4371024B2 (ja) | 2004-09-28 | 2009-11-25 | ソニー株式会社 | 記録再生装置、記録再生方法および記録再生システム |
US7315631B1 (en) * | 2006-08-11 | 2008-01-01 | Fotonation Vision Limited | Real-time face tracking in a digital image acquisition device |
JP3862027B2 (ja) * | 2005-01-25 | 2006-12-27 | 船井電機株式会社 | 放送信号受信システム |
JP2006236244A (ja) * | 2005-02-28 | 2006-09-07 | Toshiba Corp | 顔認証装置および入退場管理装置 |
JP4114676B2 (ja) * | 2005-05-16 | 2008-07-09 | 船井電機株式会社 | 映像再生装置 |
JP2007036702A (ja) * | 2005-07-27 | 2007-02-08 | Casio Comput Co Ltd | 携帯情報端末及び表示制御方法 |
JP4350725B2 (ja) * | 2005-08-05 | 2009-10-21 | キヤノン株式会社 | 画像処理方法、画像処理装置、及び、画像処理方法をコンピュータに実行させるためのプログラム |
US8218080B2 (en) * | 2005-12-05 | 2012-07-10 | Samsung Electronics Co., Ltd. | Personal settings, parental control, and energy saving control of television with digital video camera |
RU2309454C1 (ru) * | 2006-03-23 | 2007-10-27 | Государственное образовательное учреждение высшего профессионального образования "Пермский государственный университет" | Устройство управления компьютером |
JP5303824B2 (ja) * | 2006-06-16 | 2013-10-02 | セイコーエプソン株式会社 | プロジェクタ及びその制御方法 |
CN200953604Y (zh) * | 2006-09-05 | 2007-09-26 | 康佳集团股份有限公司 | 节能电视机 |
US8005768B2 (en) * | 2006-11-28 | 2011-08-23 | Samsung Electronics Co., Ltd. | Multimedia file reproducing apparatus and method |
US7903166B2 (en) * | 2007-02-21 | 2011-03-08 | Sharp Laboratories Of America, Inc. | Methods and systems for display viewer motion compensation based on user image data |
JP2008148297A (ja) * | 2007-11-22 | 2008-06-26 | Sharp Corp | 電子機器 |
JP5160293B2 (ja) * | 2008-04-23 | 2013-03-13 | ソニーモバイルコミュニケーションズ, エービー | 携帯端末、その表示制御方法および表示制御プログラム |
-
2009
- 2009-08-21 US US12/737,697 patent/US9104408B2/en not_active Expired - Fee Related
- 2009-08-21 WO PCT/JP2009/064625 patent/WO2010021373A1/ja active Application Filing
- 2009-08-21 CN CN201310278235.2A patent/CN103379300B/zh active Active
- 2009-08-21 CN CN2009801305540A patent/CN102119530B/zh active Active
- 2009-08-21 RU RU2011104474/07A patent/RU2493613C2/ru active
- 2009-08-21 JP JP2010525710A patent/JP5553022B2/ja not_active Expired - Fee Related
- 2009-08-21 BR BRPI0917133-9A patent/BRPI0917133B1/pt active IP Right Grant
- 2009-08-21 EP EP09808309.0A patent/EP2315439B1/en active Active
-
2014
- 2014-05-22 JP JP2014106164A patent/JP5976035B2/ja active Active
Patent Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0784062A (ja) * | 1993-09-14 | 1995-03-31 | Hitachi Commun Syst Inc | Television receiver |
JP2004171490A (ja) * | 2002-11-22 | 2004-06-17 | Sony Corp | Image detection device and image detection method |
JP2005044330A (ja) | 2003-07-24 | 2005-02-17 | Univ Of California San Diego | Weak hypothesis generation device and method, learning device and method, detection device and method, facial expression learning device and method, facial expression recognition device and method, and robot device |
JP2005267611A (ja) * | 2004-01-23 | 2005-09-29 | Sony United Kingdom Ltd | Display device |
JP2005266061A (ja) * | 2004-03-17 | 2005-09-29 | Olympus Corp | Playback device |
JP2007065766A (ja) | 2005-08-29 | 2007-03-15 | Sony Corp | Image processing device and method, and program |
JP2007328675A (ja) * | 2006-06-09 | 2007-12-20 | Sony Corp | Information processing device, information processing method, and computer program |
JP2008099246A (ja) * | 2006-09-13 | 2008-04-24 | Ricoh Co Ltd | Imaging device and subject detection method |
JP2008111886A (ja) * | 2006-10-27 | 2008-05-15 | Digital Electronics Corp | Automatic door, screen display device, screen display control program, and computer-readable recording medium storing the program |
Non-Patent Citations (1)
Title |
---|
T.F. COOTES; G.J. EDWARDS; C.J. TAYLOR: "Proc. Fifth European Conf. Computer Vision", vol. 2, 1998, article "Active Appearance Models", pages: 484 - 498
Cited By (28)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2011030624A1 (ja) * | 2009-09-11 | 2011-03-17 | Sony Corporation | Display device and control method |
US9298258B2 (en) | 2009-09-11 | 2016-03-29 | Sony Corporation | Display apparatus and control method |
US8913007B2 (en) | 2009-09-11 | 2014-12-16 | Sony Corporation | Display apparatus and control method |
JP2011259384A (ja) * | 2010-06-11 | 2011-12-22 | Sharp Corp | Imaging device, display device, program, and recording medium |
JP2012089909A (ja) * | 2010-10-15 | 2012-05-10 | Sony Corp | Information processing device, synchronization method, and program |
WO2012066705A1 (ja) * | 2010-11-17 | 2012-05-24 | Panasonic Corporation | Display device, display control method, mobile phone, and semiconductor device |
JP2012109810A (ja) * | 2010-11-17 | 2012-06-07 | Panasonic Corp | Display device, display control method, mobile phone, and semiconductor device |
CN103210657A (zh) * | 2010-11-17 | 2013-07-17 | Panasonic Corporation | Display device, display control method, cellular phone, and semiconductor device |
JP2015039066A (ja) * | 2010-12-24 | 2015-02-26 | Toshiba Corp | Stereoscopic video display system, stereoscopic video display device, and output control method |
JP2012181054A (ja) * | 2011-02-28 | 2012-09-20 | Toshiba Corp | Information output device and information output method |
CN102655576A (zh) * | 2011-03-04 | 2012-09-05 | Sony Corporation | Information processing device, information processing method, and program |
US20120224043A1 (en) * | 2011-03-04 | 2012-09-06 | Sony Corporation | Information processing apparatus, information processing method, and program |
JP2014519091A (ja) * | 2011-05-12 | 2014-08-07 | Apple Inc. | Presence sensor |
JP2012249867A (ja) * | 2011-06-03 | 2012-12-20 | Nintendo Co Ltd | Information processing program, information processing device, information processing system, and information processing method |
JP2013070296A (ja) * | 2011-09-26 | 2013-04-18 | Hitachi Consumer Electronics Co Ltd | Stereoscopic video processing device, stereoscopic display device, stereoscopic video processing method, and receiving device |
WO2013108438A1 (ja) * | 2012-01-16 | 2013-07-25 | Sharp Corporation | Display device, display method, and display program |
JP2013182217A (ja) * | 2012-03-02 | 2013-09-12 | Toshiba Corp | Electronic apparatus, control method of electronic apparatus, control program, and recording medium |
WO2013175735A1 (ja) * | 2012-05-22 | 2013-11-28 | Panasonic Corporation | Display control device and display control method |
US9307187B2 (en) | 2012-05-22 | 2016-04-05 | Panasonic Intellectual Property Management Co., Ltd. | Display control device and display control method |
JPWO2013175735A1 (ja) * | 2012-05-22 | 2016-01-12 | Panasonic Intellectual Property Management Co., Ltd. | Display control device and display control method |
WO2014199666A1 (ja) * | 2013-06-13 | 2014-12-18 | Sharp Corporation | Display device |
US9875695B2 (en) | 2013-06-13 | 2018-01-23 | Sharp Kabushiki Kaisha | Display device |
JP2015036925A (ja) * | 2013-08-15 | 2015-02-23 | Fuji Xerox Co., Ltd. | Information processing device and information processing program |
CN109191730B (zh) * | 2013-08-15 | 2021-02-12 | Fuji Xerox Co., Ltd. | Information processing device |
JP6055535B1 (ja) * | 2015-12-04 | 2016-12-27 | Gaia System Solutions Inc. | Concentration level processing system |
JP2017103702A (ja) * | 2015-12-04 | 2017-06-08 | Gaia System Solutions Inc. | Concentration level processing system |
JP2019168687A (ja) * | 2018-03-20 | 2019-10-03 | Johnson & Johnson Vision Care, Inc. | Device having a system for reducing the effect of near-distance viewing on myopia onset and/or myopia progression |
US11418565B2 (en) | 2018-04-13 | 2022-08-16 | Sony Corporation | Space information sharing apparatus, space information sharing method, and program |
Also Published As
Publication number | Publication date |
---|---|
JPWO2010021373A1 (ja) | 2012-01-26 |
US9104408B2 (en) | 2015-08-11 |
CN103379300B (zh) | 2017-03-01 |
EP2315439A4 (en) | 2017-04-12 |
CN103379300A (zh) | 2013-10-30 |
JP2014209739A (ja) | 2014-11-06 |
CN102119530A (zh) | 2011-07-06 |
CN102119530B (zh) | 2013-08-21 |
JP5976035B2 (ja) | 2016-08-23 |
BRPI0917133A2 (pt) | 2015-11-10 |
BRPI0917133B1 (pt) | 2021-03-02 |
US20110135114A1 (en) | 2011-06-09 |
RU2493613C2 (ru) | 2013-09-20 |
JP5553022B2 (ja) | 2014-07-16 |
RU2011104474A (ru) | 2012-08-20 |
EP2315439A1 (en) | 2011-04-27 |
EP2315439B1 (en) | 2018-10-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5976035B2 (ja) | Image display device and control method | |
US10290281B2 (en) | Display device and control method | |
CN106878787B (zh) | Television theater mode implementation method and device |
JP4697279B2 (ja) | Image display device and detection method |
JP4281819B2 (ja) | Captured image data processing device, viewing information generation device, viewing information generation system, captured image data processing method, and viewing information generation method |
JP5299866B2 (ja) | Video display device |
US20070126884A1 (en) | Personal settings, parental control, and energy saving control of television with digital video camera | |
JP5528318B2 (ja) | Display device |
WO2011125905A1 (ja) | Automatic operation mode setting device for television receiver, television receiver provided with the automatic setting device, and automatic operation mode setting method |
WO2019085980A1 (zh) | Method and device for automatically adjusting video subtitles, terminal, and readable storage medium |
US20060188234A1 (en) | Broadcasting signal receiving system | |
JP2011239029A (ja) | Video display device, power saving control device, and power saving method |
CN102970610B (zh) | Intelligent display method and electronic device |
JP2010074323A (ja) | Recording device and method, and recording/playback device and method |
EP2575358A1 (en) | Display apparatus and control method thereof | |
US20150271465A1 (en) | Audio/video system with user analysis and methods for use therewith | |
US20140009588A1 (en) | Video display apparatus and video display method | |
US20120060614A1 (en) | Image sensing device | |
TWI549051B (zh) | Method for playing programs on a display device |
JP2018207445A (ja) | Transmission device, transmission method, and program |
JP2015039078A (ja) | Video display device, 3D glasses, and video display method |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 200980130554.0 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 09808309 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2011104474 Country of ref document: RU Ref document number: 878/CHENP/2011 Country of ref document: IN Ref document number: 12737697 Country of ref document: US Ref document number: 2009808309 Country of ref document: EP |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010525710 Country of ref document: JP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
ENP | Entry into the national phase |
Ref document number: PI0917133 Country of ref document: BR Kind code of ref document: A2 Effective date: 20110208 |