WO2011033855A1 - Display device and control method - Google Patents
Display device and control method
- Publication number
- WO2011033855A1 (PCT/JP2010/062311)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- unit
- image
- user
- optimization processing
- calculated
- Prior art date
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/011—Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F18/00—Pattern recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/178—Human faces, e.g. facial parts, sketches or expressions estimating age from face image; using age information for improving recognition
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G3/00—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes
- G09G3/20—Control arrangements or circuits, of interest only in connection with visual indicators other than cathode-ray tubes for presentation of an assembly of a number of characters, e.g. a page, by composing the assembly by combination of individual elements arranged in a matrix no fixed position being assigned to or needed to be assigned to the individual characters or partial characters
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/41—Structure of client; Structure of client peripherals
- H04N21/422—Input-only peripherals, i.e. input devices connected to specially adapted client devices, e.g. global positioning system [GPS]
- H04N21/4223—Cameras
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/431—Generation of visual interfaces for content selection or interaction; Content or additional data rendering
- H04N21/4312—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations
- H04N21/4314—Generation of visual interfaces for content selection or interaction; Content or additional data rendering involving specific graphical features, e.g. screen layout, special fonts or colors, blinking icons, highlights or animations for fitting data in a restricted space on the screen, e.g. EPG data in a rectangular grid
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/44—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs
- H04N21/4402—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display
- H04N21/440263—Processing of video elementary streams, e.g. splicing a video clip retrieved from local storage with an incoming video stream or rendering scenes according to encoded video stream scene graphs involving reformatting operations of video signals for household redistribution, storage or real-time display by altering the spatial resolution, e.g. for displaying on a connected PDA
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N21/00—Selective content distribution, e.g. interactive television or video on demand [VOD]
- H04N21/40—Client devices specifically adapted for the reception of or interaction with content, e.g. set-top-box [STB]; Operations thereof
- H04N21/43—Processing of content or additional data, e.g. demultiplexing additional data from a digital video stream; Elementary client operations, e.g. monitoring of home network or synchronising decoder's clock; Client middleware
- H04N21/442—Monitoring of processes or resources, e.g. detecting the failure of a recording device, monitoring the downstream bandwidth, the number of times a movie has been viewed, the storage space available from the internal hard disk
- H04N21/44213—Monitoring of end-user related data
- H04N21/44218—Detecting physical presence or behaviour of the user, e.g. using sensors to detect if the user is leaving the room or changes his face expression during a TV program
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/44—Receiver circuitry for the reception of television signals according to analogue transmission standards
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N5/00—Details of television systems
- H04N5/64—Constructional details of receivers, e.g. cabinets or dust covers
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N9/00—Details of colour television systems
- H04N9/64—Circuits for processing colour signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04S—STEREOPHONIC SYSTEMS
- H04S7/00—Indicating arrangements; Control arrangements, e.g. balance control
- H04S7/30—Control circuits for electronic adaptation of the sound field
- H04S7/301—Automatic calibration of stereophonic sound system, e.g. with test microphone
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2320/00—Control of display operating conditions
- G09G2320/02—Improving the quality of display appearance
- G09G2320/028—Improving the quality of display appearance by changing the viewing angle properties, e.g. widening the viewing angle, adapting the viewing angle to the view direction
-
- G—PHYSICS
- G09—EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
- G09G—ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
- G09G2354/00—Aspects of interface with display user
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N23/00—Cameras or camera modules comprising electronic image sensors; Control thereof
- H04N23/80—Camera processing pipelines; Components thereof
- H04N23/84—Camera processing pipelines; Components thereof for processing colour signals
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04S—STEREOPHONIC SYSTEMS
- H04S7/00—Indicating arrangements; Control arrangements, e.g. balance control
- H04S7/30—Control circuits for electronic adaptation of the sound field
- H04S7/302—Electronic adaptation of stereophonic sound system to listener position or orientation
- H04S7/303—Tracking of listener position or orientation
Definitions
- the present invention relates to a display device and a control method.
- the state of the image display device, that is, audio characteristics such as the volume balance of the sound output from the audio output unit, the image characteristics and display contents of the image display unit of the image display device, and the display direction of the image display device, may not be optimal for a user at an arbitrary position.
- the present invention has been made in view of the above problems, and an object of the present invention is to provide a novel and improved display device and control method capable of optimizing the state of an image display device for a user at an arbitrary position.
- an imaging unit configured to capture a moving image of a predetermined range with respect to the image display direction, and an image analysis unit configured to analyze the moving image captured by the imaging unit to calculate the position of the user,
- a system optimization processing unit that calculates system control information for optimizing the system based on the position of the user calculated by the image analysis unit, and a system control unit that optimizes the system based on the calculated system control information.
- the system optimization processing unit may calculate, based on the position of the user calculated by the image analysis unit, system control information for optimizing the volume balance of the sound output from the sound output unit.
- the system optimization processing unit may calculate system control information for optimizing the image characteristic of the image display unit based on the position of the user calculated by the image analysis unit.
- the system optimization processing unit may calculate system control information for optimizing the display content of the image display unit based on the position of the user calculated by the image analysis unit.
- the system optimization processing unit may calculate system control information for optimizing the device direction of the own device based on the position of the user calculated by the image analysis unit.
- the image analysis unit may analyze a moving image captured by the imaging unit to calculate a three-dimensional position of the user.
- the image analysis unit may analyze a moving image captured by the imaging unit to calculate the positions of a plurality of users, and the system optimization processing unit may calculate the barycentric (center-of-gravity) position of the plurality of users from the positions calculated by the image analysis unit, and then calculate system control information for optimizing the system based on that barycentric position.
- an imaging step of capturing a moving image of a predetermined range with respect to the image display direction, and an image analysis step of analyzing the captured moving image to calculate the position of the user,
- a system optimization processing step of calculating system control information for optimizing the system based on the calculated position of the user,
- and a system control step of optimizing the system based on the calculated system control information.
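As a concrete illustration of the multi-user claim above, the barycentric (center-of-gravity) position of several calculated user positions reduces to a simple average. The sketch below is a minimal Python rendering under that reading; the function name and the use of 3D tuples are assumptions, not taken from the patent.

```python
def barycenter(user_positions):
    """Center-of-gravity of several users' calculated 3D positions.

    user_positions: list of (x, y, z) tuples, one per detected user,
    expressed with respect to the device center of the display.
    """
    n = len(user_positions)
    return tuple(sum(p[i] for p in user_positions) / n for i in range(3))
```

For two users at (1.0, 0.0, 2.0) and (-1.0, 0.0, 4.0), the barycenter is (0.0, 0.0, 3.0); the system optimization processing would then target that single point rather than either user individually.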
- FIG. 3 is an explanatory view illustrating the configuration of the control unit 110.
- FIG. 4 is an explanatory view illustrating the configuration of the control unit 110.
- (A) is an explanatory diagram for describing a case where user 1 and user 2 exist in the imaging range of the imaging unit 104, and
- (B) is an explanatory diagram for describing the face detection position [a1, b1] and face size [w1, h1] of user 1 and the face detection position [a2, b2] and face size [w2, h2] of user 2 included in the image captured by the imaging unit 104.
- (A) is an explanatory diagram for describing a case where users exist at the reference distance d0 and at the distance d1 within the imaging range of the imaging unit 104,
- (B) is an explanatory diagram for describing the face size [w1, h1] of the user at the distance d1 in the image captured by the imaging unit 104, and
- (C) is an explanatory diagram for describing the reference face size [w0, h0] of the user at the reference distance d0 in the image captured by the imaging unit 104.
- (A) is an explanatory diagram for describing the device center [0, 0, 0] of the image display device 100 and the camera position [Δx, Δy, Δz] of the imaging unit 104, and
- (B) is an explanatory diagram for describing the device center [0, 0, 0] of the image display device 100, the front direction axis [0, 0], the camera position [Δx, Δy, Δz] of the imaging unit 104, and the installation angle [Δφ, Δθ].
- It is a flowchart illustrating an example of the optimization processing according to the position of the user by the image display apparatus 100 according to an embodiment of the present invention.
- (A) is an explanatory diagram for describing the position [D0, Ah0] of user A and the position [D1, Ah1] of user B in the horizontal direction, and
- (B) is an explanatory diagram for describing the position [D0, Av0] of user A and the position [D1, Av1] of user B in the vertical direction.
- (A) and (B) are explanatory views for explaining the optimization processing of the character size of the GUI.
- (A) to (C) are explanatory diagrams for explaining a method of correcting the reference face size [w0, h0] at the reference distance d0 in the calculation of the user distance.
- 1. Embodiment of the present invention [1-1. Configuration of the image display device] [1-2. Configuration of the control unit] [1-3. Optimization processing according to the position of the user] [1-4. Optimization processing according to the positions of one or more users] [1-5. Optimization processing according to the ages of a plurality of users]
- FIG. 1 is an explanatory view for explaining the appearance of an image display apparatus 100 according to an embodiment of the present invention.
- FIG. 1 is a front view of the image display apparatus 100 viewed from the front.
- the appearance of the image display apparatus 100 according to the embodiment of the present invention will be described with reference to FIG.
- the image display apparatus 100 includes, at the upper center and the left and right centers of the display panel unit 102 that displays still images or moving images, imaging units 104 that capture moving images.
- the imaging unit 104 captures a moving image in a direction in which the image display apparatus 100 displays a still image or a moving image on the display panel unit 102.
- the image display apparatus 100 analyzes the image captured by the imaging unit 104, and detects the face of the user appearing in the image.
- the image display apparatus 100 analyzes the detected face image of the user to detect a face detection position and a face size.
- the image display apparatus 100 calculates the relative position of the user with respect to the optical axis of the camera of the imaging unit 104 based on the detection results of the user's face detection position and face size. Then, based on this calculation result and attachment information such as the position and angle of the camera of the imaging unit 104, the image display apparatus 100 calculates the user's position with respect to the device center and the front direction axis of the image display apparatus 100. The image display apparatus 100 according to the present embodiment is characterized in that, according to the position of the user, it optimizes the state of the image display apparatus 100 for that user, that is, audio characteristics such as volume balance, image characteristics, display content, device direction, and the like.
- the image display apparatus 100 includes the sensor unit 106 at the lower center of the display panel unit 102.
- the sensor unit 106 detects the presence or absence of a person in front of the image display device 100.
- in the present embodiment, the image display apparatus 100 is provided with the imaging units 104 for capturing moving images at three locations around the display panel unit 102; however, in the present invention, the placement of the imaging units 104 for capturing moving images is not limited to this example.
- a device different from the image display device 100 may be provided, the device may be connected to the image display device 100, and a moving image may be taken by the device.
- the number of imaging units 104 is not limited to three, and two or less or four or more imaging units 104 may be provided for imaging.
- the number of sensor units 106 is not limited to one, and two or more sensor units may be provided.
- the image display apparatus 100 may further include a signal receiving unit capable of receiving a control signal by infrared light or wireless communication from a remote controller (not shown).
- FIG. 2 is an explanatory view for explaining the configuration of the image display apparatus 100 according to the embodiment of the present invention.
- the configuration of the image display apparatus 100 according to the embodiment of the present invention will be described with reference to FIG.
- the image display apparatus 100 includes a display panel unit 102, an imaging unit 104, a sensor unit 106, a speaker unit 108, a mechanism unit 109, and a control unit. And 110.
- the control unit 110 controls the image input unit 112, the image processing unit 114, the viewing state analysis unit 116, the user position information storage unit 117, the viewing state recording unit 118, the system optimization processing unit 120, and system control. And a unit 122.
- the display panel unit 102 displays a still image or a moving image based on a panel drive signal.
- the display panel unit 102 displays a still image or a moving image using, for example, liquid crystal.
- the display panel section 102 is not limited to such an example.
- the display panel unit 102 may display a still image or a moving image by a self-luminous display device such as organic EL (electroluminescence).
- the imaging units 104 are provided at the upper center and the left and right centers of the display panel unit 102 that displays still images or moving images; while a panel drive signal is supplied to the display panel unit 102 and the display panel unit 102 is displaying a moving image, the imaging units 104 capture moving images in the direction in which the display panel unit 102 displays the image.
- the imaging unit 104 may capture a moving image with a charge coupled device (CCD) image sensor, or may capture a moving image with a complementary metal oxide semiconductor (CMOS) image sensor. The moving image captured by the imaging unit 104 is sent to the control unit 110.
- the sensor unit 106 is provided at the lower center of the display panel unit 102 that displays still images or moving images, and detects, for example, the presence or absence of a person in front of the image display device 100. Further, when a person is present in front of the image display device 100, the sensor unit 106 can detect the distance between the image display device 100 and that person. The detection result and distance information from the sensor unit 106 are sent to the control unit 110.
- the speaker unit 108 outputs sound based on the sound output signal.
- the mechanism unit 109 controls, for example, the display direction of the display panel unit 102 of the image display apparatus 100 based on a drive signal.
- the control unit 110 controls the operation of the image display apparatus 100. Each part of the control unit 110 will be described below.
- the image input unit 112 receives a moving image captured by the imaging unit 104.
- the moving image received by the image input unit 112 is sent to the image processing unit 114 and used for image processing in the image processing unit 114.
- the image processing unit 114 is an example of the image analysis unit of the present invention, and executes various image processing on the moving image captured by the imaging unit 104 and sent from the image input unit 112.
- the image processing performed by the image processing unit 114 includes detection of a moving body included in the moving image captured by the imaging unit 104, detection of the number of persons included in the moving image, and detection of faces and facial expressions included in the moving image. The results of the various image processes by the image processing unit 114 are sent to the viewing state analysis unit 116 and used to analyze the presence or absence of a person viewing the image display device 100 and, when a person is viewing, the viewing state and viewing position of that person.
- for the face detection process, for example, the techniques disclosed in Japanese Patent Application Laid-Open No. 2007-65766 and Japanese Patent Application Laid-Open No. 2005-443330 can be used.
- the face detection process will be briefly described below.
- the position of the face, the size of the face, and the direction of the face in the supplied image are detected.
- next, characteristic parts of the face, for example, characteristic parts such as the eyebrows, eyes, nose, and mouth, are detected from the extracted face image and the information on the direction of the face.
- for the detection of the face feature positions, for example, a method called AAM (Active Appearance Models) can be applied.
- the local feature amount is calculated for each of the detected face feature positions.
- the techniques disclosed in Japanese Patent Application Laid-Open No. 2007-65766 and Japanese Patent Application Laid-Open No. 2005-443330 can be used, and thus detailed description will be omitted here.
- from the face image or the face feature positions, it is possible to determine whether the face shown in the supplied image is male or female, and approximately how old the person is.
- by recording face information in advance, it is possible to identify a person by searching the recorded faces for the person appearing in the supplied image.
- the viewing state analysis unit 116 is an example of the image analysis unit according to the present invention. It receives the results of the various image processes by the image processing unit 114, together with the detection result and distance information detected by the sensor unit 106, and uses them to analyze the viewing state and viewing position of the person viewing the image displayed by the image display device 100. Because the viewing state analysis unit 116 analyzes the viewing state and viewing position of the viewer, the image display device 100 can, for example, control the speaker unit 108 according to the viewing position of the person viewing the image display device 100.
- An analysis result by analysis processing in the viewing state analysis unit 116 is sent to the viewing state recording unit 118, the user position information storage unit 117, and the system optimization processing unit 120.
- the viewing state analysis unit 116 can detect a moving body from the detection result and distance information obtained by the sensor unit 106; however, when the distance between the sensor unit 106 and the moving body is greater than a predetermined distance, the moving body may be excluded from detection.
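The exclusion rule just described (drop moving bodies farther than a predetermined distance from the sensor unit) amounts to a one-line filter. A sketch follows, where the 5.0 m threshold and the (object_id, distance) pair format are hypothetical; the patent only says "a predetermined distance".

```python
def filter_detections(detections, max_distance=5.0):
    """Keep only moving bodies within the predetermined distance of the
    sensor unit. The threshold 5.0 is a hypothetical placeholder.

    detections: iterable of (object_id, distance) pairs.
    """
    return [(obj, d) for obj, d in detections if d <= max_distance]
```

A moving body at 7.5 m would thus be excluded from the viewing-state analysis while one at 2.0 m is kept.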
- the viewing state recording unit 118 records the analysis result obtained by the analysis process of the viewing state analysis unit 116.
- the analysis result in the viewing state analysis unit 116 recorded by the viewing state recording unit 118 is used for the system optimization process in the system optimization processing unit 120. Further, the analysis result in the viewing state analysis unit 116 recorded by the viewing state recording unit 118 may be sent to the external information collection server 200.
- the user position information storage unit 117 stores an analysis result by the analysis processing in the viewing state analysis unit 116.
- the system optimization processing unit 120 calculates system control information for executing system optimization processing on each unit of the image display device 100 using an analysis result obtained by analysis processing of the viewing state analysis unit 116.
- the system optimization processing for each unit of the image display apparatus 100 includes control of audio characteristics such as the volume balance of the sound output from the speaker unit 108, control of the image characteristics of the display panel unit 102, control of the display content of the display panel unit 102, and control of the display direction of the display panel unit 102 of the image display apparatus 100 by the mechanism unit 109.
- the image display apparatus 100 can execute the optimization process according to the position of the user based on the system control information calculated by the system optimization processing unit 120.
- System control information calculated by the system optimization processing unit 120 is sent to the system control unit 122.
- the system control unit 122 executes the system optimization processing on each unit of the image display apparatus 100 based on the system control information calculated by the system optimization processing unit 120. For example, based on that system control information, the system control unit 122 controls the volume balance of the sound output from the speaker unit 108, the image characteristics of the display panel unit 102, the display content of the display panel unit 102, the display direction of the display panel unit 102 of the image display apparatus 100 via the mechanism unit 109, and the like.
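One of the controls named in the text is the volume balance of the sound output from the speaker unit 108 according to the user's position. The patent does not specify a pan law, so the linear law below is purely illustrative: a user offset to one side is given a larger share of the opposite channel so that the stereo image stays centered on them.

```python
def volume_balance(user_x, half_width):
    """Left/right volume shares for a user at horizontal offset user_x
    from the screen center; half_width is half the listening span.
    A simple linear pan law is assumed (illustrative only)."""
    p = max(-1.0, min(1.0, user_x / half_width))  # clamp to [-1, 1]
    # A user to the right (p > 0) is nearer the right speaker, so the
    # left channel gets the larger share.
    left = 0.5 * (1.0 + p)
    right = 0.5 * (1.0 - p)
    return left, right
```

A centered user gets (0.5, 0.5); a user at the far right edge gets (1.0, 0.0).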
- FIG. 3 is an explanatory diagram for explaining the configuration of the control unit 110 included in the image display apparatus 100 according to the embodiment of the present invention.
- FIG. 3 illustrates, of the control unit 110, the configuration of the viewing state analysis unit 116.
- the configuration of the viewing state analysis unit 116 will be described using FIG. 3.
- the viewing state analysis unit 116 includes a user direction / distance calculation unit 132 and a user position information calculation unit 134.
- the user direction / distance calculation unit 132 receives the results of the various image processes by the image processing unit 114 and optical information such as the angle of view and resolution of the camera of the imaging unit 104, and uses them to calculate the relative position (direction [φ1, θ1], distance d1) of the user with respect to the optical axis of the camera of the imaging unit 104.
- FIG. 5A is an explanatory diagram for describing the case where user 1 and user 2 exist in the imaging range of the imaging unit 104, and
- FIG. 5B is an explanatory diagram for describing the face detection position [a1, b1] and face size [w1, h1] of user 1 and the face detection position [a2, b2] and face size [w2, h2] of user 2 included in the image captured by the imaging unit 104.
- FIG. 6A is an explanatory diagram for describing the case where users are present at the reference distance d0 and the distance d1 within the imaging range of the imaging unit 104,
- FIG. 6B is an explanatory diagram for describing the face size [w1, h1] of the user at the distance d1 in the captured image, and
- FIG. 6C is an explanatory diagram for describing the reference face size [w0, h0] of the user at the reference distance d0 in the image captured by the imaging unit 104.
- the direction [φ1, θ1] is calculated from the face detection position [a1, b1], normalized by the captured image size [xmax, ymax], and the angle of view [φ0, θ0] of the camera of the imaging unit 104 as follows:
- Horizontal direction: φ1 = φ0 * a1
- Vertical direction: θ1 = θ0 * b1
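The two direction formulas above, plus the reference-face-size relation implied by the FIG. 6 description (the apparent face width shrinks in inverse proportion to distance, so d1 = d0 * w0 / w1), can be sketched as follows. The inverse-proportional distance model is an inference from the figure description, not an equation stated in this text.

```python
def user_direction(a1, b1, phi0, theta0):
    """Direction [phi1, theta1] of the user relative to the camera
    optical axis, from the face detection position [a1, b1] normalized
    by the captured image size [xmax, ymax] and the camera angle of
    view [phi0, theta0] (angles in degrees)."""
    phi1 = phi0 * a1      # horizontal direction: phi1 = phi0 * a1
    theta1 = theta0 * b1  # vertical direction:   theta1 = theta0 * b1
    return phi1, theta1


def user_distance(w1, w0, d0):
    """Distance d1 of the user, assuming the detected face width w1
    scales inversely with distance relative to the reference face size
    w0 at the reference distance d0 (inferred model)."""
    return d0 * (w0 / w1)
```

A face detected at normalized position (0.5, 0.25) with a 60x40 degree angle of view lies at direction (30.0, 10.0); a face half the reference width lies at twice the reference distance.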
- the user position information calculation unit 134 receives the calculation result of the relative position of the user with respect to the optical axis of the camera of the imaging unit 104 from the user direction / distance calculation unit 132, together with attachment information such as the position and angle of the camera of the imaging unit 104, and
- uses the calculation result of the relative position of the user by the user direction / distance calculation unit 132 and the attachment information of the imaging unit 104 to calculate the three-dimensional position of the user with respect to the device center and the front direction axis of the image display device 100.
- the user position information calculated by the user position information calculation unit 134 is sent to the user position information storage unit 117.
- FIG. 7A is an explanatory diagram illustrating the device center [0, 0, 0] of the image display device 100 and the camera position [Δx, Δy, Δz] of the imaging unit 104, and FIG. 7B is an explanatory diagram illustrating the device center [0, 0, 0] of the image display device 100, the front direction axis [0, 0], the camera position [Δx, Δy, Δz] of the imaging unit 104, and the installation angle [Δφ, Δθ].
- FIG. 4 is an explanatory view illustrating the configuration of the control unit 110 included in the image display device 100 according to the embodiment of the present invention.
- Among the components of the control unit 110, FIG. 4 illustrates the configurations of the user position information storage unit 117, the system optimization processing unit 120, and the system control unit 122.
- configurations of the user position information storage unit 117, the system optimization processing unit 120, and the system control unit 122 will be described with reference to FIG.
- the system optimization processing unit 120 is configured to include an audio characteristic optimization processing unit 142, an image characteristic optimization processing unit 144, and a device direction optimization processing unit 146.
- the system control unit 122 is configured to include an audio characteristic control unit 152, an image characteristic control unit 154, and a device direction control unit 156.
- the user position information storage unit 117 stores user position information as a calculation result of the position of the user with respect to the device center and the front direction axis of the image display device 100 by the user position information calculation unit 134 of the viewing state analysis unit 116.
- the user position information stored in the user position information storage unit 117 is sent to the system optimization processing unit 120.
- Based on the user position information sent from the user position information storage unit 117, the audio characteristic optimization processing unit 142 of the system optimization processing unit 120 calculates audio characteristic control information for executing audio characteristic optimization processing on the speaker unit 108 of the image display device 100, in order to optimize the audio characteristics of the image display device 100 for the user at an arbitrary position. The audio characteristic control information calculated by the audio characteristic optimization processing unit 142 is sent to the audio characteristic control unit 152 of the system control unit 122.
- The audio characteristic optimization processing includes processing for optimizing the left-right volume balance of the sound output from the speaker unit 108, and optimization processing using the surround characteristics of the sound output from the speaker unit 108. Since the left-right volume balance of the sound output from the speaker unit 108 varies with the position of the user, the optimization processing optimizes the respective left and right gains. For example, as shown in FIG. 11, from the principle that the volume of the sound output from the speaker unit 108 attenuates in inverse proportion to the square of the distance, the gain difference (gain_dif) from the reference point can be calculated as shown below, and the left-right volume balance can be optimized accordingly.
- gain_Lch = 20 * log(d_Lch / d_org_LRch)
- gain_Rch = 20 * log(d_Rch / d_org_LRch)
- gain_dif = gain_Rch - gain_Lch = 20 * log(d_Rch) - 20 * log(d_Lch) = 20 * log(d_Rch / d_Lch)
- where gain_Lch is the gain difference of the Lch, gain_Rch is the gain difference of the Rch, d_org_LRch is the distance from the left and right speaker units to the reference point, d_Lch is the distance from the Lch speaker unit to the user, d_Rch is the distance from the Rch speaker unit to the user, and gain_dif is the voltage gain difference between L and R.
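A minimal sketch of this gain calculation, assuming the logarithm is log10 as in the usual decibel form (the source writes only log):

```python
import math

def lr_gain_correction(d_lch, d_rch, d_org_lrch):
    """Left/right gain corrections (in dB) relative to the reference
    point, from the inverse-square distance attenuation of the speaker
    output; gain_dif = gain_Rch - gain_Lch = 20*log10(d_Rch / d_Lch)."""
    gain_lch = 20 * math.log10(d_lch / d_org_lrch)
    gain_rch = 20 * math.log10(d_rch / d_org_lrch)
    return gain_lch, gain_rch, gain_rch - gain_lch

# A user twice as far from the Rch speaker as from the Lch speaker:
_, _, gain_dif = lr_gain_correction(1.0, 2.0, 2.0)
print(round(gain_dif, 3))
# → 6.021
```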
- When there are a plurality of users, the volume balance of the sound output from the left and right speaker units 108 may be optimized with respect to the center of gravity of the user group, or the volume balance of the sound output from the left and right speaker units 108 may be optimized with priority given to a specific user.
- Here, CAM_AUDIENCE denotes the number of users within the imaging range of the imaging unit 104, CAM_HOR_ANGLE[0] to CAM_HOR_ANGLE[3] denote the angles of the users A to D, and CAM_DIS[0] to CAM_DIS[3] denote the distances of the users A to D.
- Based on the user position information sent from the user position information storage unit 117, the image characteristic optimization processing unit 144 of the system optimization processing unit 120 calculates image characteristic control information for executing image characteristic optimization processing on the display panel unit 102 of the image display device 100, in order to optimize the image characteristics of the image display device 100 for the user at an arbitrary position. The image characteristic control information calculated by the image characteristic optimization processing unit 144 is sent to the image characteristic control unit 154 of the system control unit 122.
- The image characteristic optimization processing includes processing such as gamma correction for optimizing the appearance of black, and color gain correction for following color changes.
- For example, the gamma correction is performed as shown below:
- γ = 2.2 + image quality correction - 0.1 × user direction
- Also, for example, the color gain correction is performed as shown below:
- ColorGain = UserColor + image quality correction ± α × user direction
- R(G,B) Gain = R(G,B) × image quality correction ± α × user direction
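A minimal sketch of these two corrections, treating "image quality correction", α, and "user direction" as plain numeric inputs, and taking the "+" branch of "±" for illustration:

```python
def gamma_for_user(quality_correction, user_direction):
    """Gamma following 'γ = 2.2 + image quality correction
    - 0.1 × user direction' (both inputs as plain numbers)."""
    return 2.2 + quality_correction - 0.1 * user_direction

def rgb_gain(rgb, quality_correction, alpha, user_direction):
    """'R(G,B) Gain = R(G,B) × image quality correction ± α × user
    direction', taking the '+' branch for illustration."""
    return rgb * quality_correction + alpha * user_direction

print(round(gamma_for_user(0.1, 1.0), 2))
# → 2.2
print(round(rgb_gain(1.0, 1.05, 0.02, 1.0), 2))
# → 1.07
```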
- When there are a plurality of users, the gamma correction and the color gain correction may be performed with respect to the center of gravity of the user group, or may be performed with priority given to a specific user.
- For example, given the horizontal positions [D0, Ah0] of the user A and [D1, Ah1] of the user B as shown in FIG. 13A, and the corresponding vertical positions [D0, Av0] of the user A and [D1, Av1] of the user B, the average viewing angle correction coefficient and the setting data for the system optimization processing are calculated by the following equations.
- Average viewing angle correction coefficient = {(1 / D0 * (Ah0 + Av0) + 1 / D1 * (Ah1 + Av1) + … + 1 / Dn * (Ahn + Avn)) / n} * correction value
- Setting data = (basic data) * {1 + (correction value at maximum viewing angle) * (average viewing angle correction coefficient)}
- The average viewing angle gives greater weight to users at shorter distances, and the average of the horizontal and vertical angles over the number of users is obtained.
- A correction coefficient is obtained by multiplying the average viewing angle by the correction value, and the overall correction amount is calculated by multiplying the maximum correction value by the correction coefficient.
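A sketch of these two equations, with hypothetical distances, angles, and correction values:

```python
def avg_viewing_angle_coeff(users, correction_value):
    """users: list of (D, Ah, Av) — distance, horizontal angle, vertical
    angle. Implements {(Σ 1/Di * (Ahi + Avi)) / n} * correction value,
    which weights nearby users (small D) more heavily."""
    n = len(users)
    total = sum((1.0 / d) * (ah + av) for d, ah, av in users)
    return (total / n) * correction_value

def setting_data(basic_data, max_angle_correction, coeff):
    """Setting data = (basic data) * {1 + (correction value at maximum
    viewing angle) * (average viewing angle correction coefficient)}."""
    return basic_data * (1 + max_angle_correction * coeff)

# Two hypothetical users: user A at 1 m ([Ah0, Av0] = [10, 5] degrees),
# user B at 2 m ([Ah1, Av1] = [20, 10] degrees).
coeff = avg_viewing_angle_coeff([(1.0, 10.0, 5.0), (2.0, 20.0, 10.0)], 0.01)
print(round(coeff, 4))
# → 0.15
```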
- Since the optimum value varies with the age and gender of the user, the optimization processing may also be performed using attribute information such as age and gender acquired from the image processing unit 114 together with the user position information.
- The luminance correction of the display panel unit 102 is performed as described below.
- BackLight = basic setting value * correction value
- Correction value = 10^(A * log(screen illuminance) + B * log(viewing angle) + C * log(image average level) + D * age) / screen illuminance
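A sketch of this luminance correction, assuming log means log10 and treating the coefficients A to D as given constants:

```python
import math

def backlight_correction(a, b, c, d, screen_illuminance, viewing_angle,
                         image_avg_level, age):
    """Correction value = 10^(A*log(screen illuminance) + B*log(viewing
    angle) + C*log(image average level) + D*age) / screen illuminance,
    reading log as log10 (an assumption; the source writes only log)."""
    exponent = (a * math.log10(screen_illuminance)
                + b * math.log10(viewing_angle)
                + c * math.log10(image_avg_level)
                + d * age)
    return 10 ** exponent / screen_illuminance

def backlight(basic_setting, correction):
    """BackLight = basic setting value * correction value."""
    return basic_setting * correction

# With only the illuminance term active (A=1, B=C=D=0) the correction
# collapses to 1.0, leaving the basic setting unchanged:
print(backlight(200.0, backlight_correction(1.0, 0.0, 0.0, 0.0,
                                            100.0, 30.0, 50.0, 40.0)))
# → 200.0
```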
- Based on the user position information sent from the user position information storage unit 117, the device direction optimization processing unit 146 of the system optimization processing unit 120 calculates device direction control information for executing device direction optimization processing on the mechanism unit 109 of the image display device 100, in order to optimize the device direction of the image display device 100 for the user at an arbitrary position.
- the device direction control information calculated by the device direction optimization processing unit 146 is sent to the device direction control unit 156 of the system control unit 122.
- In the device direction optimization processing, for example, the image display device 100 is rotated so that the front direction axis [0, 0] of the image display device 100 points in the user's direction [φ1, θ1].
- The display panel unit 102 of the image display device 100 can thereby be optimized so as to directly face the user's face.
- The audio characteristic control unit 152 of the system control unit 122 executes the audio characteristic optimization processing based on the audio characteristic control information sent from the audio characteristic optimization processing unit 142. For example, the audio characteristic control unit 152 executes control of the volume balance of the sound output from the speaker unit 108 based on the audio characteristic control information sent from the audio characteristic optimization processing unit 142.
- the image characteristic control unit 154 of the system control unit 122 executes image characteristic optimization processing based on the image characteristic control information sent from the image characteristic optimization processing unit 144. For example, the image characteristic control unit 154 executes control of image characteristics of the display panel unit 102 based on the image characteristic control information sent from the image characteristic optimization processing unit 144.
- the device direction control unit 156 of the system control unit 122 executes the device direction optimization process based on the device direction control information sent from the device direction optimization processing unit 146.
- the device direction control unit 156 executes control of the mechanical unit 109 of the image display device 100 based on the device direction control information sent from the device direction optimization processing unit 146.
- The configuration of the control unit 110 included in the image display device 100 according to the embodiment of the present invention has been described above with reference to FIGS. 3 and 4. Next, optimization processing according to the position of the user by the image display device 100 according to the embodiment of the present invention will be described.
- FIG. 8 is a flow chart showing an example of optimization processing according to the position of the user by the image display apparatus 100 according to the embodiment of the present invention. The optimization processing according to the position of the user by the image display apparatus 100 according to the embodiment of the present invention will be described below using FIG.
- When the imaging unit 104 of the image display device 100 starts imaging, the image input unit 112 of the control unit 110 receives the image captured by the imaging unit 104 (step S802).
- the image processing unit 114 of the control unit 110 executes a process of detecting a face included in the image received by the image input unit 112 (step S804).
- Next, the viewing state analysis unit 116 of the control unit 110 calculates the relative position of the user with respect to the optical axis of the camera of the imaging unit 104 in the user direction/distance calculation unit 132, and calculates the position of the user with respect to the device center and the front direction axis of the image display device 100 in the user position information calculation unit 134 (step S806).
- Next, based on the user position information calculated in step S806, the system optimization processing unit 120 of the control unit 110 calculates system control information for executing system optimization processing for optimizing the state of the image display device 100 for the user at an arbitrary position (step S808).
- In step S808, for example, system control information for executing processing for optimizing the left-right volume balance of the sound output from the speaker unit 108 is calculated.
- In step S808, for example, system control information for executing processing such as gamma correction for optimizing the appearance of black and color gain correction for following color changes is calculated.
- In step S808, for example, system control information for executing processing for optimizing the device direction of the image display device 100 is calculated.
- Then, the system control unit 122 of the control unit 110 executes the system optimization processing based on the system control information calculated in step S808 (step S810), and this processing ends.
- the state of the image display apparatus 100 can be optimized for the user at any position.
- the right and left volume balance of the sound output from the speaker unit 108 is optimized, and the user can view the image display apparatus 100 without feeling discomfort.
- the appearance of black, color change, and the like are optimized, and the user can view the image displayed on the image display device 100 favorably.
- the direction of the image display device 100 is optimized so that the image display device 100 faces the user as viewed from the user, and the user can favorably view the image displayed on the image display device 100.
- FIG. 9 is a flow chart showing an example of the optimization process according to the position of one or more users by the image display apparatus 100 according to the embodiment of the present invention.
- The image input unit 112 of the control unit 110 receives the image captured by the imaging unit 104, and the image processing unit 114 of the control unit 110 executes detection processing of a face included in the image received by the image input unit 112 (step S902).
- Next, the viewing state analysis unit 116 of the control unit 110 receives the result of the face detection processing by the image processing unit 114 and, using the face detection result, determines whether the number of detected users is one or more than one (step S904).
- If it is determined in step S904 that the number of detected users is one, the viewing state analysis unit 116 of the control unit 110 obtains the horizontal angle and the vertical angle of the user (step S906).
- Next, the system optimization processing unit 120 of the control unit 110 calculates a correction coefficient of the system control information for the system optimization processing based on the horizontal angle and the vertical angle of the user obtained in step S906 (step S908).
- If it is determined in step S904 that the number of detected users is more than one, the viewing state analysis unit 116 of the control unit 110 determines whether or not the plurality of users are at the center of the image (step S910).
- When the plurality of users are not at the center of the image (NO in step S910), the viewing state analysis unit 116 of the control unit 110 obtains the horizontal angles, the vertical angles, and the distances for the number of users, and averages each of them (step S912). Further, in step S912, the horizontal angles, the vertical angles, and the distances of each of the plurality of users may be used to determine the position of the center of gravity of the plurality of users.
- the system optimization processing unit 120 of the control unit 110 calculates a correction coefficient of system control information for system optimization processing (step S914).
- When the plurality of users are at the center of the image (YES in step S910), the system optimization processing unit 120 of the control unit 110 calculates the correction coefficient with no correction of the system control information for the system optimization processing, or with changed weighting (step S916).
- After calculating the correction coefficient in step S908, S914, or S916, the system optimization processing unit 120 of the control unit 110 adds the correction coefficient to the basic data of the system control information for the system optimization processing to calculate the system control information (step S918), and this processing ends.
- the state of the image display apparatus 100 can be optimized for a plurality of users at arbitrary positions.
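The branching of steps S904 to S918 can be sketched as follows (the weighting constants are illustrative assumptions; the description leaves them unspecified):

```python
def correction_coefficient(users, at_center, weight=0.01):
    """Sketch of steps S904-S916. users: list of (h_angle, v_angle,
    distance). One user: use that user's angles (S906/S908). Several
    users off the image center: average the angles (S912/S914). Several
    users at the center: no correction, coefficient 0 here (the
    description also allows changed weighting, S916)."""
    if len(users) == 1:
        h, v, _ = users[0]
        return weight * (h + v)
    if at_center:
        return 0.0
    n = len(users)
    h = sum(u[0] for u in users) / n
    v = sum(u[1] for u in users) / n
    return weight * (h + v)

def system_control_info(basic_data, coeff):
    """Step S918: add the correction coefficient to the basic data."""
    return basic_data + coeff

print(round(correction_coefficient([(10.0, 5.0, 2.0)], False), 3))
# → 0.15
```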
- FIG. 10 is a flow chart showing an example of the optimization processing according to the age of one or more users by the image display apparatus 100 according to the embodiment of the present invention.
- The image input unit 112 of the control unit 110 receives the image captured by the imaging unit 104, and the image processing unit 114 of the control unit 110 executes detection processing of a face included in the image received by the image input unit 112 (step S1002).
- Next, the viewing state analysis unit 116 of the control unit 110 receives the result of the face detection processing by the image processing unit 114 and, using the face detection result, determines whether the number of detected users is one or more than one (step S1004).
- If it is determined in step S1004 that the number of detected users is one, the viewing state analysis unit 116 of the control unit 110 analyzes the age of the user (step S1006).
- the system optimization processing unit 120 of the control unit 110 calculates a correction coefficient of system control information for system optimization processing based on the analysis result of age in step S1006 (step S1008).
- If it is determined in step S1004 that the number of detected users is more than one, the viewing state analysis unit 116 of the control unit 110 analyzes the ages of the plurality of users (step S1010).
- Next, either no correction is applied to the system control information for the system optimization processing, or a correction coefficient is calculated for each user individually and the correction coefficients are averaged (step S1012).
- Next, the system optimization processing unit 120 of the control unit 110 adds the correction coefficient to the basic data of the system control information for the system optimization processing to calculate the system control information (step S1014), and this processing ends.
- the state of the image display apparatus 100 can be optimized for a plurality of users of various ages.
- The system optimization processing unit 120 may also calculate system control information for optimizing the character size of the GUI displayed on the display panel unit 102 based on the user position information.
- The system control information calculated by the system optimization processing unit 120 is sent to the system control unit 122, and the system control unit 122 executes optimization processing for optimizing the character size of the GUI displayed on the display panel unit 102.
- The processing for optimizing the character size of the GUI includes, for example, processing for enlarging the character size of the GUI displayed on the display panel unit 102 when the user approaches the image display device 100.
- In this case, the character size of the GUI is enlarged when the user approaches the image display device 100, and the character size of the GUI is reduced when the user moves away from the image display device 100.
- As the user approaches, the character size of the GUI grows larger, so that the user can easily recognize the GUI.
- The processing for optimizing the character size of the GUI also includes processing for reducing the character size of the GUI displayed on the display panel unit 102 when the user approaches the image display device 100, thereby refining the displayed information, that is, increasing the amount of information.
- In this case, when the user approaches, the character size of the GUI is reduced to increase the amount of displayed information, and when the user moves away, the character size of the GUI is enlarged to reduce the amount of displayed information.
- For example, when a program guide is displayed on the display panel unit 102, the character size is enlarged and the amount of information is reduced while the user is far away, and when the user approaches, the character size is reduced and the amount of information is increased.
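Both character-size policies can be sketched together as follows (the distance thresholds and point sizes are illustrative assumptions):

```python
def gui_char_size(distance, near=1.0, far=3.0, small=12, large=24,
                  more_info_when_near=False):
    """Character size of the GUI as a function of the user's distance.
    Default policy: the text grows as the user approaches, so it is
    easy to recognize. With more_info_when_near=True the text instead
    shrinks when the user is near, so more information (e.g. a denser
    program guide) fits on the display panel. Thresholds and sizes are
    illustrative."""
    is_near = distance <= (near + far) / 2
    if more_info_when_near:
        return small if is_near else large
    return large if is_near else small

print(gui_char_size(1.0), gui_char_size(3.0))
# → 24 12
print(gui_char_size(1.0, more_info_when_near=True))
# → 12
```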
- In the calculation of the distance d1 described above, variations in face size may be corrected by using a data table to correct the reference face size [w0, h0] at the reference distance d0. For example, a data table of the average face size at each age is stored in advance based on attribute information such as the age of the user; if the user is a child, the reference face size may be set to a face size [w0C, h0C] smaller than the reference face size shown in FIG. 15, and if the user is an adult, to a face size [w0A, h0A] larger than the reference face size shown in FIG. 15.
- In addition, each user who uses the image display device 100, for example, each member of the family at the installation place of the image display device 100, may be registered in the image display device 100 in advance.
- The face size of each registered user may then be registered as a data table.
- In this way, the reference face size can be changed for each user.
- The reference face size for each user can be obtained by photographing the user together with distance information in conjunction with a separate distance sensor (not shown), by photographing the user after guiding the user to a fixed distance, or by photographing the user at the same distance as a standard scale.
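A sketch of such a reference-face-size data table and the corrected distance calculation (all sizes in the table are hypothetical):

```python
# Hypothetical data table of reference face sizes [w0, h0]; a child's
# reference size is smaller and an adult's larger than the default, as
# in the correction described above.
REFERENCE_FACE_SIZE = {
    "default": (0.10, 0.10),
    "child":   (0.07, 0.07),   # [w0C, h0C]
    "adult":   (0.12, 0.12),   # [w0A, h0A]
}

def estimate_distance(w1, attribute="default", d0=1.0):
    """d1 = d0 * (w0 / w1), using the attribute-corrected reference
    face width w0 for the detected face width w1."""
    w0, _ = REFERENCE_FACE_SIZE.get(attribute,
                                    REFERENCE_FACE_SIZE["default"])
    return d0 * (w0 / w1)

# The same detected face width gives a shorter distance estimate when
# the user is known to be a child:
print(estimate_distance(0.05), round(estimate_distance(0.05, "child"), 2))
# → 2.0 1.4
```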
- Further, even when the user is outside the imaging range of the imaging unit 104, the above-described system optimization processing can be continued by estimating the user position outside the imaging range from time-series transition information of the user.
- Furthermore, an appropriate time constant may be set in the system optimization processing according to the user's viewing environment.
- The system optimization processing can thereby be continued even if the user moves position abruptly.
- 100 image display device
- 102 display panel unit
- 104 imaging unit
- 106 sensor unit
- 108 speaker unit
- 109 mechanism unit
- 110 control unit
- 112 image input unit
- 114 image processing unit
- 116 viewing state analysis unit
- 117 user position information storage unit
- 118 viewing state recording unit
- 120 system optimization processing unit
- 122 system control unit
- 132 user direction/distance calculation unit
- 134 user position information calculation unit
- 142 audio characteristic optimization processing unit
- 144 image characteristic optimization processing unit
- 146 device direction optimization processing unit
- 152 audio characteristic control unit
- 154 image characteristic control unit
- 156 device direction control unit
- 200 information collection server
Abstract
Description
<1. One Embodiment of the Present Invention>
[1-1. Configuration of the Image Display Device]
[1-2. Configuration of the Control Unit]
[1-3. Optimization Processing According to the Position of the User]
[1-4. Optimization Processing According to the Positions of One or More Users]
[1-5. Optimization Processing According to the Ages of One or More Users]
[1-1. Configuration of the Image Display Device]
First, the configuration of an image display device according to an embodiment of the present invention will be described. FIG. 1 is an explanatory diagram illustrating the external appearance of an image display device 100 according to an embodiment of the present invention, as viewed from the front. The external appearance of the image display device 100 according to an embodiment of the present invention will be described below with reference to FIG. 1.
FIG. 3 is an explanatory diagram illustrating the configuration of the control unit 110 included in the image display device 100 according to an embodiment of the present invention. Among the components of the control unit 110, FIG. 3 illustrates the configuration of the viewing state analysis unit 116 included in the control unit 110. The configuration of the viewing state analysis unit 116 will be described below with reference to FIG. 3.
Horizontal direction: φ1 = φ0 * a1
Vertical direction: θ1 = θ0 * b1
Distance: d1 = d0 * (w0 / w1)
x1 = d1 * cos(θ1 - Δθ) * tan(φ1 - Δφ) - Δx
y1 = d1 * tan(θ1 - Δθ) - Δy
z1 = d1 * cos(θ1 - Δθ) * cos(φ1 - Δφ) - Δz
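The three position equations above can be sketched as follows, assuming angles in degrees (the function name and sample values are illustrative):

```python
import math

def user_3d_position(d1, phi1, theta1, dphi, dtheta, dx, dy, dz):
    """3-D position [x1, y1, z1] of the user relative to the device
    center and front-direction axis, from the relative position to the
    camera optical axis (distance d1, direction [phi1, theta1]), the
    camera installation angle [dphi, dtheta], and the camera position
    offset [dx, dy, dz]. Angles are taken in degrees."""
    phi = math.radians(phi1 - dphi)
    theta = math.radians(theta1 - dtheta)
    x1 = d1 * math.cos(theta) * math.tan(phi) - dx
    y1 = d1 * math.tan(theta) - dy
    z1 = d1 * math.cos(theta) * math.cos(phi) - dz
    return x1, y1, z1

# A camera mounted exactly at the device center with no tilt keeps a
# user standing straight ahead at 2 m on the front-direction axis:
print(user_3d_position(2.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0))
# → (0.0, 0.0, 2.0)
```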
gain_Lch = 20 * log(d_Lch / d_org_LRch)
gain_Rch = 20 * log(d_Rch / d_org_LRch)
gain_dif = gain_Rch - gain_Lch
         = 20 * log(d_Rch) - 20 * log(d_Lch)
         = 20 * log(d_Rch / d_Lch)
where:
gain_Lch: gain difference of the Lch
gain_Rch: gain difference of the Rch
d_org_LRch: distance from the left and right speaker units to the reference point
d_Lch: distance from the Lch speaker unit to the user
d_Rch: distance from the Rch speaker unit to the user
gain_dif: voltage gain difference between L and R
d_cent_dif = 0;
temp_CAM_DIS = 0;
if (CAM_AUDIENCE != 0) {
    for (int i = 0; i < CAM_AUDIENCE; i++) {
        d_cent_dif += CAM_DIS[i] * tan(CAM_HOR_ANGLE[i] * PI / 180);
        temp_CAM_DIS += CAM_DIS[i];
    }
}
d_cent_dif = d_cent_dif / CAM_AUDIENCE; // return (use) value: centroid-processed angle
CAM_DIS = temp_CAM_DIS / CAM_AUDIENCE;  // return (use) value: centroid-processed distance
where:
CAM_AUDIENCE: the users within the imaging range of the imaging unit 104
CAM_HOR_ANGLE[0]: angle of the user A
CAM_HOR_ANGLE[1]: angle of the user B
CAM_HOR_ANGLE[2]: angle of the user C
CAM_HOR_ANGLE[3]: angle of the user D
CAM_DIS[0]: distance of the user A
CAM_DIS[1]: distance of the user B
CAM_DIS[2]: distance of the user C
CAM_DIS[3]: distance of the user D
For example, the gamma correction is performed as shown below.
γ = 2.2 + image quality correction - 0.1 × user direction
Also, for example, the color gain correction is performed as shown below.
ColorGain = UserColor + image quality correction ± α × user direction
R(G,B) Gain = R(G,B) × image quality correction ± α × user direction
Average viewing angle correction coefficient = {(1 / D0 * (Ah0 + Av0) + 1 / D1 * (Ah1 + Av1) + … + 1 / Dn * (Ahn + Avn)) / n} * correction value
Setting data = (basic data) * {1 + (correction value at maximum viewing angle) * (average viewing angle correction coefficient)}
The average viewing angle gives greater weight to users at shorter distances, and the average of the horizontal and vertical angles over the number of users is obtained. A correction coefficient is obtained by multiplying the average viewing angle by the correction value, and the overall correction amount is calculated by multiplying the maximum correction value by the correction coefficient. The setting data is calculated by adding or subtracting the correction amount to or from the basic data (uncorrected data: γ = 2.2 + image quality correction).
BackLight = basic setting value * correction value
Correction value = 10^(A * log(screen illuminance) + B * log(viewing angle) + C * log(image average level) + D * age) / screen illuminance
In experiments, the optimum luminance is known to have the following relationship with the screen illuminance, the viewing angle, the image average luminance level, and the age:
log(optimum luminance) = A * log(screen illuminance) + B * log(viewing angle) + C * log(image average level) + D * age
FIG. 8 is a flowchart showing an example of the optimization processing according to the position of the user by the image display device 100 according to an embodiment of the present invention. The optimization processing according to the position of the user by the image display device 100 according to an embodiment of the present invention will be described below with reference to FIG. 8.
Next, optimization processing according to the positions of one or more users by the image display device 100 according to an embodiment of the present invention will be described. FIG. 9 is a flowchart showing an example of the optimization processing according to the positions of one or more users by the image display device 100 according to an embodiment of the present invention.
Next, optimization processing according to the ages of one or more users by the image display device 100 according to an embodiment of the present invention will be described. FIG. 10 is a flowchart showing an example of the optimization processing according to the ages of one or more users by the image display device 100 according to an embodiment of the present invention.
102 display panel unit
104 imaging unit
106 sensor unit
108 speaker unit
109 mechanism unit
110 control unit
112 image input unit
114 image processing unit
116 viewing state analysis unit
117 user position information storage unit
118 viewing state recording unit
120 system optimization processing unit
122 system control unit
132 user direction/distance calculation unit
134 user position information calculation unit
142 audio characteristic optimization processing unit
144 image characteristic optimization processing unit
146 device direction optimization processing unit
152 audio characteristic control unit
154 image characteristic control unit
156 device direction control unit
200 information collection server
Claims (8)
- A display device comprising:
an imaging unit that captures a moving image of a predetermined range with respect to an image display direction;
an image analysis unit that analyzes the moving image captured by the imaging unit and calculates a position of a user;
a system optimization processing unit that calculates system control information for optimizing a system based on the position of the user calculated by the image analysis unit; and
a system control unit that optimizes the system based on the system control information calculated by the system optimization processing unit.
- The display device according to claim 1, wherein the system optimization processing unit calculates system control information for optimizing a volume balance of sound output from an audio output unit, based on the position of the user calculated by the image analysis unit.
- The display device according to claim 1, wherein the system optimization processing unit calculates system control information for optimizing image characteristics of an image display unit, based on the position of the user calculated by the image analysis unit.
- The display device according to claim 1, wherein the system optimization processing unit calculates system control information for optimizing display content of an image display unit, based on the position of the user calculated by the image analysis unit.
- The display device according to claim 1, wherein the system optimization processing unit calculates system control information for optimizing a device direction of the display device itself, based on the position of the user calculated by the image analysis unit.
- The display device according to claim 1, wherein the image analysis unit analyzes the moving image captured by the imaging unit and calculates a three-dimensional position of the user.
- The display device according to claim 1, wherein the image analysis unit analyzes the moving image captured by the imaging unit and calculates positions of a plurality of users, and the system optimization processing unit calculates a center-of-gravity position of the plurality of users based on the positions of the plurality of users calculated by the image analysis unit, and calculates system control information for optimizing the system based on the calculated center-of-gravity position of the plurality of users.
- A control method comprising:
an imaging step of capturing a moving image of a predetermined range with respect to an image display direction;
an image analysis step of analyzing the captured moving image and calculating a position of a user;
a system optimization processing step of calculating system control information for optimizing a system based on the calculated position of the user; and
a system control step of optimizing the system based on the calculated system control information.
Priority Applications (7)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| RU2012108872/08A RU2553061C2 (ru) | 2009-09-15 | 2010-07-22 | Устройство отображения и способ управления |
| KR1020127006151A KR101784754B1 (ko) | 2009-09-15 | 2010-07-22 | 표시 장치 및 제어 방법 |
| BR112012005231-4A BR112012005231A2 (pt) | 2009-09-15 | 2010-07-22 | dispositivo de exibição e método de controle |
| US13/395,035 US8952890B2 (en) | 2009-09-15 | 2010-07-22 | Display device and controlling method |
| CN201080048006.6A CN102687522B (zh) | 2009-09-15 | 2010-07-22 | 显示装置和控制方法 |
| EP10816968A EP2472863A4 (en) | 2009-09-15 | 2010-07-22 | DISPLAY DEVICE AND CONTROL METHOD |
| US14/589,107 US9489043B2 (en) | 2009-09-15 | 2015-01-05 | Display device and controlling method |
Applications Claiming Priority (2)

| Application Number | Priority Date | Filing Date | Title |
|---|---|---|---|
| JP2009213377A JP5568929B2 (ja) | 2009-09-15 | 2009-09-15 | 表示装置および制御方法 |
| JP2009-213377 | 2009-09-15 | | |
Related Child Applications (2)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| US13/395,035 A-371-Of-International US8952890B2 (en) | 2009-09-15 | 2010-07-22 | Display device and controlling method |
| US14/589,107 Continuation US9489043B2 (en) | 2009-09-15 | 2015-01-05 | Display device and controlling method |
Publications (1)

| Publication Number | Publication Date |
|---|---|
| WO2011033855A1 true WO2011033855A1 (ja) | 2011-03-24 |
Family
ID=43758468
Family Applications (1)

| Application Number | Title | Priority Date | Filing Date |
|---|---|---|---|
| PCT/JP2010/062311 WO2011033855A1 (ja) | 表示装置および制御方法 | 2009-09-15 | 2010-07-22 |
Country Status (8)

| Country | Link |
|---|---|
| US (2) | 8952890B2 (ja) |
| EP (1) | 2472863A4 (ja) |
| JP (1) | 5568929B2 (ja) |
| KR (1) | 101784754B1 (ja) |
| CN (1) | 102687522B (ja) |
| BR (1) | 112012005231A2 (ja) |
| RU (1) | 2553061C2 (ja) |
| WO (1) | 2011033855A1 (ja) |
Cited By (3)

| Publication number | Priority date | Publication date | Assignee | Title |
|---|---|---|---|---|
| CN102314848A (zh) * | 2011-09-09 | 2012-01-11 | 深圳Tcl新技术有限公司 | 液晶显示装置背光控制方法和系统 |
| US20130278790A1 (en) * | 2012-04-18 | 2013-10-24 | Won Sik Oh | Image display system and method of driving the same |
| CN106713793A (zh) * | 2015-11-18 | 2017-05-24 | 天津三星电子有限公司 | 一种声音播放控制方法及其装置 |
Families Citing this family (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US9330589B2 (en) * | 2011-11-16 | 2016-05-03 | Nanolumens Acquisition, Inc. | Systems for facilitating virtual presence |
JP5263092B2 (ja) | 2009-09-07 | 2013-08-14 | ソニー株式会社 | 表示装置および制御方法 |
JP5418093B2 (ja) | 2009-09-11 | 2014-02-19 | ソニー株式会社 | 表示装置および制御方法 |
JP2013031013A (ja) * | 2011-07-28 | 2013-02-07 | Toshiba Corp | 電子機器、電子機器の制御方法、電子機器の制御プログラム |
JP5892797B2 (ja) * | 2012-01-20 | 2016-03-23 | 日本放送協会 | 送受信システム及び送受信方法、受信装置及び受信方法 |
CN104412607A (zh) * | 2012-07-06 | 2015-03-11 | Nec显示器解决方案株式会社 | 显示设备以及用于显示设备的控制方法 |
US9412375B2 (en) | 2012-11-14 | 2016-08-09 | Qualcomm Incorporated | Methods and apparatuses for representing a sound field in a physical space |
JP6058978B2 (ja) * | 2012-11-19 | 2017-01-11 | サターン ライセンシング エルエルシーSaturn Licensing LLC | 画像処理装置及び画像処理方法、撮影装置、並びにコンピューター・プログラム |
CN104798129B (zh) * | 2012-11-27 | 2018-10-19 | 索尼公司 | 显示装置、显示方法和计算机可读介质 |
US20140153753A1 (en) * | 2012-12-04 | 2014-06-05 | Dolby Laboratories Licensing Corporation | Object Based Audio Rendering Using Visual Tracking of at Least One Listener |
JP6297985B2 (ja) * | 2013-02-05 | 2018-03-20 | Toa株式会社 | 拡声システム |
WO2014126991A1 (en) * | 2013-02-13 | 2014-08-21 | Vid Scale, Inc. | User adaptive audio processing and applications |
US10139925B2 (en) * | 2013-03-04 | 2018-11-27 | Microsoft Technology Licensing, Llc | Causing specific location of an object provided to a device |
DE102013206569B4 (de) * | 2013-04-12 | 2020-08-06 | Siemens Healthcare Gmbh | Gestensteuerung mit automatisierter Kalibrierung |
WO2015100205A1 (en) * | 2013-12-26 | 2015-07-02 | Interphase Corporation | Remote sensitivity adjustment in an interactive display system |
CN104298347A (zh) * | 2014-08-22 | 2015-01-21 | 联发科技(新加坡)私人有限公司 | 电子显示设备屏幕的控制方法、控制装置及显示系统 |
US10462517B2 (en) * | 2014-11-04 | 2019-10-29 | Sony Corporation | Information processing apparatus, communication system, and information processing method |
JP2016092765A (ja) * | 2014-11-11 | 2016-05-23 | 株式会社リコー | 情報処理装置、ユーザー検知方法、プログラム |
US10291949B2 (en) * | 2016-10-26 | 2019-05-14 | Orcam Technologies Ltd. | Wearable device and methods for identifying a verbal contract |
WO2018155354A1 (ja) * | 2017-02-21 | 2018-08-30 | パナソニックIpマネジメント株式会社 | 電子機器の制御方法、電子機器の制御システム、電子機器、及び、プログラム |
EP3396226B1 (en) * | 2017-04-27 | 2023-08-23 | Advanced Digital Broadcast S.A. | A method and a device for adjusting a position of a display screen |
CN107632708B (zh) * | 2017-09-22 | 2021-08-17 | BOE Technology Group Co., Ltd. | Control method and control device for screen viewing angle, and flexible display device |
US11011095B2 (en) * | 2018-08-31 | 2021-05-18 | Chongqing Hkc Optoelectronics Technology Co., Ltd. | Display panel, and image control device and method thereof |
KR20200027394A (ko) * | 2018-09-04 | 2020-03-12 | Samsung Electronics Co., Ltd. | Display apparatus and control method thereof |
US11032508B2 (en) * | 2018-09-04 | 2021-06-08 | Samsung Electronics Co., Ltd. | Display apparatus and method for controlling audio and visual reproduction based on user's position |
CN110503891A (zh) * | 2019-07-05 | 2019-11-26 | Taicang Qinfeng Advertising Media Co., Ltd. | Electronic billboard transformation method and system based on distance change |
JP2021015203A (ja) * | 2019-07-12 | 2021-02-12 | Fuji Xerox Co., Ltd. | Image display device, image forming device, and program |
CN112073804B (zh) * | 2020-09-10 | 2022-05-20 | Shenzhen Skyworth-RGB Electronics Co., Ltd. | Television sound adjustment method, television, and storage medium |
TW202333491A (zh) * | 2021-12-28 | 2023-08-16 | Sony Group Corporation | Information processing device, information processing method, and program |
CN114999335B (zh) * | 2022-06-10 | 2023-08-15 | Changchun Cedar Electronics Technology Co., Ltd. | Seam repair method for LED spliced screen based on ultra-wideband and one-dimensional envelope peak |
Citations (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH05137200A (ja) * | 1991-11-14 | 1993-06-01 | Sony Corp | Automatic stereo volume balance adjustment device |
JPH09247564A (ja) * | 1996-03-12 | 1997-09-19 | Hitachi Ltd | Television receiver |
JP2005044330A (ja) | 2003-07-24 | 2005-02-17 | Univ Of California San Diego | Weak hypothesis generation device and method, learning device and method, detection device and method, facial expression learning device and method, facial expression recognition device and method, and robot device |
JP2007065766A (ja) | 2005-08-29 | 2007-03-15 | Sony Corp | Image processing device and method, and program |
JP2008172817A (ja) * | 2008-02-18 | 2008-07-24 | Seiko Epson Corp | Control system and controlled device adapted to the system |
JP2009094723A (ja) * | 2007-10-05 | 2009-04-30 | Mitsubishi Electric Corp | Television receiver |
Family Cites Families (46)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4973149A (en) * | 1987-08-19 | 1990-11-27 | Center For Innovative Technology | Eye movement detector |
JPH07106839B2 (ja) * | 1989-03-20 | 1995-11-15 | Hitachi Ltd | Elevator control system |
DE19613178A1 (de) * | 1996-04-02 | 1997-10-09 | Heinrich Landert | Method for operating a door system and a door system operating according to the method |
JP3968477B2 (ja) * | 1997-07-07 | 2007-08-29 | Sony Corporation | Information input device and information input method |
US6611297B1 (en) * | 1998-04-13 | 2003-08-26 | Matsushita Electric Industrial Co., Ltd. | Illumination control method and illumination device |
US6215471B1 (en) * | 1998-04-28 | 2001-04-10 | Deluca Michael Joseph | Vision pointer method and apparatus |
US6076928A (en) * | 1998-06-15 | 2000-06-20 | Fateh; Sina | Ideal visual ergonomic system for computer users |
US6243076B1 (en) * | 1998-09-01 | 2001-06-05 | Synthetic Environments, Inc. | System and method for controlling host system interface with point-of-interest data |
US6603491B2 (en) * | 2000-05-26 | 2003-08-05 | Jerome H. Lemelson | System and methods for controlling automatic scrolling of information on a display or screen |
JP3561463B2 (ja) * | 2000-08-11 | 2004-09-02 | Konami Corp | Pseudo-camera viewpoint movement control method in a 3D video game, and 3D video game device |
US7348963B2 (en) * | 2002-05-28 | 2008-03-25 | Reactrix Systems, Inc. | Interactive video display system |
US7627139B2 (en) * | 2002-07-27 | 2009-12-01 | Sony Computer Entertainment Inc. | Computer image and audio processing of intensity and input devices for interfacing with a computer program |
EP1426919A1 (en) * | 2002-12-02 | 2004-06-09 | Sony International (Europe) GmbH | Method for operating a display device |
AU2003301043A1 (en) * | 2002-12-13 | 2004-07-09 | Reactrix Systems | Interactive directed light/sound system |
US8123616B2 (en) * | 2003-03-25 | 2012-02-28 | Igt | Methods and apparatus for limiting access to games using biometric data |
US7117380B2 (en) * | 2003-09-30 | 2006-10-03 | International Business Machines Corporation | Apparatus, system, and method for autonomic power adjustment in an electronic device |
EP1551178A1 (en) * | 2003-12-18 | 2005-07-06 | Koninklijke Philips Electronics N.V. | Supplementary visual display system |
EP1566788A3 (en) | 2004-01-23 | 2017-11-22 | Sony United Kingdom Limited | Display |
EP1596271A1 (en) * | 2004-05-11 | 2005-11-16 | Hitachi Europe S.r.l. | Method for displaying information and information display system |
JP4734855B2 (ja) * | 2004-06-23 | 2011-07-27 | Hitachi Ltd | Information processing device |
RU2370817C2 (ru) * | 2004-07-29 | 2009-10-20 | Samsung Electronics Co., Ltd. | System and method for tracking an object |
JP4107288B2 (ja) * | 2004-12-10 | 2008-06-25 | Seiko Epson Corp | Control system, controlled device adapted to the system, and remote control device |
JP4899334B2 (ja) * | 2005-03-11 | 2012-03-21 | Brother Industries, Ltd. | Information output device |
KR101112735B1 (ko) * | 2005-04-08 | 2012-03-13 | Samsung Electronics Co., Ltd. | Stereoscopic display device using a hybrid position tracking system |
WO2006122004A1 (en) * | 2005-05-06 | 2006-11-16 | Omnilink Systems, Inc. | System and method of tracking the movement of individuals and assets |
JP4225307B2 (ja) * | 2005-09-13 | 2009-02-18 | Funai Electric Co., Ltd. | Television receiver |
US8218080B2 (en) * | 2005-12-05 | 2012-07-10 | Samsung Electronics Co., Ltd. | Personal settings, parental control, and energy saving control of television with digital video camera |
CN101379455B (zh) * | 2006-02-03 | 2011-06-01 | Matsushita Electric Industrial Co., Ltd. | Input device and method thereof |
JP4876687B2 (ja) * | 2006-04-19 | 2012-02-15 | Hitachi Ltd | Attention level measuring device and attention level measuring system |
US8340365B2 (en) * | 2006-11-20 | 2012-12-25 | Sony Mobile Communications Ab | Using image recognition for controlling display lighting |
JP2008301167A (ja) * | 2007-05-31 | 2008-12-11 | Sharp Corp | Liquid crystal television receiver |
CN101952818B (zh) * | 2007-09-14 | 2016-05-25 | Intellectual Ventures Holding 81 LLC | Processing of gesture-based user interactions |
WO2009038149A1 (ja) * | 2007-09-20 | 2009-03-26 | Nec Corporation | Video providing system and video providing method |
CN101874404B (zh) * | 2007-09-24 | 2013-09-18 | Qualcomm Incorporated | Enhanced interface for voice and video communications |
US8539357B2 (en) * | 2007-11-21 | 2013-09-17 | Qualcomm Incorporated | Media preferences |
WO2009067676A1 (en) * | 2007-11-21 | 2009-05-28 | Gesturetek, Inc. | Device access control |
JP5169403B2 (ja) * | 2008-04-07 | 2013-03-27 | Sony Corporation | Image signal generation device, image signal generation method, program, and storage medium |
JP2010004118A (ja) * | 2008-06-18 | 2010-01-07 | Olympus Corp | Digital photo frame, information processing system, control method, program, and information storage medium |
CN102119530B (zh) * | 2008-08-22 | 2013-08-21 | Sony Corporation | Image display device and control method |
US8400322B2 (en) * | 2009-03-17 | 2013-03-19 | International Business Machines Corporation | Apparatus, system, and method for scalable media output |
JP5263092B2 (ja) | 2009-09-07 | 2013-08-14 | Sony Corporation | Display device and control method |
JP5556098B2 (ja) * | 2009-09-11 | 2014-07-23 | Sony Corporation | Display method and display device |
JP2011107899A (ja) * | 2009-11-16 | 2011-06-02 | Sony Corporation | Information processing device, setting change method, and setting change program |
US8523667B2 (en) * | 2010-03-29 | 2013-09-03 | Microsoft Corporation | Parental control settings based on body dimensions |
JP2012104871A (ja) * | 2010-11-05 | 2012-05-31 | Sony Corporation | Acoustic control device and acoustic control method |
US9288387B1 (en) * | 2012-09-11 | 2016-03-15 | Amazon Technologies, Inc. | Content display controls based on environmental factors |
- 2009
  - 2009-09-15 JP JP2009213377A patent/JP5568929B2/ja active Active
- 2010
  - 2010-07-22 WO PCT/JP2010/062311 patent/WO2011033855A1/ja active Application Filing
  - 2010-07-22 US US13/395,035 patent/US8952890B2/en active Active
  - 2010-07-22 KR KR1020127006151A patent/KR101784754B1/ko active IP Right Grant
  - 2010-07-22 CN CN201080048006.6A patent/CN102687522B/zh active Active
  - 2010-07-22 RU RU2012108872/08A patent/RU2553061C2/ru active
  - 2010-07-22 EP EP10816968A patent/EP2472863A4/en not_active Ceased
  - 2010-07-22 BR BR112012005231-4A patent/BR112012005231A2/pt not_active Application Discontinuation
- 2015
  - 2015-01-05 US US14/589,107 patent/US9489043B2/en active Active
Non-Patent Citations (1)
Title |
---|
See also references of EP2472863A4 |
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN102314848A (zh) * | 2011-09-09 | 2012-01-11 | Shenzhen TCL New Technology Co., Ltd. | Backlight control method and system for liquid crystal display device |
US20130278790A1 (en) * | 2012-04-18 | 2013-10-24 | Won Sik Oh | Image display system and method of driving the same |
US9478161B2 (en) * | 2012-04-18 | 2016-10-25 | Samsung Display Co., Ltd. | Image display system and method of driving the same |
US20170025058A1 (en) * | 2012-04-18 | 2017-01-26 | Samsung Display Co., Ltd. | Image display system and method of driving the same |
CN106713793A (zh) * | 2015-11-18 | 2017-05-24 | Tianjin Samsung Electronics Co., Ltd. | Sound playback control method and device |
Also Published As
Publication number | Publication date |
---|---|
US20150185830A1 (en) | 2015-07-02 |
KR20120082406A (ko) | 2012-07-23 |
EP2472863A1 (en) | 2012-07-04 |
EP2472863A4 (en) | 2013-02-06 |
BR112012005231A2 (pt) | 2020-08-04 |
JP2011066516A (ja) | 2011-03-31 |
RU2012108872A (ru) | 2013-09-20 |
US8952890B2 (en) | 2015-02-10 |
US20120293405A1 (en) | 2012-11-22 |
CN102687522A (zh) | 2012-09-19 |
US9489043B2 (en) | 2016-11-08 |
RU2553061C2 (ru) | 2015-06-10 |
CN102687522B (zh) | 2015-08-19 |
JP5568929B2 (ja) | 2014-08-13 |
KR101784754B1 (ko) | 2017-10-16 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO2011033855A1 (ja) | Display device and control method | |
JP5418093B2 (ja) | Display device and control method | |
WO2014171142A1 (ja) | Image processing method and image processing device | |
US20120236180A1 (en) | Image adjustment method and electronics system using the same | |
JP2010016796A (ja) | Image capturing device, image capturing method, and computer program | |
US9779290B2 (en) | Detecting apparatus, detecting method and computer readable recording medium recording program for detecting state in predetermined area within images | |
JP6171353B2 (ja) | Information processing device, system, information processing method, and program | |
KR101647969B1 (ko) | Device and method for detecting a user's gaze, and computer program for executing the method | |
JP5793975B2 (ja) | Image processing device, image processing method, program, and recording medium | |
JP2007265125A (ja) | Content display device | |
WO2022048424A1 (zh) | Method, apparatus, device, and storage medium for adaptive screen image adjustment | |
JP7074056B2 (ja) | Image processing device, image processing system, image processing method, and program | |
JP2021524120A (ja) | Display detection device, method therefor, and computer-readable medium | |
US20110279649A1 (en) | Digital photographing apparatus, method of controlling the same, and computer-readable storage medium | |
JP5720758B2 (ja) | Display device and control method | |
US20240112608A1 (en) | Display system and display method | |
US20230344956A1 (en) | Systems and Methods for Multi-user Video Communication with Engagement Detection and Adjustable Fidelity | |
WO2022085506A1 (ja) | Content output device, content output method, and program | |
KR101386840B1 (ko) | Dual interactive method for simultaneously processing an IR light source and video image recognition | |
KR101479901B1 (ko) | Device and method for controlling a virtual camera | |
WO2023235329A1 (en) | Framework for simultaneous subject and desk capture during videoconferencing | |
JP2024047670A (ja) | Display system and display method | |
US8421785B2 (en) | Electrical device capable of adjusting display image based on a rotation of a web camera and method thereof | |
JP2024047671A (ja) | Display system and display method | |
WO2023192771A1 (en) | Recommendations for image capture |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201080048006.6 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 10816968 Country of ref document: EP Kind code of ref document: A1 |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2012108872 Country of ref document: RU Ref document number: 2109/CHENP/2012 Country of ref document: IN |
|
ENP | Entry into the national phase |
Ref document number: 20127006151 Country of ref document: KR Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2010816968 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |
|
WWE | Wipo information: entry into national phase |
Ref document number: 13395035 Country of ref document: US |
|
REG | Reference to national code |
Ref country code: BR Ref legal event code: B01A Ref document number: 112012005231 Country of ref document: BR |
|
ENP | Entry into the national phase |
Ref document number: 112012005231 Country of ref document: BR Kind code of ref document: A2 Effective date: 20120308 |