EP2157781B1 - Camera device and imaging method - Google Patents

Camera device and imaging method

Info

Publication number
EP2157781B1
Authority
EP
European Patent Office
Prior art keywords
protection
information
area
protected
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Not-in-force
Application number
EP08764175.9A
Other languages
German (de)
English (en)
Other versions
EP2157781A1 (fr)
EP2157781A4 (fr)
Inventor
Takaaki Namba
Yusuke Mizuno
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Corp
Original Assignee
Panasonic Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Panasonic Corp
Publication of EP2157781A1
Publication of EP2157781A4
Application granted
Publication of EP2157781B1
Status: Not-in-force
Anticipated expiration


Classifications

    • GPHYSICS
    • G03PHOTOGRAPHY; CINEMATOGRAPHY; ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ELECTROGRAPHY; HOLOGRAPHY
    • G03BAPPARATUS OR ARRANGEMENTS FOR TAKING PHOTOGRAPHS OR FOR PROJECTING OR VIEWING THEM; APPARATUS OR ARRANGEMENTS EMPLOYING ANALOGOUS TECHNIQUES USING WAVES OTHER THAN OPTICAL WAVES; ACCESSORIES THEREFOR
    • G03B13/00Viewfinders; Focusing aids for cameras; Means for focusing for cameras; Autofocus systems for cameras
    • G03B13/32Means for focusing
    • G03B13/34Power focusing
    • G03B13/36Autofocus systems
    • GPHYSICS
    • G02OPTICS
    • G02BOPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B7/00Mountings, adjusting means, or light-tight connections, for optical elements
    • G02B7/28Systems for automatic generation of focusing signals
    • GPHYSICS
    • G08SIGNALLING
    • G08BSIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B13/00Burglar, theft or intruder alarms
    • G08B13/18Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength
    • G08B13/189Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems
    • G08B13/194Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems
    • G08B13/196Actuation by interference with heat, light, or radiation of shorter wavelength; Actuation by intruding sources of heat, light, or radiation of shorter wavelength using passive radiation detection systems using image scanning and comparing systems using television cameras
    • G08B13/19678User interface
    • G08B13/19686Interfaces masking personal details for privacy, e.g. blurring faces, vehicle license plates
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/61Control of cameras or camera modules based on recognised objects
    • H04N23/611Control of cameras or camera modules based on recognised objects where the recognised objects include parts of the human body
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/63Control of cameras or camera modules by using electronic viewfinders
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/80Camera processing pipelines; Components thereof
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N5/00Details of television systems
    • H04N5/222Studio circuitry; Studio devices; Studio equipment
    • H04N5/262Studio circuits, e.g. for mixing, switching-over, change of character of image, other special effects ; Cameras specially adapted for the electronic generation of special effects
    • H04N5/272Means for inserting a foreground image in a background image, i.e. inlay, outlay
    • H04N5/2723Insertion of virtual advertisement; Replacing advertisements physical present in the scene by virtual advertisement
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N7/00Television systems
    • H04N7/18Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast
    • H04N7/183Closed-circuit television [CCTV] systems, i.e. systems in which the video signal is not broadcast for receiving images from a single remote source

Definitions

  • the present invention relates to a camera device and an imaging method of imaging a subject and recording image data of the subject that has been imaged to a recording medium.
  • Digital still cameras for photographing still images; camera devices, such as digital video cameras, for taking moving images; and a range of electronic devices, in which a digital camera is installed, are now in wide use. This has led to a growing risk that images taken unintentionally or by chance will include subjects protected under copyright, portrait rights, or other rights to privacy (hereinafter collectively called "protection targets"); or that taken images or their image data will spread. Moreover, a function for imaging and recording moving images for a short period is now also provided in most digital cameras and some camera phones. Furthermore, images captured by digital cameras and camera phones, including subjects that require protection, can also be distributed instantaneously worldwide via the Internet.
  • One known protection technology is to obscure a subject to be protected, if it is included in a camera shooting field, by masking the subject. In other words, portrait rights are protected by hiding protected subjects. A technology for making protected subjects unidentifiable, in spite of the protected subject having been captured, is proposed in Patent Literature 1, for example.
  • Such a conventional camera has a function to obscure an image, such as by applying a computer-generated mosaic. More specifically, the conventional camera is designed to apply a mosaic to a part of the image in the image data obtained by photography immediately after a button for a mosaic mode is pressed, unless an appropriate password is input.
  • Figs. 10A and 10B show an example of image 201 taken using this type of conventional camera.
  • Fig. 10A is an example of image 201 taken using a normal mode without using the mosaic mode.
  • Fig. 10B shows an example of image 201 taken using the mosaic mode.
  • the user can set area 203 for obscuring subject 202 to be protected, which is a part of image 201, when the mosaic mode is selected.
  • Another technology for protecting images of subjects and their image data is the use of a digital watermark embedded in an image so as to clearly indicate that the subject requires protection and to block photography of this image. This technology is disclosed in Patent Literature 2. Still another technology for blocking the photography of a protected subject is the use of a device that transmits information on the protected subject, such as an IC tag. This technology is disclosed in Patent Literature 3.
  • Patent Literature 1, however, requires the user to manually designate the image area of a protected subject and set an area for applying protection before starting photography. This may result in a failure to designate the area appropriately during video photography.
  • the structures of the cameras disclosed in Patent Literature 2 and Patent Literature 3 enable protection of the entire shooting field. However, even if a protected subject is just a part of the shooting field, it is not possible to apply protection to the protected subject only. In addition, these conventional technologies cannot automatically detect protected subjects. Accordingly, if a protected subject moves or changes, it cannot be automatically followed.
  • the present invention offers a camera device and an imaging method applicable to the camera device for applying protection to an image area of a subject to be protected in an image including this subject, even if the subject moves or changes, by following the subject.
  • the camera device of the present invention includes an imaging unit, a communications unit, an identified area detection unit, and an image data processing unit, according to claim 1.
  • This structure provides a camera device that generates image data in which a subject requiring copyright, portrait right, or other privacy protection is obscured in a photographed image.
  • the imaging method of the present invention is a method of protecting a subject to be protected within an imaging target when taking an image.
  • the imaging method includes an imaging step, an identified area detection step, and an image data processing step, according to claim 18. This offers an imaging method of generating image data in which subjects requiring copyright, portrait right, or other privacy protection are obscured.
  • WO 00/13411 A1 describes a method and apparatus for recording an interview or the like with a speaker whose identity is to be maintained in secrecy. To this end, the video information corresponding to the skin tone color of the speaker is manipulated prior to recording, so the recorded image comprising the speaker's face is obscured.
  • D1 is silent regarding a transmission of object data and/or protection target information according to the invention. As a consequence, it is not possible to distinguish between objects that need to be obscured and other objects that do not need to be obscured.
  • EP 1 388 802 A describes a device for face identification so a specific person can be identified and recorded while the face images of other people in the image are obscured in an effort to protect their rights of privacy. This is achieved by image processing without using object information and/or protection target information reflecting the size, shape, direction and position of objects and targets, respectively.
  • JP 8 051611 A describes a video conference system, wherein the images transmitted to other participants of the video conference can be partly obscured. This is useful for hiding employees and/or facilities that could otherwise be seen in the background of the speaker participating in the video conference.
  • the obscured area in the image is determined by an operator, and the position of the obscured area within the image is then changed in accordance with the rotation of a camera producing the image.
  • the obscured or muted image area does not correspond to a particular object or its size, shape, direction and position. Instead, the muted area consists of lateral, upper and/or lower border portions of the image.
  • US 2006/206911 A1 describes a security camera system, wherein previously stored privacy locations can be obscured so privacy areas such as entrance doors or the like can be protected without losing all information relating to the protected area. This is useful for fixed installations, so it is not necessary to dynamically identify areas of interest that need to be protected by adapting protection targets to the specified objects of interest.
  • Fig. 1 is a block diagram illustrating the structure of camera device 10 in the exemplary embodiment of the present invention.
  • Fig. 2 is a block diagram of the structure of protection control unit 17 employed in camera device 10.
  • Camera device 10 in the exemplary embodiment of the present invention is used as, for example, a digital video camera for taking moving images.
  • Camera device 10 outputs an image after applying protection to an output display image or playback image of image data sets V10 and V20 obtained by imaging subjects 30 and 31, when the imaged subjects 30 and 31 require copyright, portrait right or other privacy protection.
  • Camera device 10 of the present invention is also applicable to camera phones or to digital cameras built into a range of electronic devices, such as camera PDAs.
  • camera device 10 is also applicable to digital still cameras for photographing still images, and a video-recording function provided in digital still cameras.
  • camera device 10 includes first camera 11 (hereinafter referred to as camera 11), second camera 12 (hereinafter referred to as camera 12), microphone 25, image and audio forming unit 16, and image and audio memory 81 (hereinafter referred to as memory 81).
  • Camera device 10 also includes network communications unit 14 (hereinafter referred to as communications unit 14), camera control unit 15, autofocus processing unit 13 (hereinafter referred to as processing unit 13), and gyro unit 26.
  • camera device 10 includes identified area detection unit 83 (hereinafter referred to as detection unit 83), shape searching unit 85, distance estimation unit 82, and digital watermark extraction unit 84 (hereinafter referred to as extraction unit 84).
  • camera device 10 includes protection control unit 17 and protection processing unit 18.
  • Protection control unit 17 includes protection determination unit 71, protected area control unit 72, protection accuracy control unit 73, and protection time control unit 74.
  • camera device 10 includes record/reproduction processing unit 19 (hereinafter referred to as processing unit 19), read and write control unit 20 (hereinafter referred to as R/W control unit 20), output interface 21 (hereinafter referred to as output I/F 21), image and audio presentation unit 23 (hereinafter referred to as presentation unit 23), and user interface 24 (hereinafter referred to as user I/F 24).
  • Cameras 11 and 12 are imaging units for obtaining image data by imaging an area in a shooting field. Cameras 11 and 12 capture an image in the shooting field that includes subject 30 and subject 31, and generate image data corresponding to the captured image.
  • Subject 30 and subject 31 are part of an imaging target. Image 35 contained in subject 30 and subject 31 is assumed to be a specific subject that requires protection.
  • IC tag 34 is attached to subject 30.
  • Subject 31 is, for example, image display unit 36, such as a television set, that displays image 35, and IC tag 34 is also attached to subject 31.
  • Camera device 10 includes, for example, two cameras: camera 11 and camera 12. Twin-lens imaging is made possible by spacing cameras 11 and 12 a predetermined distance apart. Cameras 11 and 12 thus enable the generation of 3-D image data.
  • camera device 10 may also have a structure for displaying or recording and reproducing normal images that are not 3-D images.
  • one of cameras 11 and 12 may be an imaging unit dedicated to a distance-measuring function.
  • camera device 10 may have a structure including only one of cameras 11 and 12.
  • Cameras 11 and 12 include, for example, an optical lens group (not illustrated), an image pickup device (not illustrated), and an AD converter (not illustrated).
  • the image pickup device is typically an image sensor configured with CCD or MOS elements aligned in a matrix.
  • the AD converter has a function to convert an image signal to digital data.
  • the amount of light detected by each pixel of the image pickup device is converted to electric signals, which in turn become image signals.
  • image signals are converted to image data, which are digital signals, by the AD converter.
  • Camera 11 outputs image data V10 generated in this way.
  • camera 12 outputs image data V20 generated.
  • Image data V10 output from camera 11 and image data V20 output from camera 12 are input to image and audio forming unit 16.
  • Microphone 25 captures sound around the shooting field taken by cameras 11 and 12, and converts this sound to electric signals. Converted electric signals are input to image and audio forming unit 16 as audio data A3 converted to digital data.
  • Image and audio forming unit 16 receives two pieces of image data V10 and V20 output from cameras 11 and 12, respectively, and converts them to image data in a predetermined format. For example, to synthesize image data V30 for displaying a 3-D image, data V30 that allows stereovision is formed from two sets of image data V10 and V20. Image and audio forming unit 16 also receives audio data A3 output from microphone 25. Image and audio forming unit 16 first stores image data sets V10 and V20 and audio data A3 in memory 81. Memory 81 is a memory for storing image data sets V10 and V20 and audio data A3 in units of frame. Image and audio forming unit 16 stores image data sets V10 and V20 in memory 81 and extracts stored image data sets V10 and V20 as required.
  • Image and audio forming unit 16 synthesizes image data V30 by using image data V10 output from camera 11 that corresponds to the right eye as the first field and image data V20 output from camera 12 that corresponds to the left eye as the second field.
  • An image of image data V30 is displayed using, for example, glasses with a shutter (not illustrated) that switches in synchrony with the timing of the right image data and left image data. This makes a 3-D image visible.
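As an illustration of how the two camera outputs can be combined into a field-interleaved 3-D frame, the following is a minimal sketch. It assumes image data V10 and V20 arrive as equally sized NumPy arrays; the function name and pixel layout are illustrative, since the patent does not specify a format.

```python
import numpy as np

def interleave_fields(right_v10: np.ndarray, left_v20: np.ndarray) -> np.ndarray:
    """Form a field-interleaved 3-D frame (V30): the right image on even
    scan lines (first field), the left image on odd scan lines (second
    field). A hypothetical sketch; the patent does not fix a pixel format."""
    assert right_v10.shape == left_v20.shape
    v30 = np.empty_like(right_v10)
    v30[0::2] = right_v10[0::2]  # first field from camera 11 (right eye)
    v30[1::2] = left_v20[1::2]   # second field from camera 12 (left eye)
    return v30
```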
  • Image and audio forming unit 16 supplies image data V30 to protection processing unit 18.
  • Image data V30 output from image and audio forming unit 16 is not necessarily a 3-D image. It may be normal image data V10 output from camera 11, or normal image data V20 output from camera 12, or both.
  • Communications unit 14 is a communications means that communicates with IC tag 34 attached to subjects 30 and 31 so as to obtain tag information stored in IC tag 34.
  • Communications unit 14, for example, transmits high-frequency electromagnetic waves, such as radio waves 39, and receives radio waves 39, such as a response signal to the transmitted radio waves 39.
  • IC tag 34 receives radio waves 39 transmitted from communications unit 14.
  • IC tag 34 responds to radio waves 39 received, and transmits radio waves 39 including the tag information stored in IC tag 34.
  • Communications unit 14 receives radio waves 39 including the tag information transmitted from IC tag 34, and extracts the tag information from the radio signals.
  • IC tag 34 attached to subject 30 includes information on whether or not subject 30 to which IC tag 34 is attached requires copyright, portrait right or other privacy protections; and subject information such as the size, shape, direction and position of subject 30.
  • requirement information refers to information on the need of copyright protection and/or privacy protection.
  • the subject information includes object information on subject 30 and protection target information on a protection target.
  • the object information includes object size information, object shape information, and object direction information related to subject 30.
  • the protection target information includes protection target size information, protection target shape information, protection target direction information, and protection target positional information.
  • IC tag 34 attached to subject 31 includes the requirement information on copyright or privacy protection of subject 31 to which IC tag 34 is attached as the tag information, and also includes the subject information on a display state of image display unit 36, such as screen size information and screen direction information on subject 31.
  • the subject information includes the object information on subject 31 and the protection target information on the protection target.
  • the protection target information includes screen size information and screen direction information on subject 31.
  • IC tag 34 is also called an electronic tag, wireless tag, or RFID (radio frequency identification).
  • This tag is a small memory device with a communications function, and an electronic circuit (not illustrated) is built inside.
  • the electronic circuit includes a memory (not illustrated) and wireless communications unit (not illustrated).
  • the IC tag is attached to, for example, an article (not illustrated), and an IC tag detection unit (not illustrated) reads the tag information stored in the IC tag. Reading the tag information stored in the IC tag reveals the identification code, owner, or identity of the article to which the IC tag is attached.
  • communications unit 14 communicates with IC tag 34.
  • the target that communications unit 14 communicates to obtain the subject information is not limited to IC tag 34.
  • as long as a small memory device includes a memory and a wireless communications unit, and can communicate with communications unit 14, any communications target of communications unit 14 is acceptable.
  • for example, a radio communications device in which a memory is provided, such as a Bluetooth (registered trademark of Bluetooth SIG, Inc.) device, or an optical communications device is also applicable.
  • Subject 31 is image display unit 36 such as a television set.
  • Image display unit 36 shown in Fig. 1 displays image 35, which typically requires copyright protection, as subject 31, together with a digital watermark for protecting image 35.
  • Table 1 shows an example of the tag information stored in IC tag 34.
  • the tag information includes a header area and data area, and stores information on subjects 30 and 31 to which IC tag 34 is attached. If subjects 30 and 31 are captured by camera device 10, information on subjects 30 and 31 is recognized as the subject information within a shooting field.
  • Tag-ID is information on an identifier for identifying IC tag 34, and its information length is 8 bytes.
  • Length is information on a length of the tag information to be transmitted from IC tag 34, and its information length is 8 bytes.
  • Content ID is information for identifying subjects 30 and 31 to which IC tag 34 is attached, and its information length is 64 bytes.
  • Type is content auxiliary information of subjects 30 and 31 or information for identifying EPG information (also called EPG info), and its information length is 4 bytes.
  • Encryption is information on encryption that shows whether or not information recorded in the data area is encrypted, and its information length is 4 bytes.
  • CRC (Cyclic Redundancy Check) is information for checking alteration of the tag information, and its information length is 4 bytes.
  • Information recorded in the data area is information to be encrypted.
  • Content ID is information for identifying subjects 30 and 31 to which IC tag 34 is attached, and its information length is 64 bytes.
  • Type is content auxiliary information or information for identifying EPG information, and its information length is 4 bytes.
  • Content Attribute is information on content attribute of subjects 30 and 31, and its information length is variable.
  • CRC is information for checking alteration of the tag information, and its information length is 4 bytes.
  • Content Attribute includes “Content information” (hereinafter referred to as “Content Info”), “Environmental information” (hereinafter referred to as Env Info), and “EPG Info.”
  • Content Info is information on contents of individual subjects 30 and 31, and its information length is variable.
  • Env Info is information on environment around subjects 30 and 31, and its information length is variable.
  • EPG Info is EPG information on image 35 contained in subject 31, and its information length is variable.
  • Content Info includes “Protect,” “Target Content Size,” “Content Direction,” “Protect Content Size,” “Protect Content Direction,” “Acceleration” (hereinafter referred to as “Accel”), and “Reserved.”
  • “Protect” is information on whether or not subjects 30 and 31 require copyright or privacy protection, and its information length is 4 bytes.
  • “Protect” also includes information on whether a protection target is a partial area or entire area, in addition to information on whether or not subjects 30 and 31 are protection targets. Accordingly, “Protect” is indicated, for example, by “No protection” (None), “Partial protection” (Partial), and “Entire protection” (All).
  • “Target Content Size” is the object size information that indicates the size of subjects 30 and 31 and the object shape information that indicates the shape of subjects 30 and 31, and its information size is 32 bytes. The object shape information may also be recorded as area information.
  • “Content Direction” is the object direction information that indicates the direction of subjects 30 and 31, and its information length is 16 bytes.
  • “Protect Content Size” is the protection target size information that indicates the size of the protection target contained in subjects 30 and 31 and the protection target shape information that indicates the shape of the protection target, and its information length is 32 bytes. The protection target shape information may also be recorded as the area information.
  • “Protect Content Direction” is the protection target direction information that indicates the direction of protection target contained in subjects 30 and 31, and its information length is 16 bytes.
  • “Accel” is information on the movement of protection target contained in subjects 30 and 31, and its information length is 32 bytes. "Accel” is, for example, the protection target acceleration information or the protection target angular speed information. It is the protection target positional information that indicates the position of protection target. "Reserved” is a reserved area secured as a spare area for recording content information further required related to subjects 30 and 31 or the protection target contained in subjects 30 and 31, and its information length is variable.
  • Env Info includes “Temp” and “Reserved.”
  • Temp is environmental temperature information on the temperature around subjects 30 and 31 or target temperature information on the temperature of the protection target contained in subjects 30 and 31; and its information length is 4 bytes.
  • Reserved is a reserved area secured as a spare area for recording environmental information that may be further required related to the environment around subjects 30 and 31 or environmental condition of the protection target itself; and its information length is variable.
  • EPG Info includes “Schedule” and “Reserved.”
  • “Schedule” is an information area for recording the program schedule information related to an image contained in subject 31, and its information length is variable. Furthermore, “Schedule” includes the protection requirement information that indicates whether or not copyright or privacy protection is required for an image contained in subject 31.
  • “Reserved” is a reservation area secured as a spare area for recording the EPG information that may be further required related to the EPG information on the image contained in subject 31; and its information length is variable.
  • the pieces of tag information stored in IC tag 34 described above are examples.
  • the tag information is not limited to the above structure. As long as camera device 10 can obtain at least the subject information, including the protection target information needed for specifying a protected area to apply protection, as described below, any structure is applicable. A sketch of one possible layout follows.
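For illustration, the fixed-length header area described above (Table 1) could be parsed as below. This is a minimal sketch: the field order follows the order given above, but the byte order (big-endian here) and exact encoding are assumptions the patent does not state.

```python
import struct
from dataclasses import dataclass

@dataclass
class TagHeader:
    tag_id: bytes      # 8 bytes: identifier of IC tag 34
    length: int        # 8 bytes: length of the tag information transmitted
    content_id: bytes  # 64 bytes: identifies subject 30 or 31
    type_: int         # 4 bytes: content auxiliary info vs. EPG info
    encryption: int    # 4 bytes: whether the data area is encrypted
    crc: int           # 4 bytes: alteration check

def parse_header(raw: bytes) -> TagHeader:
    """Parse the 92-byte header area (8+8+64+4+4+4). Endianness and the
    packing of each field are illustrative assumptions."""
    tag_id = raw[0:8]
    (length,) = struct.unpack(">Q", raw[8:16])
    content_id = raw[16:80]
    type_, encryption, crc = struct.unpack(">III", raw[80:92])
    return TagHeader(tag_id, length, content_id, type_, encryption, crc)
```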
  • Communications unit 14 can also communicate with server 38, which has a server communications unit (not illustrated).
  • communications unit 14 transmits high-frequency electromagnetic waves, such as radio waves 39, to server 38, and receives radio waves 39 including a response signal, which is a response to the signal transmitted.
  • the response signal from server 38 is, for example, information on electronic program guide (EPG) for public broadcast that image display unit 36 displays.
  • EPG electronic program guide
  • Server 38 is connected to a network distribution server, such as broadcast or communication contents provider, and can thus obtain the EPG program information from a distribution server via network.
  • Camera control unit 15 controls cameras 11 and 12 and processing unit 13 in response to imaging instruction information from user I/F 24.
  • camera control unit 15 controls the optical lens group of cameras 11 and 12 so as to narrow the shooting field to be captured by cameras 11 and 12.
  • camera control unit 15 controls the optical lens group of cameras 11 and 12 so as to widen the shooting field to be captured by the cameras 11 and 12.
  • Camera control unit 15 notifies detection unit 83 and protected area control unit 72 of the information that camera control unit 15 has used for controlling cameras 11 and 12, such as the magnification rate in case of zoom-in.
  • the control information is a part of imaging condition information that camera device 10 uses for taking an image.
  • processing unit 13 controls the focus of the optical lens group of cameras 11 and 12, respectively, such that cameras 11 and 12 are focused to an imaging target positioned close to the center of the shooting field.
  • the imaging target positioned close to the center of the shooting field of cameras 11 and 12 may be subjects 30 and 31.
  • processing unit 13 may also control the focus of the optical lens group of cameras 11 and 12, respectively, so as to focus on the imaging target in a large area. This large imaging target may include subjects 30 and 31.
  • Processing unit 13 generates distance information between subjects 30 and 31 and camera device 10, and notifies camera control unit 15 of it.
  • Camera control unit 15 in turn notifies detection unit 83 and protected area control unit 72 of this distance information.
  • the distance information is also a part of the imaging condition information that camera device 10 uses for taking an image.
  • Gyro unit 26 has a built-in gyro sensor, such as an angular speed sensor or acceleration sensor, and outputs direction information and positional information of camera device 10 to which gyro unit 26 is installed, based on an output signal of the gyro sensor built in gyro unit 26.
  • the direction information and the positional information output is input to camera control unit 15.
  • the direction information and the positional information on camera device 10 input to camera control unit 15 are notified to detection unit 83 and protected area control unit 72.
  • the direction information on camera device 10 is information that indicates the relative direction of camera device 10.
  • the relative direction of camera device 10 may be, for example, a degree of directional change from the start of imaging, or the relative directional relationship between subjects 30 and 31 and camera device 10.
  • the positional information on camera device 10 is also information that indicates the relative position of camera device 10.
  • the relative positional information of camera device 10 is, for example, a degree of positional change from the start of imaging or the relative positional relationship between subjects 30 and 31 and camera device 10.
  • the direction information and the positional information on camera device 10 are both part of the imaging condition information that camera device 10 uses for taking an image.
  • Detection unit 83 functions as an identified area detection means for dynamically detecting an identified area corresponding to specific subjects 30 and 31 included in image data obtained through cameras 11 and 12.
  • detection unit 83 detects identified areas corresponding to specific subjects 30 and 31, which are protection targets, based on the protection target information, candidate information, digital watermark area information, the distance information, the imaging condition information, and so on notified to detection unit 83.
  • Detection unit 83 detects the identified area based on basic information in the tag information, including the protection target size information, the protection target shape information and the protection target direction information on specific subjects 30 and 31; and the operation information that dynamically changes or the distance information following the movement.
  • Detection unit 83 notifies information on a detected identified area to protected area control unit 72 as identified area information. Detection unit 83 receives the protection target information from protected area control unit 72, the candidate information from shape searching unit 85, the digital watermark area information from extraction unit 84, the distance information from distance estimation unit 82, and the imaging condition information from camera control unit 15, respectively. When detection unit 83 dynamically detects an identified area corresponding to subjects 30 and 31, the identified area does not have to be detected based on all of the protection target information, the candidate information, the digital watermark area information, the distance information, and the imaging condition information.
  • the identified area can be dynamically detected based on the protection target information and the candidate information. If a moving target in the shooting field is only subject 30, the identified area corresponding to subject 30 can be dynamically detected based only on the distance information or the operation information. To dynamically detect the identified area corresponding to subject 31, the identified area can be dynamically detected based only on the digital watermark area information.
  • the multiple uses of the protection target information, the candidate information, the digital watermark area information, the distance information and the operation information further improve the detection accuracy on dynamically detecting the identified area.
  • Shape searching unit 85 searches for a partial image area corresponding to the protection target information notified from detection unit 83, in at least image data V10 or image data V20 stored in memory 81. In other words, if a shape such as a facial shape is designated, shape searching unit 85 searches for an area where the facial shape exists in image data V10 or image data V20 stored in memory 81. If, as a result of this search, shape searching unit 85 determines that the facial shape exists, shape searching unit 85 generates positional information on where the facial shape exists and size information of the face. Shape searching unit 85 notifies detection unit 83 of the generated positional information and size information as the candidate information.
  • shape searching unit 85 searches image data sets V10 and V20 for the partial image area corresponding to the protection target shape information notified from detection unit 83. Shape searching unit 85 then notifies detection unit 83 and distance estimation unit 82 of the size information and the positional information of the partial image that is found, as the candidate information.
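As a concrete stand-in for the facial-shape search described above, the sketch below uses OpenCV's stock frontal-face cascade. The patent does not prescribe a detector; any shape-matching method producing positional and size information would serve as the candidate information.

```python
import cv2

# Illustrative realization of shape searching unit 85, using OpenCV's
# bundled frontal-face cascade as the designated "facial shape".
_face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def search_face_candidates(frame_bgr):
    """Return candidate information: one (x, y, w, h) tuple per face
    found, i.e. positional plus size information for detection unit 83
    and distance estimation unit 82."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    return [tuple(r) for r in _face_cascade.detectMultiScale(gray, 1.1, 5)]
```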
  • Distance estimation unit 82 estimates a distance from camera device 10 to subject 30 with reference to image data V10 and image data V20.
  • the position of subject 30 deviates sideways when the two twin-lens images in image data sets V10 and V20, generated by cameras 11 and 12, are overlaid. The closer the distance between camera device 10 and subject 30, the larger the sideways deviation of subject 30 between image data V10 and image data V20. Accordingly, the distance from camera device 10 to subject 30 can be estimated based on the amount of sideways deviation of subject 30.
  • Distance estimation unit 82 notifies detection unit 83 of a distance to subject 30 estimated based on this principle as the distance information.
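The estimation principle above is ordinary twin-lens triangulation. A minimal sketch, assuming the focal length in pixels and the baseline between cameras 11 and 12 are known from calibration (parameter names are illustrative, since the patent states only the principle):

```python
def estimate_distance(disparity_px: float,
                      focal_length_px: float,
                      baseline_m: float) -> float:
    """Twin-lens triangulation: the larger the sideways deviation
    (disparity) of subject 30 between image data V10 and V20, the
    closer subject 30 is to camera device 10."""
    if disparity_px <= 0:
        raise ValueError("subject must appear in both images")
    return focal_length_px * baseline_m / disparity_px

# e.g. a 40 px disparity with an 800 px focal length and a 6 cm
# baseline puts subject 30 at roughly 1.2 m from camera device 10
print(estimate_distance(40, 800, 0.06))  # 1.2
```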
  • Extraction unit 84 refers to at least image data V10 or image data V20 stored in memory 81, and determines whether or not a digital watermark exists in the image data referred to. If extraction unit 84 determines that a digital watermark exists in an image of image data sets V10 and V20, extraction unit 84 generates information on the area, such as the display portion of image display unit 36, where the watermarked image is displayed. Extraction unit 84 then notifies detection unit 83 of this area information, at least including the size information, the shape information, and the positional information, as the digital watermark area information.
  • if IC tag 34 storing the tag information including information on a display state of image display unit 36 is attached, as in the case of subject 31, the information on the display state is notified from detection unit 83 to extraction unit 84 as the protection target information.
  • Extraction unit 84 uses this protection target information notified from detection unit 83 for adjusting the size information and the shape information on an area requiring copyright or privacy protection or an area to replace an image, and these pieces of information are notified to detection unit 83 as the digital watermark area information.
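The watermarking scheme itself is not specified in the patent. Purely as an illustration of the presence test extraction unit 84 performs, the sketch below checks a candidate display area for a known bit pattern hidden in least significant bits; real digital watermarks are considerably more robust than plain LSB embedding.

```python
import numpy as np

def watermark_present(region: np.ndarray, expected_bits: np.ndarray) -> bool:
    """Illustrative stand-in for extraction unit 84: report whether the
    least significant bits of a candidate area carry a known watermark
    bit pattern. The 0.9 match threshold is an arbitrary illustration."""
    lsb = region.flatten()[: expected_bits.size] & 1
    return float(np.mean(lsb == expected_bits)) > 0.9
```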
  • protection control unit 17 specifies a protected area in the identified area information detected by detection unit 83, based on information such as the tag information obtained by communications unit 14. Protection control unit 17 operates in response to an instruction signal based on user instruction notified via user I/F 24.
  • the instruction signal notified via user I/F 24 is, for example, processing output area designation information, image data accuracy level designation information, or camera operator schedule information.
  • the camera operator schedule information is, for example, output processing timing designation information.
  • protection control unit 17 includes protection determination unit 71, protected area control unit 72, protection accuracy control unit 73, and protection time control unit 74.
  • Protected area control unit 72 includes protected area prediction control unit 75.
  • the instruction information related to imaging that is set by the user, who is an operator of camera device 10, via user I/F 24 is notified to protection control unit 17.
  • protection control unit 17 notifies camera control unit 15 of the instruction information.
  • the tag information is notified from IC tag 34 to communications unit 14
  • the tag information is notified to protection control unit 17.
  • protection determination unit 71 first analyzes the tag information notified.
  • if protection determination unit 71 determines that copyright or privacy protection is "required” in the requirement information on copyright/privacy protection, which is generated as one of the analysis results of the tag information, this determination result is notified to protected area control unit 72, protection accuracy control unit 73, and protection time control unit 74.
  • upon receiving the determination result that copyright or privacy protection is "required”, protected area control unit 72 notifies detection unit 83 of the protection target information on a protection target extracted from the tag information.
  • Detection unit 83 analyzes a target to be protected in the shooting field, based on the protection target information received from protected area control unit 72. If a target to be protected is detected, detection unit 83 notifies it to protected area control unit 72 as an identified area.
  • protection accuracy control unit 73 notifies protected area control unit 72 of the information on the accuracy of the protected area so as to control the accuracy in specifying the protected area.
  • Protection time control unit 74 also notifies information on the time to apply protection to protected area control unit 72 so as to control the time to specify the protected area.
  • protected area control unit 72 specifies a protected area in the identified area detected by detection unit 83.
  • Protected area control unit 72 is a protected area control means for controlling an area that protection processing unit 18 applies protection.
  • Protected area control unit 72 specifies the protected area in the identified area based on the protection target information included in the tag information notified from communications unit 14, and the imaging condition information including the operation information, the distance information and the positional information notified from camera control unit 15.
  • protected area control unit 72 specifies the protected area in the identified area based on processing output area designation designated by the user via user I/F 24.
  • the user selects, for example, "Entire shooting field,” "Projective area of protection target,” or "Area around the protection target inclusive.” If "Area around the protection target inclusive" is selected, a margin area of an accuracy-level designation value (e.g. 10%) set by the user is added to a boundary area of the protection target, and thus a broader area is specified as the protected area.
  • protected area control unit 72 specifies the protected area in the identified area based on the protection accuracy information on the protected area notified from protection accuracy control unit 73.
  • the protection accuracy information on protected area is, for example, projective area information or margin calculation information calculated based on a value on accuracy level designated by the user.
  • if "Projective area of protection target" is selected, protection accuracy control unit 73 notifies the projective area information to protected area control unit 72, and protected area control unit 72 specifies the protected area in the identified area based on this notified projective area information. If "Area around the protection target inclusive” is selected, protection accuracy control unit 73 notifies the margin calculation information to protected area control unit 72, and protected area control unit 72 considers this notified margin calculation information for specifying the protected area in the identified area.
  • Protected area control unit 72 also specifies the protected area in identified area based on the protection time information on the protection time notified from protection time control unit 74. Protected area control unit 72 notifies the protected area information on specified protected area to protection processing unit 18.
  • Protection accuracy control unit 73 is a protection accuracy control means provided for controlling the accuracy of protection processing when protection is applied to the protected area. Protection accuracy control unit 73 calculates an area requiring protection in subjects 30 and 31 as a projective area relative to the shooting field, depending on the direction information on subjects 30 and 31 to be protected. The projective area information can be notified to protected area control unit 72 as the protection accuracy information on protected area. In addition, protection accuracy control unit 73 calculates a margin from a boundary area of the protection target to a boundary area of protected area, depending on the accuracy level designation information previously set by the user, and notifies a margin calculation result as protection accuracy information to protected area control unit 72.
  • a margin area around subjects 30 and 31 to be protected may be set in accordance with a state of shooting field, i.e., the use environment of camera device 10.
  • the margin area is adjusted to a predetermined area or a predetermined shape for specifying the protected area. More specifically, in a use environment where subjects 30 and 31 including the protection target appear against an easily identifiable background, the margin area can be set small. Conversely, in a use environment where subjects 30 and 31 including the protection target are not easily identifiable, the margin area may be set large. For example, assume that the protection target in subject 30 is a face and multiple faces exist in the shooting field. If multiple faces that are not protection targets and are larger than the face of subject 30 exist, the protection target in subjects 30 and 31 is reliably protected by setting a larger margin area.
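A minimal sketch of the margin calculation discussed above, assuming a rectangular protection target and the user's accuracy-level designation value (e.g. 10%) expressed as a fraction; clipping to the frame bounds and non-rectangular areas are omitted:

```python
def expand_protected_area(x: float, y: float, w: float, h: float,
                          accuracy_level: float = 0.10):
    """Add a margin of the designated accuracy level around the boundary
    of the protection target, yielding the broader protected area used
    when "Area around the protection target inclusive" is selected."""
    mx, my = w * accuracy_level, h * accuracy_level
    return (x - mx, y - my, w + 2 * mx, h + 2 * my)
```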
  • Protection time control unit 74 is a protection time control means provided for controlling the time to apply protection when protection is applied to the protected area. Protection time control unit 74 notifies protected area control unit 72 of the protection time information on the time to apply protection to the protected area. This protection time information is needed for specifying the protected area by protected area control unit 72 based on the camera operator schedule information including the processing output timing designation information set by the user via user I/F 24 or the EPG information notified from communications unit 14.
  • the protection time information includes information on a timing to turn on and off protection of subjects 30 and 31, which are protection targets, in accordance with user setting.
  • the protection time information also includes information on the time to switch between a dynamic mask control and a static mask control relative to protection processing of subjects 30 and 31, which are protection targets, in accordance with the user setting.
  • the protection time information includes information on the time corresponding to the broadcast time of a specific program in which content to be protected is broadcast, in accordance with the EPG information obtained by communications unit 14.
  • Protection time control unit 74 controls protected area control unit 72 such that protected area control unit 72 specifies the protected area in the identified area based on these pieces of the protection time information.
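As an illustration of the EPG-driven protection time control, the sketch below checks whether the current time falls within the broadcast window of a protected program. The schedule representation is hypothetical, distilled from the "Schedule" field of EPG Info:

```python
from datetime import datetime

def protection_active(now: datetime, epg_schedule) -> bool:
    """Return True when `now` falls inside the broadcast time of a
    program whose EPG entry marks its contents as protected.
    `epg_schedule` is an assumed list of (start, end, protected) tuples."""
    return any(start <= now < end and protected
               for start, end, protected in epg_schedule)
```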
  • Protected area prediction control unit 75 is a protected area prediction control means provided for predicting the movement of protected area in the shooting field when the protected area is specified by protected area control unit 72, and preparing a mask area in advance.
  • Protected area prediction control unit 75 predicts the movement of subject 30 based on, for example, dynamically-changing specific information notified from detection unit 83 so as to prepare a mask area in advance.
  • Protected area prediction control unit 75 also receives captured image data sets V10 and V20 from detection unit 83, and predicts the movement of protected area based on changes in frames before and after image data sets V10 and V20, so as to prepare the mask area in advance.
  • protected area prediction control unit 75 can also prepare the mask area in advance by predicting the movement of protected area based on the positional information or the direction information on camera device 10 notified from camera control unit 15.
  • the positional information and the direction information of camera device 10 are pieces of information output from gyro unit 26.
  • Protected area control unit 72 can also specify the protected area based on the mask area information on the protected area prepared by protected area prediction control unit 75.
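One simple way protected area prediction control unit 75 could prepare a mask area in advance is linear extrapolation of the protected area's motion between consecutive frames. The constant-velocity assumption below is illustrative; the patent leaves the prediction model open:

```python
def predict_mask_area(prev_box, curr_box):
    """Predict the protected area one frame ahead from two consecutive
    observations. Boxes are (x, y, w, h); extrapolating all four values
    also tracks growth or shrinkage of the area."""
    return tuple(2 * c - p for c, p in zip(curr_box, prev_box))
```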
  • Protection processing unit 18 applies protection to the protected area specified by protection control unit 17. Protection processing unit 18 replaces the protected area with mask pattern, computer-generated mosaic, or other image; or erases the protected area so as to obscure or erase the protected area (hereinafter this type of processing is called "masking"). Protection processing unit 18 applies masking to image data corresponding to the protected area in image data sets V10, V20, and V30 so as to generate image data V31 after masking. Protection processing unit 18 supplies this image data V31 to processing unit 19 and output I/F 21. The protected area may also be protected by masking that dynamically replaces the protected area with another image.
  • protection processing unit 18 protects the protected area specified by protected area control unit 72, relative to image data sets V10, V20, and V30 supplied from image and audio forming unit 16, typically using a computer-generated mosaic image. Protection processing unit 18 outputs an image after processing the protected area as image data V31. Protection control unit 17 and protection processing unit 18 function as an image data processing unit that specifies the protected area in the identified area detected by detection unit 83, based on the subject information included in the tag information obtained by communications unit 14, and applies protection to the protected area.
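A minimal sketch of the masking itself, assuming a NumPy image frame and a rectangular protected area with integer coordinates; each tile is collapsed to its mean colour to produce a computer-generated mosaic. Mask patterns or image replacement would follow the same structure.

```python
import numpy as np

def apply_mosaic(frame: np.ndarray, box, block: int = 16) -> np.ndarray:
    """Obscure the protected area (x, y, w, h) in place by averaging
    each block x block tile, as protection processing unit 18 might."""
    x, y, w, h = box
    region = frame[y:y + h, x:x + w]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            tile = region[by:by + block, bx:bx + block]
            tile[:] = tile.mean(axis=(0, 1))  # collapse tile to mean colour
    return frame
```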
  • Processing unit 19 functions as a recording unit for recording an image stream, generated by applying compression coding to image data V31, to recording medium 22 as image information via R/W control unit 20.
  • Processing unit 19 reads out the image stream recorded in recording medium 22 via R/W control unit 20, and applies extension decoding to the image stream read out. Image data restored by extension decoding is supplied to output I/F 21.
  • R/W control unit 20 controls writing to and reading from recording medium 22.
  • Processing unit 19 receives audio data A3 output from microphone 25, typically via image and audio forming unit 16. Audio data A3 is recorded to recording medium 22 and output to output I/F 21 as audio information in the same way as image data V31.
  • Processing unit 19 can also record and reproduce image auxiliary information related to the image information and the audio information to and from recording medium 22.
  • the image auxiliary information is, for example, the subject information including the protection target information related to image and sound.
  • the image auxiliary information is also the imaging condition information corresponding to an image taken. These pieces of the subject information and the imaging condition information are handled as the image auxiliary information separate from the image information and the audio information, and are recorded to and reproduced from recording medium 22.
  • the image auxiliary information recorded to recording medium 22 is notified to protection control unit 17 via R/W control unit 20 and processing unit 19 when related image information is reproduced.
  • Recording medium 22 is configured with an SD card, Blu-ray Disc (hereinafter referred to as BD), DVD, HDD, USB memory, or built-in flash memory; and is placed in a recording medium insertion unit (not illustrated) provided on camera device 10.
  • Output I/F 21 is an interface for outputting image data V31 and audio data A3 to presentation unit 23.
  • Presentation unit 23 includes monitor 28 for displaying images, such as liquid crystal display, organic EL display, head-mount display, and plasma display; and audio output unit (not illustrated), such as an earphone and speaker for outputting sound.
  • Presentation unit 23 displays an image captured by cameras 11 and 12, reproduces sound obtained by microphone 25, or reproduces an image and sound recorded in recording medium 22. Presentation unit 23 can also reproduce and display image data V31 in which protection is applied to the protected area.
  • User I/F 24 includes an input unit such as a menu (not illustrated) displayed on a touch panel attached to a liquid crystal display unit. Via user I/F 24, the user controls imaging by camera device 10 or on and off of audio recording. Also via user I/F 24, cameras 11 and 12 are switched, a shooting field such as zoom-in and zoom-out is changed, and autofocus function is selected. The user can also control processing unit 19 via user I/F 24. User I/F 24 also functions as a user designation unit for designating the protected area by the user. Other than designation of the protected area, the processing output area designation information, accuracy level designation information of image data, the camera operator schedule information, and so on are also set via user I/F 24. Each item set via user I/F 24 is notified to protection control unit 17.
  • camera device 10 of the present invention at least includes the imaging unit, the communications unit, the identified area detection unit and the image data processing unit.
  • the imaging unit captures an area in shooting field 40 so as to obtain image data.
  • the communications unit communicates with IC tag 34 storing the subject information including the size, shape, direction, and position of a specific subject, and obtains the subject information.
  • the identified area detection unit dynamically detects identified area 44 corresponding to specific subjects 30 and 31 included in image data sets V10 and V20 obtained by the imaging unit.
  • the image data processing unit specifies protected area 46 from identified area 44 detected by the identified area detection unit and applies protection to protected area 46, based on the subject information obtained by the communications unit, when subjects 30 and 31 are targets of copyright or privacy protection. Accordingly, the present invention offers a camera device that can follow subjects 30 and 31 to be protected and apply protection to an image including them so as to obscure their image area, even if subjects 30 and 31 move or change.
  • The following describes an imaging method of generating image data V31 in which only the image area corresponding to subjects 30 and 31 to be protected is protected, when subjects 30 and 31 captured by camera device 10 as configured above require copyright, portrait right or other privacy protection.
  • Fig. 3 is a flow chart illustrating a procedure for detecting identified area 44 in subjects 30 and 31 to be protected in a shooting field imaged by cameras 11 and 12, and generating image data V31 by processing detected identified area 44.
  • Figs. 4A to 4I illustrate examples of applying protection to subject 30, to which IC tag 34 is attached, that requires copyright, portrait right or privacy protection.
  • Figs. 4A to 4G show examples of protecting a specific person's face, who is subject 30 to which IC tag 34 is attached.
  • copyright protection or other protection is set to "Required" in the protection requirement information, and also the size, shape, direction and positional information on the specific person's face is stored as the tag information including information on a protection target.
  • Fig. 4A shows an image when no protection is applied within shooting field 40 of camera device 10.
  • Fig. 4B shows an image when identified area 44, which is a candidate protection target, is detected by detection unit 83 in shooting field 40 equivalent to Fig. 4A.
  • Fig. 4C shows an example of an image when protection is applied in shooting field 40 equivalent to Fig. 4A.
  • Fig. 4D is an image captured in shooting field 40 equivalent to Fig. 4C, some time after the shooting time of Fig. 4C.
  • Fig. 4E is an image captured in shooting field 40 equivalent to Fig. 4D, a further period after the shooting time of Fig. 4D.
  • Fig. 4F is an image captured in shooting field 40 equivalent to Fig. 4E, a further period after the shooting time of Fig. 4E.
  • Fig. 4G is an image when protected area 46 is specified based on a protection accuracy different from that in Fig. 4D, in shooting field 40 equivalent to Fig. 4D.
  • Fig. 4H is an image captured in shooting field 40 equivalent to Fig. 4G, some time after the shooting time of Fig. 4G.
  • Fig. 4I is an image when protected area 46 is specified based on a protection accuracy different from that in Fig. 4D, in shooting field 40 equivalent to Fig. 4D.
  • subject 30 requiring privacy protection and subject 42 not requiring privacy protection are captured in the same shooting field 40.
  • Fig. 5 is an example of an image after synthesizing twin-lens image data sets V10 and V20 stored in memory 81.
  • Fig. 6A shows an image when subject 31 captured in shooting field 40 is image display unit 36 on which image 35 requiring copyright protection is displayed.
  • Fig. 6B shows an image when protection is applied to image 35 to be protected in shooting field 40 equivalent to Fig. 6A.
  • Fig. 6C is an example of an image in which protection is applied to image 35 to be protected by means of a dynamic mask control.
  • Fig. 7A shows an image when subjects 31 captured in shooting field 40 are image display unit 36 on which image 35 requiring copyright protection is displayed and art work 37 requiring copyright protection.
  • Fig. 7B shows camera device 10 capturing shooting field 40 equivalent to Fig. 7A.
  • Fig. 7C is an image taken some time after the shooting time of Fig. 7A.
  • Fig. 7D is an image taken some time after the shooting time of Fig. 7A.
  • Fig. 8A shows an image when subjects 31 captured in shooting field 40 are image display unit 36 on which image 35 requiring copyright protection is displayed and art work 37 requiring copyright protection.
  • Fig. 8B is an example of an image when protection is applied to shooting field 40 equivalent to Fig. 8A.
  • Fig. 8C is an example of an image when shooting field 40 is zoomed in, relative to Fig. 8B.
  • Fig. 8D is an example of an image when shooting field 40 is zoomed out, relative to Fig. 8B.
  • When the start of imaging using camera device 10 is input from user I/F 24, camera device 10 starts the imaging operation (Step S100). In other words, an imaging step is executed.
  • Next, communications unit 14 waits to receive the tag information, including the subject information, from subjects 30 and 31 (Step S101).
  • protection control unit 17 controls communications unit 14 to transmit radio waves 39 for starting communication between communications unit 14 and IC tag 34. For example, as shown in Fig. 1, if IC tag 34 exists within the reach of radio waves 39, IC tag 34 responds to radio waves 39 transmitted from communications unit 14 for starting communications, and sends the tag information to communications unit 14.
  • Step S101 is repeated until communications unit 14 receives data (Step S102).
  • Protection determination unit 71 receives and analyzes the tag information (Step S104). Protection determination unit 71 first determines the type information in the tag information, and the determination result is notified to protection time control unit 74 (Step S106). If the type information does not conform to the EPG information (No), the operation proceeds to a first protection time control step (Step S108). If the type information conforms to the EPG information (Yes), the operation proceeds to a second protection time control step (Step S110).
  • In the first protection time control step, protection time control unit 74 sets a protection period based on the protection time information set by the user via user I/F 24. In other words, based on the protection time information set by the user via user I/F 24, protection time control unit 74 notifies protected area control unit 72 of the timing to apply protection to protected area 46. Then, the operation proceeds to a protection determination step (Step S112). In the second protection time control step, protection time control unit 74 sets the protection period based on the protection time information on the airtime of a specific program included in the EPG information that communications unit 14 obtained.
  • In other words, the timing to apply protection to protected area 46, based on the protection time information on the airtime of a specific program included in the EPG information obtained by communications unit 14, is notified from protection time control unit 74 to protected area control unit 72. Then, the operation proceeds to the protection determination step (Step S112). A sketch of both time control branches follows below.
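  • The following is a minimal sketch of protection time control unit 74, assuming each protection period is simply a (start, end) pair; the first branch takes the user-set protection time information, and the second takes the airtime of the specific program from the EPG information. The data layout is an assumption.

    from datetime import datetime

    def protection_period(type_conforms_to_epg, user_period, epg_airtime):
        # First protection time control step: the user-set period.
        # Second protection time control step: the program's airtime.
        return epg_airtime if type_conforms_to_epg else user_period

    def protection_active(period, now=None):
        # True while protection should be applied to protected area 46.
        start, end = period
        now = now or datetime.now()
        return start <= now < end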
  • protection determination unit 71 analyzes the tag information obtained, and determines the requirement information on copyright protection and/or privacy protection included in the tag information.
  • If protection determination unit 71 determines that protection is "required" (Yes) based on the requirement information on copyright or privacy protection, the operation proceeds to Step S114. If protection determination unit 71 determines that protection is "not required" (No), the operation returns to Step S101, and Step S101 is repeated until communications unit 14 receives data again.
  • If protection is determined to be "required" in the protection determination step, protected area control unit 72 generates the protection target information in a step of obtaining information on the protection target (Step S114). In other words, protected area control unit 72 obtains the protection target information, including the protection target size information, the protection target shape information, the protection target direction information and the protection target positional information on the protection target to which IC tag 34 is attached, i.e., the protection target in subjects 30 and 31. The protection target information obtained is notified to detection unit 83. At this point, protected area control unit 72 may also obtain the object information, including the object size information, the object shape information and the object direction information on an imaging target to which IC tag 34 is attached, i.e., subjects 30 and 31. This object information obtained may also be notified to detection unit 83.
  • Next, detection unit 83 dynamically detects identified area 44 corresponding to specific subjects 30 and 31 included in the image data (Step S116). In this way, processing related to the tag information sent from IC tag 34 is executed.
  • Cameras 11 and 12 take an image in the shooting field, including subjects 30 and 31, and image data sets V10, V20, and V30 corresponding to the image for one frame, for example, are stored in memory 81.
  • image data V10 for one frame of an image in the shooting field, including subject 30, is stored in memory 81.
  • detection unit 83 notifies shape searching unit 85 of the protection target shape information indicating the shape of protection target, included in the protection target information.
  • Shape searching unit 85 searches image data sets V10, V20, and V30 stored in memory 81 for areas where a partial image matching the shape indicated by the protection target shape information exists in the image corresponding to the image data. Each such area is notified to detection unit 83 as the candidate information. For example, if the shape of a face is designated in the protection target information, shape searching unit 85 searches for areas where a facial image exists in the image.
  • Shape searching unit 85 notifies detection unit 83 of information on the size, shape, direction and position of a detected partial image that matches the target shape as the candidate information.
  • If shape searching unit 85 detects multiple partial images that match the target shape in its search, they are all notified to detection unit 83 as candidates for the protection target.
  • Information on the size, shape, direction, and position of each of these partial images (hereinafter referred to as “candidate partial images”) is notified to detection unit 83. A sketch of this search step follows below.
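  • A minimal sketch of the shape search follows. The patent does not specify the matching algorithm; OpenCV's bundled Haar cascade face detector is used here purely as a stand-in for "search for partial images matching the protection target shape" when the designated shape is a face.

    import cv2

    def search_candidate_areas(frame_bgr):
        # Returns (x, y, w, h) boxes of candidate partial images, i.e. the
        # candidate information shape searching unit 85 would notify to
        # detection unit 83. Non-target faces (such as subject 42) are
        # included here and filtered out later by size and distance.
        cascade = cv2.CascadeClassifier(
            cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
        gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
        return [tuple(box) for box in cascade.detectMultiScale(gray, 1.1, 5)]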
  • distance estimation unit 82 may also calculate a distance from camera device 10 to each candidate partial image based on the candidate information notified from shape searching unit 85.
  • distance estimation unit 82 can calculate the positional deviation to the left and right of each candidate partial image between the two images, with reference to twin-lens image data V30 generated from image data sets V10 and V20 stored in memory 81, and estimate the distance from camera device 10 to subject 30 corresponding to each candidate partial image. For example, if three candidate partial images of a face are detected, the actual distances from camera device 10 to subjects 30 and 42, which are faces, are estimated respectively.
  • Distance estimation unit 82 notifies detection unit 83 of this calculated distance to subjects 30 and 42, corresponding to each candidate partial image, as the distance information.
  • Fig. 5 is an example of an image synthesized from the twin-lens image data of image data sets V10 and V20 stored in memory 81. As shown in Fig. 5, the closer a face is to camera device 10, the larger the deviation between image data V10 and image data V20 in the synthesized image.
  • Distance estimation unit 82 utilizes this deviation width between image data sets V10 and V20 to estimate the distance to each captured face, and thereby obtains the distance information, as sketched below.
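  • A minimal sketch of this estimation follows, assuming the usual pinhole stereo model; the focal length and baseline values are purely illustrative, since the embodiment only states that the left/right deviation between V10 and V20 shrinks with distance.

    def estimate_distance(disparity_px, focal_px=1200.0, baseline_m=0.06):
        # Z = f * B / d: a larger deviation (disparity) means a closer face.
        if disparity_px <= 0:
            return float("inf")  # no measurable deviation: effectively far away
        return focal_px * baseline_m / disparity_px

    # e.g. a face deviating 48 px between V10 and V20 is estimated at 1.5 m:
    # print(estimate_distance(48))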
  • The distance information can also be obtained by directly measuring the distance between camera device 10 and subject 30 with processing unit 13 and camera control unit 15. When the distance information is obtained by processing unit 13 and camera control unit 15, it is notified to detection unit 83 and protected area control unit 72. Whether the distance information is obtained by distance estimation unit 82 or by processing unit 13 depends on the setting the user inputs via user I/F 24.
  • Detection unit 83 dynamically detects identified area 44 using a part or all of the candidate information notified from shape searching unit 85, the distance information corresponding to each candidate partial image notified from distance estimation unit 82, and the imaging condition information notified from camera control unit 15. Detection unit 83 notifies the identified area information on identified area 44, as shown in Fig. 4B , to protected area control unit 72.
  • Next, the presence of a protected area 46 specified by the user is confirmed (Step S118). If there is a user specification (Yes), user I/F 24 notifies protection control unit 17 of information on user-specified protected area 46 (Step S120). The information notified to protection control unit 17 is, for example, the processing output area designation information and the accuracy level designation information on image data. If there is no user specification (No), a predetermined designation condition is applied to protection control unit 17.
  • protection accuracy control unit 73 notifies protected area control unit 72 of the protection accuracy information on protected area 46 for specifying protected area 46 relative to the area information on the size, direction and position of the protection target (Step S122).
  • Next, protected area control unit 72 specifies protected area 46 from identified areas 44 notified from detection unit 83 (Step S124). In other words, protected area control unit 72 calculates the actual shape, size, direction and position of subjects 30 and 42 (hereinafter referred to as "candidate subjects") in identified areas 44 notified from detection unit 83, as shown in Fig. 4B. Protected area control unit 72 applies a corrective calculation to the size of subjects 30 and 42 in identified areas 44, using the distance in the distance information and the zoom rate in the operation information, so as to calculate the actual size of the candidate subject included in each identified area 44.
  • Then, protected area control unit 72 compares the protection target size information in the protection target information with the calculated actual size of each candidate subject, and picks out the candidate subject whose actual size conforms to or is closest to the protection target size information. In other words, among the candidate subjects in identified areas 44, protected area control unit 72 picks out identified area 44 of the candidate subject whose actual size conforms to or is closest to the size of the protection target in subject 30 holding IC tag 34. In the same way, protected area control unit 72 may also specify protected area 46 based on the protection target shape information, the protection target direction information and the protection target positional information in the protection target information. Protected area 46 can be specified more accurately by using as many pieces of information as possible. Protected area control unit 72 notifies protection processing unit 18 of the information on specified protected area 46, i.e., the size, shape, position and direction of protected area 46 in the image, as the protected area information. A sketch of this selection follows below.
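  • A minimal sketch of this selection follows. The corrective formula is a pinhole-style assumption; the embodiment only says the on-image size is corrected by the zoom rate and the distance before being compared with the protection target size information from IC tag 34.

    def specify_protected_area(candidates, target_size_m, zoom_rate):
        # candidates: dicts with the on-image size (pixels) of each identified
        # area 44 and the distance (m) from distance estimation unit 82.
        def actual_size(c):
            # Apparent size scales inversely with distance, linearly with zoom.
            return c["size_px"] * c["distance_m"] / (zoom_rate * 1000.0)
        # Pick the candidate whose corrected size is closest to the tag's value.
        return min(candidates, key=lambda c: abs(actual_size(c) - target_size_m))

    faces = [
        {"size_px": 180, "distance_m": 1.5},  # subject 30 (holding IC tag 34)
        {"size_px": 180, "distance_m": 3.0},  # subject 42 (no tag)
    ]
    # print(specify_protected_area(faces, target_size_m=0.27, zoom_rate=1.0))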
  • In addition, protected area prediction control unit 75 may also execute a protected area prediction control step. When protected area control unit 72 specifies protected area 46, the protected area prediction control step predicts the movement of protected area 46 in the shooting field so as to prepare a masking area in advance. In the protected area prediction control step, protected area control unit 72 may also specify the protected area based on the prepared masking area information.
  • Next, protection processing unit 18 applies protection to protected area 46, corresponding to its size, shape and position on the image, in image data sets V10, V20, and V30 supplied from image and audio forming unit 16, based on the protected area information notified from protected area control unit 72 (Step S126).
  • Protection processing is, for example, masking for obscuring an image using a computer-generated mosaic.
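  • A minimal sketch of such mosaic masking (pixelation) follows; the block size of 16 pixels is an arbitrary illustrative choice.

    import numpy as np

    def apply_mosaic(frame, rect, block=16):
        # frame: HxWx3 uint8 image; rect: (x, y, w, h) of protected area 46.
        # Each block x block tile is averaged, so the area is obscured while
        # the rest of shooting field 40 is left untouched.
        x, y, w, h = rect
        roi = frame[y:y + h, x:x + w]
        for by in range(0, h, block):
            for bx in range(0, w, block):
                tile = roi[by:by + block, bx:bx + block]
                tile[...] = tile.reshape(-1, 3).mean(axis=0).astype(np.uint8)
        return frame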
  • Image data V31, in which protected area 46 is protected, is generated, and sent to processing unit 19 or output I/F 21.
  • Through Step S108 to Step S126, protection is applied to the area requiring copyright, portrait right or privacy protection in subjects 30 and 31, and this area is obscured as protected area 46 in shooting field 40 of the image including subjects 30 and 31 to which IC tag 34 is attached.
  • The protected area control step and the protection processing step constitute an image data processing step.
  • image data V31 in which an image protected by the image data processing unit is superimposed on image data sets V10, V20, and V30 captured by cameras 11 and 12, is sent to output I/F 21.
  • Audio data A3 collected by microphone 25 is also sent to output I/F 21.
  • Image data V31 and audio data A3 sent to output I/F 21 are reproduced on presentation unit 23, such as monitor 28 (Step S128).
  • image data V31 and audio data A3 reproduced in the reproduction step are sent to processing unit 19 in the same way.
  • Data sent to processing unit 19 is controlled by R/W control unit 20 and is recorded on recording medium 22 (Step S129).
  • image data V31 is recorded on recording medium 22 as the image information.
  • Audio data A3 collected by microphone 25 is recorded on recording medium 22 as the audio information.
  • the image information and the audio information may also be recorded together as image and audio information.
  • the subject information including the protection target information on subjects 30 and 31, which are protection targets, captured in shooting field 40 is also sent from protected area control unit 72 to processing unit 19.
  • the subject information sent to processing unit 19 is controlled by R/W control unit 20, and recorded on recording medium 22 as image auxiliary information separate from the image information and the audio information.
  • the imaging condition information including the operation information and the distance information, used for imaging an image by camera device 10 is also recorded on recording medium 22 as the image auxiliary information.
  • protection control unit 17 determines whether the instruction information on completion of imaging is notified from user I/F 24. If the instruction information on completion of imaging is notified (Yes), the imaging operation completes (Step S130). If the instruction information on completion of imaging operation is not notified (No), the operation proceeds to Step S102 again, and the above steps are repeatedly executed.
  • If face-pattern processing typically used in face recognition technology is applied when multiple faces exist in shooting field 40, as shown in Fig. 4A, all faces in shooting field 40 are picked out. This makes it difficult to pick out only the specific face to be protected, i.e., only subject 30 shown in Fig. 4A; the face of subject 42, which does not require protection, is also picked out.
  • In contrast, camera device 10 in this exemplary embodiment first receives the protection target information on the specific face to be protected in subject 30, which is included in the tag information sent from IC tag 34. Then, as shown in Fig. 4B, detection unit 83 dynamically detects areas where images similar to the protection target information exist in the image data within shooting field 40. These detected images are notified to protected area control unit 72 as identified areas 44. For example, as shown in Fig. 4A, three faces exist in shooting field 40. Therefore, detection unit 83 detects three facial images in shooting field 40, as shown in Fig. 4B, and notifies them to protected area control unit 72 as identified areas 44. Then, protected area control unit 72 calculates the size information, the shape information, the positional information and the direction information of the candidate partial image in each of the three identified areas 44 notified, and specifies protected area 46, taking into account the protection target information and the imaging condition information.
  • The size, shape, direction and position of each identified area 44 to be protected are calculated for each frame so as to identify protected area 46.
  • In addition, the size of protected area 46 is corrected based on the zoom rate of cameras 11 and 12 and the distance information. Accordingly, as shown in Fig. 4D, the movement of subject 30 is followed to change protected area 46, even if subject 30 to be protected comes close to camera device 10 and the face area to be protected becomes larger as time passes from that shown in Fig. 4C. The protection target is thus reliably protected.
  • If the protection target turns away and no longer appears in shooting field 40, protected area control unit 72 specifies that there is no protected area 46 in shooting field 40. In this way, if the direction of the protection target turns around and the protection target no longer exists in shooting field 40, protected area control unit 72 accurately specifies protected area 46 by taking into account, in particular, the direction information on the protection target included in the subject information. As a result, there is no area to be protected within shooting field 40, and no unnecessary protection is applied.
  • protected area control unit 72 detects an overlaid portion of multiple subjects 30 and 42 and specifies protected area 46 based on the protection target positional information included in the subject information. Accordingly, only an area that requires protection in shooting field 40 is specified as protected area 46. As a result, dynamic protection is applied only to an area that requires protection in shooting field 40.
  • protection may be applied only to a facial portion of subject 30 to be protected in an area with the size and the shape almost equivalent to a boundary area of protection target.
  • protection accuracy control unit 73 controls protected area control unit 72 so as to specify a projection area relative to the imaged area that requires protection in subject 30, based on the direction information on subject 30 to be protected. Accordingly, as shown in Fig. 4G, protection is applied only to the facial portion of the protection target.
  • protected area control unit 72 identifies protected area 46 in shooting field 40 corresponding to the direction of subject 30. If the target to be protected changes its direction, and therefore an area of protection target changes in shooting field 40, protected area control unit 72 accurately identifies protected area 46 by taking into account the protection target direction information, in particular, in the subject information. As a result, as shown in Fig. 4H , protection is applied only to a profile image of the facial portion, which is the protection target.
  • protection may also be applied to entire shooting field 40 including subject 30 to be protected as protected area 46. More specifically, if the user sets the accuracy level of protected area 46 to "Entire shooting field,” protection accuracy control unit 73 controls protected area control unit 72 to specify entire shooting field 40 including subject 30 to be protected as protected area 46. Accordingly, as shown in Fig. 4I , protection is applied to entire shooting field 40 including the facial portion of subject 30 to be protected.
  • As shown in Fig. 6A, if subject 31 captured in shooting field 40 is image display unit 36 displaying image 35 to which copyright protection is to be applied, image 35 to be protected is specified as protected area 46 and protection is applied, as shown in Fig. 6B.
  • Figs. 6A to 6C illustrate cases in which the type information does not conform to the EPG information in the type determination described in Step S106 in Fig. 3.
  • the user sets time to apply protection to subject 31 to be protected via user I/F 24.
  • Protection time control unit 74 controls protected area control unit 72 based on the protection time information on the protection application time set by the user via user I/F 24. Accordingly, after the protection application time set by the user passes, protection of image 35 is turned off, as shown in Fig. 6A.
  • The dynamic mask control and the static mask control can also be switched depending on time. Fig. 6B shows protection based on the static mask control, such as a computer-generated mosaic. In Fig. 6C, protection time control unit 74 controls protected area control unit 72 such that protected area control unit 72 applies protection to protected area 46 in subject 31 to be protected by switching between the dynamic mask control and the static mask control depending on time, as sketched below.
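  • A minimal sketch of the distinction follows: a static mask uses the same obscuring pattern every frame, while a dynamic mask is regenerated per frame. The per-frame reseeding is an illustrative interpretation, not a scheme defined by the patent.

    import numpy as np

    def mask_tile(shape, mode, frame_index, seed=0):
        # Static mask control: one fixed pattern. Dynamic mask control: the
        # pattern is reseeded every frame, so it varies with time.
        rng_seed = seed if mode == "static" else seed + frame_index
        rng = np.random.default_rng(rng_seed)
        return rng.integers(0, 256, size=shape, dtype=np.uint8)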
  • extraction unit 84 can also identify image display unit 36, or an area such as its display portion, on which a digital watermark is embedded in the image. In this case, the position, size and shape of the area on the image determined by extraction unit 84 are notified to detection unit 83 as the watermark area information. Based on this watermark area information, detection unit 83 dynamically detects identified area 44 and notifies it to protected area control unit 72. Protected area control unit 72 then specifies protected area 46 from the notified identified area 44, and notifies this protected area 46 to protection processing unit 18. This enables protection processing unit 18 to apply protection to the area where a digital watermark is detected on the image, as shown in Fig. 6B. A sketch of one possible watermark search follows below.
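  • A minimal sketch of one possible watermark search follows, assuming a simple spread-spectrum scheme in which each image block is correlated with a known pseudo-random pattern shared with the embedder; the patent does not specify the watermarking method.

    import numpy as np

    def find_watermarked_blocks(gray, pattern, block=64, threshold=0.2):
        # gray: HxW float image; pattern: block x block +/-1 array.
        # Returns (x, y) of blocks whose normalized correlation with the
        # pattern exceeds the threshold, i.e. the watermark area information.
        hits = []
        for y in range(0, gray.shape[0] - block + 1, block):
            for x in range(0, gray.shape[1] - block + 1, block):
                tile = gray[y:y + block, x:x + block]
                tile = tile - tile.mean()
                denom = np.linalg.norm(tile) * np.linalg.norm(pattern)
                if denom and (tile * pattern).sum() / denom > threshold:
                    hits.append((x, y))
        return hits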
  • camera device 10 can apply protection to an image to be protected and obscure the image even if an image that requires copyright, portrait right or privacy protection is displayed on image display unit 36, such as a television set, and the user takes the image without being conscious of the image that requires protection.
  • Since protection is applied only to image display unit 36, on which image 35 to be protected is displayed, the area that the user intended to photograph is not unnecessarily obscured.
  • As shown in Fig. 7A, if subjects 31 captured in shooting field 40 are image display unit 36, on which image 35 requiring copyright protection is displayed, and art work 37 requiring copyright protection, both image 35 to be protected and art work 37 to be protected are identified as protected areas 46, and protection is applied to these areas, as shown in Fig. 7B.
  • camera device 10 has monitor 28, which is presentation unit 23, and thus image data V31 after applying protection is visible on monitor 28.
  • Fig. 7C illustrates an image after protection is applied when camera device 10 is operated in line with a movement of the user capturing stationary subject 31.
  • the movement of camera device 10 is detected by gyro unit 26, and the position information and the direction information of camera device 10 are notified from gyro unit 26 to detection unit 83 and protected area control unit 72.
  • Protected area control unit 72 changes protected area 46 based on a movement of camera device 10.
  • Protected area prediction control unit 75 may also predict the shape, size, direction and position of protected area 46 as it relatively moves, based on the position information and the direction information on camera device 10 notified from gyro unit 26 to protected area control unit 72, so as to prepare a masking area.
  • This prepared masking area is specified as protected area 46 and may also be notified to protection processing unit 18. This enables prediction of the movement of the protection target in shooting field 40. Accordingly, protected area 46 is dynamically specified, making accurate protection of protected area 46 feasible. A sketch of one simple predictor follows below.
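  • A minimal sketch of one simple predictor follows: a constant-velocity extrapolation of protected area 46 so that the masking area is ready before the next frame. The linear model is an assumption; the patent only states that the movement is predicted, for example from neighboring frames or from gyro unit 26.

    def predict_next_rect(prev_rect, cur_rect):
        # rects are (x, y, w, h); extrapolates one frame ahead assuming the
        # protected area keeps moving and resizing at its current rate.
        return tuple(2 * c - p for p, c in zip(prev_rect, cur_rect))

    # A face that moved 12 px to the right is masked 12 px further right:
    # predict_next_rect((100, 80, 60, 60), (112, 80, 60, 60)) -> (124, 80, 60, 60)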
  • Fig. 7D illustrates an image after protection is applied when camera device 10 moves in line with the movement of the user imaging stationary subject 31, as in Fig. 7C.
  • Fig. 7D shows an image in shooting field when the user moves to the back of image display unit 36. Since image display unit 36 is present between art work 37 and camera device 10, a part of art work 37 to be protected overlaps with image display unit 36. Image 35 to be protected displayed on image display unit 36 does not exist in shooting field 40 because the user has moved to the back of image display unit 36. When image 35 to be protected no longer exists in shooting field 40, as is in this example, protected area control unit 72 specifies protected area 46 such that no protected area 46 corresponding to image 35 exists within shooting field 40.
  • In this case, protected area control unit 72 accurately specifies protected area 46 by taking into account, in particular, the direction information on the protection target included in the subject information. As a result, no area to apply protection exists in shooting field 40, and thus no unnecessary protection is applied. Furthermore, if parts of multiple subjects 31 in shooting field 40 overlap, protected area control unit 72 detects the overlapped portion of the multiple subjects based on, in particular, the protection target positional information included in the subject information. This enables a dynamic change of protected area 46 for protecting art work 37, so that only an area requiring protection in shooting field 40 is dynamically protected. A sketch of the overlap detection follows below.
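  • A minimal sketch of the overlap detection follows, assuming each subject's positional information is reduced to an axis-aligned rectangle (x, y, w, h); the intersection gives the portion where image display unit 36 hides art work 37.

    def overlap(a, b):
        # Returns the intersection rectangle of a and b, or None if disjoint.
        x, y = max(a[0], b[0]), max(a[1], b[1])
        x2 = min(a[0] + a[2], b[0] + b[2])
        y2 = min(a[1] + a[3], b[1] + b[3])
        if x2 <= x or y2 <= y:
            return None
        return (x, y, x2 - x, y2 - y)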
  • As shown in Fig. 8A, if subjects 31 captured in shooting field 40 are image display unit 36, on which image 35 requiring copyright protection is displayed, and art work 37 requiring copyright protection, image 35 to be protected and art work 37 to be protected are identified as protected areas 46, and protection is applied, as shown in Fig. 8B. If the user zooms in with camera device 10, the size of the protection targets in shooting field 40 changes, as shown in Fig. 8C. If the magnification rate of camera device 10 changes by a zoom-in or zoom-out operation, camera control unit 15 notifies protected area control unit 72 of information on the change in magnification rate as the operation information. Protected area control unit 72 then changes protected area 46 based on the notified operation information, and notifies protection processing unit 18 of the changed protected area information, as sketched below.
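  • A minimal sketch of this rescaling follows. Scaling protected area 46 about the image center is an assumption consistent with an ordinary optical zoom; the ratio is the new magnification rate divided by the old one, taken from the operation information.

    def rescale_rect(rect, ratio, frame_w, frame_h):
        # rect: (x, y, w, h) of protected area 46; ratio: new_zoom / old_zoom.
        cx, cy = frame_w / 2.0, frame_h / 2.0
        x, y, w, h = rect
        return (cx + (x - cx) * ratio,
                cy + (y - cy) * ratio,
                w * ratio, h * ratio)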
  • In this way, protected area 46 is changed following the movement of the protection target in shooting field 40, as shown in Fig. 8C, when camera device 10 executes the zoom-in operation.
  • Likewise, when camera device 10 executes the zoom-out operation, protected area 46 is changed following the movement of the protection target in shooting field 40, as shown in Fig. 8D.
  • Next, consider the case where image display unit 36 captured in shooting field 40 reproduces and displays a content to be protected, and image display unit 36 holds the EPG information. The EPG information on the contents to be displayed on image display unit 36 is stored in IC tag 34 attached to image display unit 36. In this case, the type information included in the protection target information stored in IC tag 34 attached to image display unit 36 captured in shooting field 40 conforms to the EPG information.
  • protection time control unit 74 obtains the EPG information via communications unit 14, and controls protected area control unit 72 such that it identifies protected area 46 only during the time assumed to be a broadcast time of a specific program, based on the EPG information.
  • protection processing unit 18 can apply protection only to a specific program to be protected. As a result, protection is applied to a protection target only when required.
  • As described above, the imaging method applied to camera device 10 of the present invention is a method of imaging subjects 30 and 31 that contain a protection target requiring copyright or privacy protection, and of applying protection to them.
  • This imaging method of the present invention at least includes the imaging step, the communications step, the identified area detection step and the image data processing step.
  • the imaging step is for imaging an area in shooting field 40 so as to obtain image data.
  • the communications step is for communicating with IC tag 34 storing the subject information including the size, shape, direction and position of a specific subject, so as to obtain the subject information.
  • the identified area detection step is for dynamically detecting identified area 44 corresponding to specific subjects 30 and 31 included in image data sets V10 and V20 obtained in the imaging step.
  • the image data processing step is for specifying protected area 46 from identified area 44 detected in the identified area detection step, based on the subject information obtained in the communications step, and for applying protection to protected area 46 if subjects 30 and 31 require copyright or privacy protection.
  • According to this imaging method, protected area 46 is dynamically specified for applying copyright, portrait right or privacy protection to subjects 30 and 31 in an image of shooting field 40 including subjects 30 and 31 to which IC tag 34 is attached.
  • Since the position and size of protected area 46 are calculated for each image frame, identified area 44 that becomes a protection target in a moving image is followed and protected.
  • In addition, the size of a target to be protected in a moving image is accurately corrected based on the operation information of camera device 10 and the distance information between camera device 10 and subjects 30 and 31. Accordingly, even if subjects 30 and 31 to be protected move, camera device 10 is moved, or the size of the subject in the image changes due to a change in magnification rate, protection is applied to the subject in accordance with the change in its size.
  • This exemplary embodiment refers to a structure of recording image data V31 including protected area 46, to which protection is applied, in recording medium 22.
  • a structure to display image data sets V10, V20, and V30 that are not protected on presentation unit 23 is also applicable.
  • The image data recorded on recording medium 22 and the image data read out from recording medium 22 and displayed on presentation unit 23 may each be switched as required between image data V31, which includes protected area 46 after protection is applied, and image data sets V10, V20, and V30, to which protection is not applied.
  • this exemplary embodiment refers to a structure of detecting identified areas 44 corresponding to subjects 30 and 31 to apply copyright or privacy protection, and specifying protected area 46 from identified areas 44 so as to apply protection.
  • a structure including an edit function to replace protected areas 46 corresponding to subjects 30 and 31 to be deleted with other images, respectively, is also applicable.
  • this exemplary embodiment refers to a structure of specifying protected area 46 so as to apply protection at the time of imaging subjects 30 and 31 including a protection target by using cameras 11 and 12 in camera device 10.
  • the present invention is not limited to the operation of dynamically detecting identified area 44, specifying protected area 46 from identified areas 44, and applying protection to specified protected area 46 at the same time as capturing an image.
  • When imaging subjects 30 and 31 including a protection target, for example, captured image data sets V10, V20, and V30 and audio data A3 are recorded as image and audio information on recording medium 22 via processing unit 19.
  • Fig. 9 is a flow chart illustrating a protected reproduction method for applying protection to an image to be protected at the time of reproducing this type of image data. In the flow chart showing the protected reproduction method in Fig. 9, steps with the same names as those in the flow chart for the imaging method shown in Fig. 3 execute equivalent processing.
  • In the imaging method shown in Fig. 3, identified area 44 is detected and protected area 46 is specified so as to apply protection to image data sets V10, V20, and V30 captured by cameras 11 and 12.
  • In the protected reproduction method shown in Fig. 9, identified area 44 is detected and protected area 46 is specified so as to apply protection to the image data recorded on recording medium 22.
  • When the user inputs the start of protected reproduction to camera device 10 via user I/F 24, an image designated by the user is selected, and camera device 10 starts reproduction (Step S200).
  • a reproduction step starts.
  • processing unit 19 obtains the image and audio information and the image auxiliary information attached to the image and audio information from recording medium 22 (Step S202).
  • the image and audio information includes the image data and the audio data. If subjects 30 and 31 including a target to be protected are captured in an image to be reproduced, the subject information on a target to be protected is recorded in recording medium 22 as the image auxiliary information belonging to the image and audio information.
  • processing unit 19 determines whether or not the image auxiliary information attached to the image and audio information obtained is recorded in recording medium 22 (Step S204). If no image auxiliary information is obtained (No), the operation returns to Step S202. If the image auxiliary information is obtained (Yes), the operation proceeds to Step S206.
  • the image auxiliary information obtained by processing unit 19 is notified to and analyzed in protection determination unit 71.
  • protection determination unit 71 analyzes the image auxiliary information obtained, and determines the presence of information on copyright or privacy protection in the image auxiliary information (Step S206). If protection determination unit 71 determines that protection is required (Yes) based on the presence of information on copyright or privacy protection, the operation proceeds to Step S114. If protection determination unit 71 determines that protection is not required (No), the operation proceeds to Step S202. In case of proceeding to Step S202, processing unit 19 obtains the image and audio information and the image auxiliary information again.
  • In the protection determination step, if protection determination unit 71 determines that protection is "required" as a result of analyzing the subject information included in the image auxiliary information, each step on and after the protection target information acquisition step is executed, in the same way as shown in Fig. 3.
  • In the protected area control step, protected area control unit 72 identifies the protected area based on the subject information and the imaging condition information included in the image auxiliary information.
  • protection control unit 17 determines whether or not the instruction information for completing protective reproduction is notified from user I/F 24. If the instruction information for completing reproduction is notified (Yes), reproduction ends (Step S230). If no instruction information for completing reproduction is notified (No), the operation proceeds to Step S202 again, and the above steps are repeated.
  • As described above, the protected reproduction method applied to camera device 10 of the present invention is a method of reproducing an imaging target after applying protection to subjects 30 and 31 to which copyright or privacy protection is to be applied.
  • the protected reproduction method at least includes the reproduction step, the image auxiliary information acquisition step, the identified area detection step and the image data processing step.
  • the reproduction step is to reproduce image data of an area captured within shooting field 40.
  • the image auxiliary information acquisition step is to obtain the subject information including the size, shape, direction and position of a specific subject. This subject information is attached to image data.
  • the identified area detection step is to dynamically detect identified area 44 corresponding to specific subjects 30 and 31 included in the image data obtained in the reproduction step.
  • the image data processing step is to specify protected area 46 from identified area 44 detected in the identified area detection step, based on the subject information obtained in the image auxiliary information acquisition step, and to apply protection to protected area 46 if subjects 30 and 31 require copyright or privacy protection.
  • According to this protected reproduction method, protected area 46 to which copyright or privacy protection is applied in an image of shooting field 40 including subjects 30 and 31 to be protected is dynamically specified and protected when the image is reproduced.
  • Since the position and size of the protected area are calculated for each image frame so as to apply protection, identified area 44 to be protected in a moving image is followed and protected.
  • In addition, the size of the protected area is accurately corrected by using the operation information of camera device 10 and the distance information on the distance between camera device 10 and subjects 30 and 31. Accordingly, even if the size of the subject in the image changes due to movement of subjects 30 and 31 to be protected or of camera device 10, or due to a change in zoom rate, protection is applied to the subject in accordance with any change in its size.
  • the present invention is described in accordance with the above exemplary embodiment. However, the present invention is not limited to the above exemplary embodiment.
  • the present invention further includes the following structures.
  • a part of components configuring camera device 10 shown in Fig. 1 may be included in one piece of a system LSI (Large Scale Integration).
  • The system LSI is typically an ultra-multifunctional LSI manufactured by integrating multiple components on one chip. If some of the components of camera device 10 are included in the system LSI, high-speed processing becomes feasible by configuring at least image and audio forming unit 16, protection control unit 17, protection processing unit 18 and detection unit 83 with the system LSI.
  • processing unit 13, communications unit 14, camera control unit 15 and memory 81 may also be integrated in the system LSI. Still more, processing unit 19, R/W control unit 20 and output I/F 21 may also be integrated.
  • distance estimation unit 82, extraction unit 84 and shape searching unit 85 may also be integrated.
  • Each of the components configuring the circuits may be implemented as a separate chip. Alternatively, some or all of the components may be included in one chip.
  • a one-chip structure enables the same structure as the system LSI.
  • the present invention is not limited to the system LSI.
  • the present invention may also adopt a structure called IC, LSI, super LSI or ultra LSI, depending on a degree of integration.
  • circuit integration is not limited to LSI.
  • a dedicated circuit or general-purpose processing unit is also applicable.
  • An FPGA (Field Programmable Gate Array) is also applicable.
  • The imaging method of the present invention may also be implemented as a computer program executed typically by a computer, or as digital signals representing the computer program.
  • The present invention may have a structure in which the computer program or digital signal is recorded on a computer-readable recording medium, such as a flexible disk, hard disk, CD-ROM, MO, DVD, DVD-ROM, DVD-RAM, BD or semiconductor memory. The digital signals recorded on these recording media are also applicable.
  • The present invention may have a structure in which the computer program or digital signal is transmitted via a telecommunication line, a wireless or wired communication line, a network typified by the Internet, or data broadcasting.
  • the present invention may be a computer system including a microprocessor and memory.
  • the memory stores the computer program, and the microprocessor operates according to the computer program.
  • the present invention may have a structure that another independent computer system executes the operation by transmitting a computer program or digital signal recorded on a recording medium, or transmitting a computer program or digital signal via a network.
  • the present invention may have a structure in which the exemplary embodiment and other variations are combined.
  • the camera device of the present invention follows and identifies a subject requiring copyright, portrait right or other privacy protections, and applies protection to the image area of the subject even if the subject moves or changes. Accordingly, the present invention is applicable to a digital camera that photographs and captures video data or still-image data, and to an imaging method adopted in monitor cameras and other electronic devices.


Claims (18)

  1. A camera device (10) comprising:
    an imaging unit (11) for imaging an area in a shooting field so as to obtain image data;
    a communications unit (14) for receiving, via a network, from an external device (34), object information including the size, shape, direction and position of a specific subject (30, 31), and protection target information including the size of a protection target, the shape of the protection target, the direction of the protection target and the position of the protection target, relating to a protection target in the specific subject, as subject information;
    an identified area detection unit (83) for dynamically detecting an identified area corresponding to the specific subject included in the image data obtained by the imaging unit; and
    an image data processing unit (72) for specifying an area to be protected from viewing within the identified area, based on the protection target information received by the communications unit, and for applying viewing protection only to the protected area when the specific subject includes said protection target, wherein the subject information includes both the object information and the protection target information.
  2. The camera device according to claim 1, further comprising:
    a user designation unit (24) for designating the protected area by a user,
    wherein the image data processing unit specifies a user-specific protected area within the identified area, based on the protection target information included in the subject information, and applies the protection.
  3. The camera device according to one of claims 1 and 2,
    wherein the image data processing unit includes a protection accuracy control unit (73) for controlling the accuracy of the protection applied to the protected area.
  4. The camera device according to claim 3,
    wherein the protection accuracy control unit (73) controls the protection in accordance with the direction of the subject, the protection being applied to a projection area relative to a photographed face in an area of the subject requiring protection.
  5. The camera device according to claim 3,
    wherein the protection accuracy control unit (73) controls the protection by setting a margin area around the subject to be protected to a predetermined range and shape, in accordance with at least one of a user setting and the usage environment.
  6. The camera device according to one of claims 1 and 2,
    wherein the image data processing unit includes a protection time control unit (74) for controlling the time during which the protection is applied to the protected area.
  7. The camera device according to claim 6,
    wherein the protection time control unit (74) applies the protection to the protected area in accordance with a user setting.
  8. The camera device according to claim 6,
    wherein the protection time control unit (74) switches the protection applied to the protected area between a dynamic mask control and a static mask control, depending on time, in accordance with the user setting.
  9. The camera device according to claim 6,
    wherein the subject includes an image display unit reproducing and displaying a content to be protected,
    the image display unit holds electronic program guide (EPG) information,
    the communications unit obtains the EPG information, and
    the protection time control unit limits the time during which the protection is applied to the protected area to a time corresponding to the broadcast time of a specific program, based on the obtained EPG information.
  10. The camera device according to one of claims 1 and 2,
    wherein the image data processing unit includes a protected area control unit (72) for controlling the protected area.
  11. The camera device according to claim 10,
    wherein the protected area control unit (72) changes the protected area in accordance with one of a movement of the subject and an operation of the camera device.
  12. The camera device according to claim 10,
    wherein the protected area control unit (72) changes the protected area in accordance with one of zoom-in and zoom-out operations.
  13. The camera device according to one of claims 1 and 2, further comprising:
    a recording unit (20) for recording the subject information on the subject to which one of copyright protection and privacy protection is applied, as image auxiliary information separate from the image information.
  14. The camera device according to one of claims 1 and 2,
    wherein the image data processing unit detects an overlapping portion of multiple subjects, based on the subject information obtained by the communications unit (14), and dynamically changes the protected area when parts of the multiple subjects overlap one another.
  15. The camera device according to one of claims 1 and 2,
    wherein the image data processing unit dynamically replaces the protected area of the subject with a different image.
  16. The camera device according to one of claims 1 and 2,
    wherein the image data processing unit includes a protected area prediction unit (75) for predicting the protected area and preparing a masking area in advance, the protected area being predicted based on at least a movement of the subject and information on a frame preceding or following a recorded or reproduced image, and the image data processing unit applies the protection to the prepared masking area.
  17. The camera device according to one of claims 1 and 2, further comprising:
    a gyro sensor (26),
    wherein the image data processing unit includes a protected area prediction unit (75) for predicting the protected area and preparing a masking area in advance, the protected area being predicted based on output information of the gyro sensor, and the image data processing unit applying the protection to the prepared masking area.
  18. An imaging method for applying viewing protection to a specific subject to be protected from viewing in an imaging target during an imaging operation, the method comprising the steps of:
    obtaining image data by imaging an area in a shooting field;
    receiving at least object information including the size, shape, direction and position of the specific subject, the object information being provided with the specific subject, as subject information, via a network from an external device;
    dynamically detecting an identified area corresponding to the specific subject included in the obtained image data; and
    specifying an area to be protected from viewing within the identified area, based on protection target information including the size of the protection target, the shape of the protection target, the direction of the protection target and the position of the protection target, relating to a protection target, and applying viewing protection only to the area to be protected from viewing when the specific subject includes said protection target, wherein the subject information includes both the object information and the protection target information.
EP08764175.9A 2007-06-22 2008-06-20 Camera device and imaging method Not-in-force EP2157781B1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2007164627 2007-06-22
PCT/JP2008/001592 WO2009001530A1 (fr) 2008-06-20 Camera device and imaging method

Publications (3)

Publication Number Publication Date
EP2157781A1 EP2157781A1 (fr) 2010-02-24
EP2157781A4 EP2157781A4 (fr) 2011-01-26
EP2157781B1 true EP2157781B1 (fr) 2013-08-07

Family

ID=40185357

Family Applications (1)

Application Number Title Priority Date Filing Date
EP08764175.9A Not-in-force EP2157781B1 (fr) 2008-06-20 Camera device and imaging method

Country Status (4)

Country Link
US (1) US8610787B2 (fr)
EP (1) EP2157781B1 (fr)
JP (1) JP4877391B2 (fr)
WO (1) WO2009001530A1 (fr)

Families Citing this family (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011015244A (ja) * 2009-07-03 2011-01-20 Sanyo Electric Co Ltd ビデオカメラ
US9077950B2 (en) * 2010-04-28 2015-07-07 Thomas William Hickie System, method, and module for a content control layer for an optical imaging device
JP5495930B2 (ja) * 2010-05-10 2014-05-21 キヤノン株式会社 画像処理装置、方法、およびプログラム
JP2011244251A (ja) * 2010-05-19 2011-12-01 Ricoh Co Ltd 撮像装置、撮影制限方法、プログラム及び記録媒体
JP5423893B2 (ja) * 2010-06-28 2014-02-19 株式会社ニコン 撮像装置、画像処理装置、画像処理プログラム記録媒体
JP5750253B2 (ja) 2010-09-28 2015-07-15 任天堂株式会社 画像生成プログラム、撮像装置、撮像システム、及び画像生成方法
JP5791256B2 (ja) * 2010-10-21 2015-10-07 キヤノン株式会社 表示制御装置、表示制御方法
JP5747646B2 (ja) * 2011-05-09 2015-07-15 株式会社ニコン 電子装置、データ生成方法およびデータ生成プログラム
JP5834671B2 (ja) * 2011-09-16 2015-12-24 富士通株式会社 画像処理装置、画像処理方法およびプログラム
JP5620414B2 (ja) * 2012-01-18 2014-11-05 株式会社スクウェア・エニックス ゲーム装置
JP5825121B2 (ja) * 2012-01-31 2015-12-02 株式会社Jvcケンウッド 画像処理装置、画像処理方法、画像処理プログラム
JP5966624B2 (ja) * 2012-05-29 2016-08-10 株式会社リコー 情報処理装置及び情報表示システム
US9578224B2 (en) 2012-09-10 2017-02-21 Nvidia Corporation System and method for enhanced monoimaging
US9208753B2 (en) * 2012-09-17 2015-12-08 Elwha Llc Unauthorized viewer detection system and method
JP6192306B2 (ja) * 2013-02-14 2017-09-06 オリンパス株式会社 撮像装置、管理サーバ、画像送信方法およびプログラム
US20140307116A1 (en) * 2013-04-12 2014-10-16 Nvidia Corporation Method and system for managing video recording and/or picture taking in a restricted environment
KR101484844B1 (ko) 2013-04-24 2015-01-21 정영규 실시간 영상에 프라이버시 마스킹 툴을 제공하는 장치 및 방법
JP6315895B2 (ja) * 2013-05-31 2018-04-25 キヤノン株式会社 撮像装置、画像処理装置、撮像装置の制御方法、画像処理装置の制御方法、プログラム
US9400943B2 (en) * 2013-08-02 2016-07-26 Qualcomm Incorporated Identifying IoT devices/objects/people using out-of-band signaling/metadata in conjunction with optical images
US9799036B2 (en) 2013-10-10 2017-10-24 Elwha Llc Devices, methods, and systems for managing representations of entities through use of privacy indicators
US20150104004A1 (en) 2013-10-10 2015-04-16 Elwha Llc Methods, systems, and devices for delivering image data from captured images to devices
US20150106195A1 (en) 2013-10-10 2015-04-16 Elwha Llc Methods, systems, and devices for handling inserted data into captured images
US10346624B2 (en) 2013-10-10 2019-07-09 Elwha Llc Methods, systems, and devices for obscuring entities depicted in captured images
US10185841B2 (en) 2013-10-10 2019-01-22 Elwha Llc Devices, methods, and systems for managing representations of entities through use of privacy beacons
US10013564B2 (en) 2013-10-10 2018-07-03 Elwha Llc Methods, systems, and devices for handling image capture devices and captured images
US10057764B2 (en) 2014-01-18 2018-08-21 Microsoft Technology Licensing, Llc Privacy preserving sensor apparatus
JP6289200B2 (ja) * 2014-03-26 2018-03-07 キヤノン株式会社 送信装置、その制御方法、及びプログラム
US9679194B2 (en) 2014-07-17 2017-06-13 At&T Intellectual Property I, L.P. Automated obscurity for pervasive imaging
JP6057386B2 (ja) * 2014-08-06 2017-01-11 日本電気株式会社 画像生成システム、及び合成画像出力方法
US9245500B1 (en) * 2014-11-10 2016-01-26 Yumei ZHANG System and method for preventing image capture
EP3224757B1 (fr) * 2014-12-30 2020-07-15 Deutsche Telekom AG Structure de respect de la vie privée intra-dispositif pour lunettes connectées et montres connectées
JP6124223B2 (ja) * 2015-04-16 2017-05-10 パナソニックIpマネジメント株式会社 事件映像ログデータ生成装置、事件映像処理システム、事件映像ログデータ生成方法、及びプログラム
EP3526964B1 (fr) 2016-10-14 2024-02-21 Genetec Inc. Masquage dans un flux vidéo
DE102016119637A1 (de) * 2016-10-14 2018-04-19 Uniqfeed Ag Fernsehübertragungssystem zur Erzeugung angereicherter Bilder
US10657361B2 (en) 2017-01-18 2020-05-19 International Business Machines Corporation System to enforce privacy in images on an ad-hoc basis
JP7027049B2 (ja) * 2017-06-15 2022-03-01 Canon Inc Image processing apparatus, image processing method, and program
EP3454254B1 (fr) 2017-09-07 2023-11-08 Canon Kabushiki Kaisha Image processing apparatus, image providing apparatus, control methods thereof, and program
JP7084795B2 (ja) * 2017-09-07 2022-06-15 Canon Inc Image processing apparatus, image providing apparatus, control methods thereof, and program
JP7047394B2 (ja) * 2018-01-18 2022-04-05 Seiko Epson Corp Head-mounted display device, display system, and control method for the head-mounted display device
EP3514760B1 (fr) * 2018-01-23 2020-06-17 Honda Research Institute Europe GmbH Method and system for privacy-compliant data recording
WO2020027074A1 (fr) * 2018-07-31 2020-02-06 Sony Semiconductor Solutions Corp Solid-state imaging device and electronic apparatus
US11820289B2 (en) 2018-07-31 2023-11-21 Sony Semiconductor Solutions Corporation Solid-state imaging device and electronic device
JP7103040B2 (ja) * 2018-08-02 2022-07-20 Fujitsu Ltd Display control program, display control method, information processing apparatus, and head-mounted unit
EP3640903B1 (fr) * 2018-10-18 2023-12-27 IDEMIA Identity & Security Germany AG Signal-dependent video surveillance
TWI686087B (zh) * 2018-10-26 2020-02-21 中興保全科技股份有限公司 Image capture device with privacy protection function
JP2020156033A (ja) * 2019-03-22 2020-09-24 Nissan Motor Co Ltd Information processing device and information processing method
EP3796654A1 (fr) 2019-09-20 2021-03-24 Axis AB Scrambling of privacy masks
JP6793369B1 (ja) * 2019-11-20 2020-12-02 Panasonic Intellectual Property Management Co Ltd Imaging device
KR20220015019A (ko) * 2020-07-30 2022-02-08 Samsung Electronics Co Ltd Electronic device and method for converting an image using data masking
EP4009226A1 (fr) * 2020-12-04 2022-06-08 Axis AB Tag for indicating a region of interest and method for finding a region of interest in an image
EP4020981A1 (fr) 2020-12-22 2022-06-29 Axis AB Camera and associated method for facilitating its installation
CN114827396A (zh) * 2021-01-29 2022-07-29 Canon Inc Imaging apparatus, imaging method, and storage medium

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0851611A (ja) * 1994-08-05 1996-02-20 Canon Inc Image processing apparatus and image processing method
JPH08307877A (ja) * 1995-05-12 1996-11-22 Nippon Telegraph & Telephone Corp (NTT) Encoding and decoding method for structured moving images
JP3716519B2 (ja) 1996-11-15 2005-11-16 Omron Corp Camera, external device, and image processing apparatus
US6697103B1 (en) * 1998-03-19 2004-02-24 Dennis Sunga Fernandez Integrated network for monitoring remote objects
US6067399A (en) 1998-09-02 2000-05-23 Sony Corporation Privacy mode for acquisition cameras and camcorders
US6859877B2 (en) 2000-06-12 2005-02-22 Canon Kabushiki Kaisha Image processing apparatus and method, and computer readable memory medium storing program for executing image processing
JP2002076905A (ja) * 2000-06-12 2002-03-15 Canon Inc Image encoding apparatus, image encoding method, computer-readable recording medium storing an image encoding program, and image encoding program
US7522745B2 (en) * 2000-08-31 2009-04-21 Grasso Donald P Sensor and imaging system
US6763148B1 (en) * 2000-11-13 2004-07-13 Visual Key, Inc. Image recognition methods
JP2003087632A (ja) 2001-09-13 2003-03-20 Konica Corp Electronic device with imaging function and image processing system
JP4036051B2 (ja) * 2002-07-30 2008-01-23 Omron Corp Face matching device and face matching method
WO2004105383A1 (fr) * 2003-05-20 2004-12-02 Matsushita Electric Industrial Co., Ltd. Imaging system
GB2411229B (en) * 2003-07-22 2006-04-12 Hitachi Int Electric Inc Object tracking method and object tracking apparatus
JP3938127B2 (ja) * 2003-09-29 2007-06-27 Sony Corp Imaging device
US7248166B2 (en) 2003-09-29 2007-07-24 Fujifilm Corporation Imaging device, information storage server, article identification apparatus and imaging system
JP2005130463A (ja) * 2003-09-29 2005-05-19 Fuji Photo Film Co Ltd Imaging apparatus and imaging system
JP2005151124A (ja) 2003-11-14 2005-06-09 Kyodo Printing Co Ltd Digital shoplifting prevention system and method
KR100601933B1 (ko) * 2003-11-18 2006-07-14 Samsung Electronics Co Ltd Person detection method and apparatus, and privacy protection method and system using the same
JP4481663B2 (ja) * 2004-01-15 2010-06-16 Canon Inc Motion recognition apparatus, motion recognition method, device control apparatus, and computer program
JP2005223601A (ja) 2004-02-05 2005-08-18 Seiko Epson Corp Imaging control system, imaging control program, imaging control method, and digital camera
US7697026B2 (en) * 2004-03-16 2010-04-13 3Vr Security, Inc. Pipeline architecture for analyzing multiple video streams
JP2006148386A (ja) * 2004-11-18 2006-06-08 Nippon Telegraph & Telephone Corp (NTT) Video capturing device, video capturing method, program for the capturing method, and recording medium storing the program
JP4512763B2 (ja) 2005-02-02 2010-07-28 Advanced Telecommunications Research Institute International (ATR) Image capturing system
KR101113844B1 (ко) 2005-03-08 2012-02-29 Samsung Techwin Co Ltd Surveillance camera and privacy protection method for the surveillance camera
JP2007096864A (ja) 2005-09-29 2007-04-12 Orion Denki KK Recording and playback apparatus

Also Published As

Publication number Publication date
US20100182447A1 (en) 2010-07-22
US8610787B2 (en) 2013-12-17
JPWO2009001530A1 (ja) 2010-08-26
EP2157781A1 (fr) 2010-02-24
EP2157781A4 (fr) 2011-01-26
JP4877391B2 (ja) 2012-02-15
WO2009001530A1 (fr) 2008-12-31

Similar Documents

Publication Publication Date Title
EP2157781B1 (fr) Camera device and imaging method
JP5056061B2 (ja) Imaging device
JP5867424B2 (ja) Image processing apparatus, image processing method, and program
JP5361528B2 (ja) Imaging apparatus and program
US20060104483A1 (en) Wireless digital image capture device with biometric readers
US8339469B2 (en) Process for automatically determining a probability of image capture with a terminal using contextual data
KR20180092621A (ko) Terminal and control method thereof
WO2004105383A1 (fr) Imaging system
US7978254B2 (en) Image capturing apparatus, its controlling method, and program
JP2006020166A (ja) Map display system and digital camera
EP3381180B1 (fr) Image capturing device and associated control method
US20180295283A1 (en) Mobile terminal and method of controlling the same
CN103888684A (zh) Image processing apparatus and image processing method
JP4440926B2 (ja) Digital camera system
JP3269470B2 (ja) Image capturing apparatus
JP2004179919A (ja) Image mediation system
KR20180119281A (ко) Mobile terminal and control method thereof
JP2010170265A (ja) Image management apparatus, method, program, and system
JP6836306B2 (ja) Imaging control apparatus, control method therefor, program, and recording medium
JP5116494B2 (ja) Imaging device
JP2016036081A (ja) Image processing apparatus, method, program, and recording medium
JP2008167206A (ja) Surveillance camera system and surveillance camera control method
JP2010161573A (ja) Information processing apparatus and method therefor
JP2017224913A (ja) Imaging system and information processing apparatus
JP2010062778A (ja) Image capturing apparatus and image capture control method

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20091218

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

AX Request for extension of the european patent

Extension state: AL BA MK RS

DAX Request for extension of the european patent (deleted)
A4 Supplementary search report drawn up and despatched

Effective date: 20101227

GRAP Despatch of communication of intention to grant a patent

Free format text: ORIGINAL CODE: EPIDOSNIGR1

GRAS Grant fee paid

Free format text: ORIGINAL CODE: EPIDOSNIGR3

GRAA (expected) grant

Free format text: ORIGINAL CODE: 0009210

AK Designated contracting states

Kind code of ref document: B1

Designated state(s): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MT NL NO PL PT RO SE SI SK TR

REG Reference to a national code

Ref country code: GB

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: CH

Ref legal event code: EP

Ref country code: AT

Ref legal event code: REF

Ref document number: 626171

Country of ref document: AT

Kind code of ref document: T

Effective date: 20130815

REG Reference to a national code

Ref country code: IE

Ref legal event code: FG4D

REG Reference to a national code

Ref country code: DE

Ref legal event code: R096

Ref document number: 602008026617

Country of ref document: DE

Effective date: 20131002

REG Reference to a national code

Ref country code: AT

Ref legal event code: MK05

Ref document number: 626171

Country of ref document: AT

Kind code of ref document: T

Effective date: 20130807

REG Reference to a national code

Ref country code: NL

Ref legal event code: VDEP

Effective date: 20130807

REG Reference to a national code

Ref country code: LT

Ref legal event code: MG4D

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20131209

Ref country code: IS

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20131207

Ref country code: SE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: LT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130821

Ref country code: NO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20131107

Ref country code: HR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: AT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: PL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: LV

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: GR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20131108

Ref country code: ES

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: FI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: BE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: NL

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: SI

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: CY

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: SK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: DK

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: RO

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: EE

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: CZ

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: IT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

PLBE No opposition filed within time limit

Free format text: ORIGINAL CODE: 0009261

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT

26N No opposition filed

Effective date: 20140508

REG Reference to a national code

Ref country code: DE

Ref legal event code: R097

Ref document number: 602008026617

Country of ref document: DE

Effective date: 20140508

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MC

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

Ref country code: LU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20140620

REG Reference to a national code

Ref country code: CH

Ref legal event code: PL

GBPC Gb: european patent ceased through non-payment of renewal fee

Effective date: 20140620

REG Reference to a national code

Ref country code: IE

Ref legal event code: MM4A

REG Reference to a national code

Ref country code: FR

Ref legal event code: ST

Effective date: 20150227

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: LI

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20140630

Ref country code: IE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20140620

Ref country code: CH

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20140630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: GB

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20140620

Ref country code: FR

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20140630

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: MT

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: BG

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: HU

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO

Effective date: 20080620

Ref country code: TR

Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT

Effective date: 20130807

PGFP Annual fee paid to national office [announced via postgrant information from national office to epo]

Ref country code: DE

Payment date: 20200609

Year of fee payment: 13

REG Reference to a national code

Ref country code: DE

Ref legal event code: R119

Ref document number: 602008026617

Country of ref document: DE

PG25 Lapsed in a contracting state [announced via postgrant information from national office to epo]

Ref country code: DE

Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES

Effective date: 20220101