CN107038362A - Image processing apparatus and image processing method - Google Patents
- Publication number: CN107038362A
- Application number: CN201610908844.5A
- Authority
- CN
- China
- Prior art keywords
- image
- personage
- privacy class
- facial
- face
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F21/00—Security arrangements for protecting computers, components thereof, programs or data against unauthorised activity
- G06F21/30—Authentication, i.e. establishing the identity or authorisation of security principals
- G06F21/31—User authentication
- G06F21/32—User authentication using biometric data, e.g. fingerprints, iris scans or voiceprints
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/162—Detection; Localisation; Normalisation using pixel segmentation or colour matching
Abstract
The present invention provides an image processing apparatus and an image processing method. The problem addressed by the invention is to execute processing related to an image in accordance with a desired privacy level without the image appearing unnatural. The image processing apparatus (100) includes: a first determination unit (6d) that determines whether a privacy level, which represents the degree of difficulty of identifying the face of a person included in an image as a specific person, satisfies a prescribed condition; and a shooting control part (6e) that controls execution of prescribed processing related to the image when the privacy level is determined to satisfy the prescribed condition.
Description
This application claims priority based on Japanese Patent Application No. 2015-234701, filed on December 1, 2015, and Japanese Patent Application No. 2016-118545, filed on June 15, 2016, and incorporates the entire contents of these basic applications by reference.
Technical field
The present invention relates to image processing apparatus and image processing method.
Background Art
There have conventionally been cases where it is desired to publish or record an image in which a person appears but in which it is difficult for others viewing the image to identify that person. Techniques are therefore known, such as that of Japanese Unexamined Patent Application Publication No. 2000-322660, in which mosaic processing or masking processing is applied to the face of a person so as not to compromise privacy.
However, with the above publication, not only must image processing such as mosaic processing or masking processing be applied to the image to be published or recorded, but the locally applied mosaic or mask also makes the image unnatural, causing a problem of reduced aesthetic quality.
Summary of the Invention
The present invention has been made in view of such problems, and an object of the invention is to execute processing related to an image in accordance with a desired privacy level without the image appearing unnatural.
One aspect of the present invention relates to an image processing apparatus comprising a processor, wherein the processor: determines whether a privacy level, which represents the degree of difficulty of identifying the face of a person included in an image as a specific person, satisfies a prescribed condition; and controls execution of prescribed processing related to the image according to the result of the determination of whether the privacy level satisfies the prescribed condition.
Another aspect of the present invention relates to an image processing apparatus comprising a processor, wherein the processor: calculates a privacy level representing the degree of difficulty of identifying the face of a person included in an image as a specific person; and controls execution of prescribed processing related to the image using the calculated privacy level.
Yet another aspect of the present invention relates to an image processing method using an image processing apparatus, the method comprising: determining whether a privacy level, which represents the degree of difficulty of identifying the face of a person included in an image as a specific person, satisfies a prescribed condition; and controlling execution of prescribed processing related to the image according to the result of the determination of whether the privacy level satisfies the prescribed condition.
Still another aspect of the present invention relates to an image processing method using an image processing apparatus, the method comprising: calculating a privacy level representing the degree of difficulty of identifying the face of a person included in an image as a specific person; and controlling execution of prescribed processing related to the image using the calculated privacy level.
The above and other objects and novel features of the present invention will become more apparent from the following description and the accompanying drawings. It should be noted, however, that the drawings are for illustration only and do not limit the present invention.
Brief Description of the Drawings
The present application can be understood more fully by considering the following detailed description together with the following drawings.
Fig. 1 is a block diagram showing the schematic configuration of an image processing apparatus according to Embodiment 1 of the present invention.
Fig. 2 is a flowchart showing an example of operations related to automatic shooting processing performed by the image processing apparatus of Fig. 1.
Fig. 3 is a flowchart showing an example of operations related to privacy level calculation processing performed by the image processing apparatus of Fig. 1.
Fig. 4 is a flowchart continuing the privacy level calculation processing of Fig. 3.
Fig. 5 is a flowchart showing an example of operations related to manual shooting processing performed by the image processing apparatus of Fig. 1.
Fig. 6 is a diagram for explaining the privacy level calculation processing.
Fig. 7A to Fig. 7C are diagrams for explaining the privacy level calculation processing.
Fig. 8A and Fig. 8B are diagrams for explaining the privacy level calculation processing.
Fig. 9A and Fig. 9B are diagrams for explaining the privacy level calculation processing.
Fig. 10 is a diagram schematically showing an example of a display mode of an image in the image processing apparatus of Fig. 1.
Fig. 11 is a flowchart showing an example of operations related to determination value setting processing performed by the image processing apparatus of Fig. 1.
Fig. 12 is a block diagram showing the schematic configuration of an image processing apparatus according to Embodiment 2 of the present invention.
Fig. 13 is a flowchart showing an example of operations related to image acquisition processing performed by the image processing apparatus of Fig. 12.
Fig. 14 is a flowchart showing an example of operations related to privacy level calculation processing performed by the image processing apparatus of Fig. 12.
Fig. 15 is a flowchart continuing the privacy level calculation processing of Fig. 14.
Embodiments
Hereinafter, specific embodiments of the present invention will be described with reference to the drawings. However, the scope of the invention is not limited to the illustrated examples.
[embodiment 1]
Fig. 1 is a block diagram showing the schematic configuration of an image processing apparatus 100 according to Embodiment 1 of the present invention.
As shown in Fig. 1, the image processing apparatus 100 of Embodiment 1 specifically includes: a central control part 1, a memory 2, an image pickup part 3, a signal processing part 4, a motion detecting section 5, an operation control part 6, an image processing part 7, an image record portion 8, a display part 9, a communication control unit 10, and an operation input part 11.
The central control part 1, the memory 2, the image pickup part 3, the signal processing part 4, the motion detecting section 5, the operation control part 6, the image processing part 7, the image record portion 8, the display part 9, and the communication control unit 10 are connected to one another via a bus 12.
The image processing apparatus 100 can be configured, for example, as a communication terminal used on a mobile communication network, such as a mobile phone with a camera function, a smartphone, or another mobile station, or a PDA (Personal Data Assistant), or as a digital camera or the like provided with a communication function.
The central control part 1 controls each part of the image processing apparatus 100. Specifically, although not illustrated, the central control part 1 includes a CPU (Central Processing Unit) and the like, and performs various control operations in accordance with various processing programs (not illustrated) for the image processing apparatus 100.
The memory 2 is composed of, for example, DRAM (Dynamic Random Access Memory) or the like, and temporarily stores data processed by the central control part 1, the operation control part 6, and so on. For example, the memory 2 temporarily stores a reference face image F (described in detail below) captured in the automatic shooting processing.
The image pickup part 3 (image pickup unit) images a subject (for example, a person) and generates prescribed frame images. Specifically, the image pickup part 3 includes a camera lens part 3a, an electronic image pickup part 3b, and an imaging control part 3c.
The camera lens part 3a is composed of, for example, a plurality of lenses, such as a zoom lens and a focus lens, and a diaphragm that adjusts the amount of light that has passed through the lenses. The camera lens part 3a is arranged to be exposed on the same side as a display panel 9b (the subject side), so that so-called self-shooting can be performed.
The electronic image pickup part 3b is composed of an image sensor (imaging element) such as a CCD (Charge Coupled Device) or CMOS (Complementary Metal-Oxide Semiconductor) sensor, and converts the optical image that has passed through the various lenses of the camera lens part 3a into a two-dimensional image signal.
The imaging control part 3c scans and drives the electronic image pickup part 3b with, for example, a timing generator and a driver, causes the electronic image pickup part 3b to convert the optical image that has passed through the camera lens part 3a into a two-dimensional image signal at prescribed intervals, reads out frame images one screen at a time from the imaging region of the electronic image pickup part 3b, and outputs them to the signal processing part 4.
The signal processing part 4 applies various kinds of image signal processing to the analog-value signals of the frame images transferred from the electronic image pickup part 3b. Specifically, the signal processing part 4, for example, adjusts the gain of the analog-value signal of each frame image for each of the RGB color components, sample-holds the signal with a sample-and-hold circuit (not illustrated), converts it into digital data with an A/D converter (not illustrated), and applies color process processing, including pixel interpolation processing and gamma correction processing, with a color process circuit (not illustrated), thereby generating a digital-value luminance signal Y and color difference signals Cb and Cr (YUV data). The signal processing part 4 outputs the generated luminance signal Y and color difference signals Cb and Cr to the memory 2, which is used as a buffer memory.
The motion detecting section 5 detects motion of the image processing apparatus 100.
Specifically, the motion detecting section 5 includes, for example, a three-axis angular velocity sensor that detects the angular velocity of rotation about each of three mutually orthogonal axes of the image processing apparatus 100 (a roll axis, a pitch axis, and a yaw axis). When a subject is shot, for example, the motion detecting section 5 outputs the signals sequentially detected by the three-axis angular velocity sensor to the operation control part 6 as motion information.
The operation control part 6 includes a first image acquiring unit 6a, a detection processing part 6b, a first calculating part 6c, a first determination unit 6d, and a shooting control part 6e.
Each part of the operation control part 6 is composed of, for example, a prescribed logic circuit, but this configuration is merely an example and is not limiting.
The first image acquiring unit 6a acquires the frame images sequentially captured by the image pickup part 3.
Specifically, the first image acquiring unit 6a sequentially acquires, from the memory 2, the image data of the frame images of the live view image that are sequentially captured by the image pickup part 3 and sequentially generated by the signal processing part 4.
The detection processing part 6b detects, from the frame images acquired by the first image acquiring unit 6a, a face, the constituent parts of the face, and the line of sight.
That is, the detection processing part 6b performs face detection processing on each of the frame images sequentially captured by the image pickup part 3 and sequentially acquired by the first image acquiring unit 6a, to detect a face region containing the face of the person who is the subject, and further detects facial constituent parts, such as the eyes and the mouth, from the face region detected in the face detection processing. The detection processing part 6b also performs line-of-sight detection processing on each of these frame images to detect the line of sight of the person who is the subject.
In addition, the detection processing part 6b detects the whole body of the person who is the subject in each of the frame images sequentially captured by the image pickup part 3 and sequentially acquired by the first image acquiring unit 6a, and determines its shape. From changes in the determined shape of the person, it can be discriminated whether the person has gone outside the frame image, or whether the person remains in the frame image but the orientation of the body has changed.
Since the face detection processing, the constituent part detection processing, the line-of-sight detection processing, and the processing for determining the shape of a person described above are known techniques, detailed description is omitted here. For example, AAM (Active Appearance Models) can be used in the face detection processing and the constituent part detection processing, and in the line-of-sight detection processing, for example, the positions of the eyes can be detected and the line of sight can be detected from the ratio of the areas of the whites of the eye on either side of the iris.
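The line-of-sight estimation just described, detecting the eye positions and comparing the white-of-the-eye areas on either side of the iris, can be sketched roughly as follows. This is an illustrative reconstruction, not the patent's implementation; the function name and the tolerance threshold are assumptions.

```python
def estimate_gaze(left_white_area: float, right_white_area: float,
                  tolerance: float = 0.15) -> str:
    """Classify horizontal gaze from sclera areas on each side of the iris.

    If the iris is centered, the white areas on both sides are roughly
    equal; a larger area on one side means the iris (and thus the gaze)
    has shifted toward the other side.
    """
    total = left_white_area + right_white_area
    if total == 0:
        return "unknown"  # eyes not visible, e.g. closed or occluded
    balance = (right_white_area - left_white_area) / total
    if balance > tolerance:
        return "left"     # more white on the right -> iris shifted left
    if balance < -tolerance:
        return "right"
    return "front"
```

The tolerance absorbs small detection noise so that a nearly centered iris is still classified as looking forward.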
The first calculating part (calculating unit) 6c calculates a privacy level.
Here, the privacy level represents the degree of difficulty of identifying the face of a person included in an image (for example, a frame image) as a specific person. The more easily the face in the image can be identified as a specific person, the relatively lower the privacy level; conversely, the more difficult it is to identify the face as a specific person, the relatively higher the privacy level. Specifically, the privacy level varies with the state of the face of the person in the image and of the constituent parts of the face (for example, the eyes and mouth), such as their angle, size, color, and whether they are occluded, and with the state of external occluding objects (for example, sunglasses or a mask) that occlude the face or its constituent parts.
That is, a state in which the face of the person in the frame image can be identified as that of a specific person, for example, a state in which the face of the person can be detected and both the face and the line of sight are directed forward (a prescribed direction), can be set as a reference state. The privacy level can then change according to: relative changes, with respect to the reference state, of the whole face of the person or of the constituent parts of the face; changes in the proportion of the face or its constituent parts that is occluded; changes in the size of the face in the frame image or in the shooting distance to the face; changes in the external occluding object that occludes the face or its constituent parts; and changes in the color of the face of the person.
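The passage above enumerates several factors that each influence the privacy level. The patent does not give a combination formula at this point, so the following is only one minimal sketch: clamp each per-factor score to [0, 1] and take the maximum, so that any single factor that makes the face hard to identify raises the overall level. The function name and the max rule are assumptions.

```python
def overall_privacy_level(factor_scores: dict) -> float:
    """Combine per-factor privacy scores (each nominally in [0, 1]).

    Keys might be e.g. "rotation", "occlusion", "size", "shelter",
    "color"; the maximum clamped score is returned, so one strongly
    privacy-raising factor dominates.
    """
    if not factor_scores:
        return 0.0
    return max(min(max(score, 0.0), 1.0)
               for score in factor_scores.values())
```

For instance, a frontal, well-lit face whose mouth is fully masked would still receive a high level from the occlusion factor alone.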
Here, relative changes of the whole face or the constituent parts of the face arise, for example, from the following actions: the person who is the subject rotating the face about a prescribed axis (for example, the yaw axis or the pitch axis; see Fig. 6); the person rotating the whole body about the yaw axis, so that the face rotates about the yaw axis; the person moving the line of sight so that it deviates from the forward direction; and displacing the image processing apparatus 100 vertically or horizontally so that the optical axis of the camera lens part 3a of the image processing apparatus 100 tilts with respect to the frontal direction of the face of the person. An action of rotating the face of the person about the roll axis, or of rotating the image processing apparatus 100 about the optical axis direction of the camera lens part 3a, has little influence on the face detection processing and is therefore excluded; however, such actions may also be included, for example, to set the privacy level of the image more finely.
Changes in the proportion of the face or its constituent parts that is occluded arise, for example, when the person who is the subject occludes the whole face or constituent parts of the face (for example, the eyes or the mouth) with his or her own hand, hair, an external occluding object (for example, sunglasses or a mask), or the like.
Changes in the size of the face in the frame image, or in the shooting distance to the face, arise, for example, from the following actions: the person who is the subject adjusting the physical distance to the image processing apparatus 100; adjusting the zoom magnification (focal length) of the image pickup part 3; and designating a region to be cropped from the captured image.
Changes in the external occluding object that occludes the face or its constituent parts arise, for example, from differences in the kind of object occluding the eyes, such as eyeglasses, sunglasses, an eye mask, or blinkers. Even for masks, changes arise from differences in the facial constituent parts being occluded, for example between a mask covering the mouth and a mask covering the nose.
Changes in the color of the face of the person arise, for example, in the following cases: depending on the light striking the person who is the subject and the exposure state of the image processing apparatus 100, the face becomes completely dark due to backlighting or completely white due to overexposure; or a color different from the skin color is applied to the face, as with makeup.
Specifically, based on the results of detection by the detection processing part 6b on the frame images sequentially captured by the image pickup part 3 and sequentially acquired by the first image acquiring unit 6a, the first calculating part 6c defines, as the reference state, a state in which the face of the person can be detected and both the face and the line of sight of the person are directed forward (the prescribed direction). The first calculating part 6c then detects the following changes relative to the frame image in that reference state (the reference face image F) to calculate the privacy level: relative changes of the whole face of the person or the constituent parts of the face; changes in the proportion of the face or its constituent parts that is occluded; changes in the size of the face in the frame image or in the shooting distance to the face; changes in the external occluding object occluding the face or its constituent parts; and changes in the color of the face of the person.
For example, the first calculating part 6c compares the shape of the face detection frame corresponding to the face region of the person included in the reference face image F with the shape of the face detection frame corresponding to the face region of the person included in each frame image sequentially acquired by the first image acquiring unit 6a, to determine the change in the rotation angle of the face of the person who is the subject about a prescribed axis (for example, the yaw axis or the pitch axis) relative to the reference state. That is, for example, when the face of the person rotates about the yaw axis, the face detection frame becomes a vertically long rectangle, and when the face rotates about the pitch axis, the face detection frame becomes a horizontally long rectangle. The first calculating part 6c then calculates, for example, the central axis and the rotation angle of the rotation of the face from the change in the shape of the face detection frame, and determines the amount or rate of change, relative to the reference state, of the rotation angle of the face about the prescribed axis. Here, even when the person rotates the whole body approximately 180° about the yaw axis, that is, rotates so that the face is directed rearward, the detection processing part 6b can detect the rotation of the whole body of the person, and the first calculating part 6c can determine the change in the rotation angle from that detection result. The first calculating part 6c can also calculate the rotation angle of the image processing apparatus 100 based on the motion information output from the motion detecting section 5, and determine the amount or rate of change, relative to the reference state, of the direction of the optical axis of the camera lens part 3a of the image processing apparatus 100 with respect to the frontal direction of the face of the person.
In addition, for example, the first calculating part 6c compares the line of sight of the person included in the reference face image F with the line of sight of the person included in each frame image sequentially acquired by the first image acquiring unit 6a, to determine the change, relative to the reference state, in the line of sight of the person who is the subject.
The first calculating part 6c then calculates the privacy level based on the determined amount or rate of change, relative to the reference state, of the rotation angle of the face of the person about the prescribed axis, the amount or rate of change, relative to the reference state, of the optical axis of the camera lens part 3a of the image processing apparatus 100 with respect to the frontal direction of the face of the person, and the change, relative to the reference state, in the line of sight of the person who is the subject. For example, when the face of the person is rotated left and right about the yaw axis, or the line of sight is moved left and right, as shown in Fig. 7A, the first calculating part 6c calculates the privacy level with the following correlation: the privacy level is lowest in the state in which the face is directed forward and only the line of sight is moved to the left (or right), and the privacy level gradually rises as the face is rotated to the left (or right). Similarly, for example, when the face of the person is rotated up and down about the pitch axis, or the line of sight is moved up and down, as shown in Fig. 7B, the first calculating part 6c calculates the privacy level with the following correlation: the privacy level is lowest in the state in which the face is directed forward and only the line of sight is moved downward (or upward), and the privacy level gradually rises as the face is rotated downward (or upward). Further, for example, when the face of the person is rotated left-right and up-down about the yaw axis and the pitch axis, or the line of sight is moved left-right and up-down, as shown in Fig. 7C, the first calculating part 6c calculates the privacy level with the following correlation: the privacy level is lowest in the state in which the face is directed forward and only the line of sight is moved to the lower left (or lower right, upper left, upper right), and the privacy level gradually rises as the face is rotated to the lower left (or lower right, upper left, upper right).
The first calculating part 6c may also set privacy levels for the line of sight of the person and for the rotation angle of the face about the prescribed axis individually, without correlating them.
The above method of determining the change in the rotation angle of the face about the prescribed axis is merely an example and is not limiting; for example, a plurality of classifiers corresponding to different rotation angles of the face may be used in the face detection processing, with the result expressed by which classifier produced the detection. Likewise, the above method of determining the change in the line of sight is merely an example and is not limiting, and can be changed as appropriate.
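The determination described above, in which the face detection frame narrows as the face rotates about the yaw axis and the privacy level rises monotonically with the rotation angle (Fig. 7A), can be sketched as follows. The square-frame assumption, the function names, and the 90 degree saturation point are illustrative assumptions, not values from the patent.

```python
import math

def estimate_yaw_from_box(width: float, height: float) -> float:
    """Rough yaw estimate (degrees) from the face detection frame shape.

    A frontal face frame is assumed to be roughly square; rotation about
    the yaw axis narrows it, so cos(yaw) is approximately width / height.
    """
    ratio = max(min(width / height, 1.0), 0.0)
    return math.degrees(math.acos(ratio))

def rotation_privacy(yaw_deg: float, max_deg: float = 90.0) -> float:
    """Privacy contribution of face rotation about the yaw axis.

    0.0 for a frontal face, rising monotonically to 1.0 at max_deg and
    saturating beyond, mirroring the correlation of Fig. 7A.
    """
    return min(abs(yaw_deg) / max_deg, 1.0)
```

A pitch-axis version would be symmetric, using the horizontal narrowing of the frame instead.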
In addition, for example, the first calculating part 6c compares the number of facial constituent parts detected from the reference face image F with the number of facial constituent parts detected from each frame image sequentially acquired by the first image acquiring unit 6a, to determine the change, relative to the reference state, in the proportion of the facial constituent parts of the person that is occluded. Specifically, the first calculating part 6c determines this change from the change in the number of facial constituent parts, such as the eyes and mouth, detected in the face region detected in the face detection processing.
The first calculating part 6c then calculates the privacy level based on the determined change, relative to the reference state, in the proportion of the facial constituent parts that is occluded. For example, as shown in Fig. 8A, the first calculating part 6c calculates the privacy level with the following correlation: the privacy level is lowest in the state in which any one of the facial constituent parts (for example, the mouth) is occluded, and the privacy level gradually rises as the number of occluded constituent parts increases.
The above method of determining the change in the proportion of the facial constituent parts that is occluded is merely an example and is not limiting, and can be changed as appropriate.
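The correlation of Fig. 8A, in which the privacy level rises stepwise with the number of occluded facial constituent parts, can be sketched as follows. The assumption of four expected constituent parts (two eyes, nose, mouth) and the linear step size are illustrative, not from the patent.

```python
def occlusion_privacy(detected_parts: int, expected_parts: int = 4) -> float:
    """Privacy contribution from occluded facial constituent parts.

    0.0 when all expected parts (e.g. eyes, nose, mouth) are detected,
    rising stepwise as more parts are hidden, as in Fig. 8A.
    """
    hidden = max(expected_parts - detected_parts, 0)
    return hidden / expected_parts
```

In practice `detected_parts` would come from the constituent part detection performed on the face region by the detection processing part 6b.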
In addition, for example, the first calculating part 6c compares the number of pixels constituting the face of the person included in the reference face image F with the number of pixels constituting the face of the person included in each frame image sequentially acquired by the first image acquiring unit 6a, to determine the change, relative to the reference state, in the size of the face of the person in the frame image. That is, for example, the greater the physical distance between the person and the image processing apparatus 100, the smaller the number of pixels constituting the face of the person in the frame image. The first calculating part 6c determines the change in the size of the face, for example, from the change in the number of pixels constituting the face region containing the face of the person. The first calculating part 6c may also convert the focal length of the image pickup part 3 and the like into a shooting distance (subject distance) to the face of the person, and determine the change, relative to the reference state, in the size of the face of the person in the frame image from the change in the shooting distance.
The first calculating part 6c then calculates the privacy level based on the determined change, relative to the reference state, in the size of the face of the person in the frame image. For example, as shown in Fig. 8B, the first calculating part 6c calculates the privacy level with the following correlation: the larger the face of the person in the frame image, the lower the privacy level, and the smaller the face of the person in the frame image, the higher the privacy level.
In addition, the method for the change of the facial size of above-mentioned determination personage is an example, however it is not limited to this, energy
It is enough suitably, arbitrarily to be changed.
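The inverse correlation of Fig. 8B, in which a smaller face region yields a higher privacy level, can be sketched as follows under the illustrative assumption that the level scales linearly with the fraction of frame pixels occupied by the face region; the patent does not give this particular formula.

```python
def size_privacy(face_pixels: int, frame_pixels: int) -> float:
    """Privacy contribution from apparent face size.

    The smaller the face region relative to the whole frame, the higher
    the privacy level, as in the relation of Fig. 8B.
    """
    if frame_pixels <= 0:
        return 0.0  # degenerate input; no meaningful size information
    fraction = min(face_pixels / frame_pixels, 1.0)
    return 1.0 - fraction
```

An equivalent variant could work from the converted shooting distance instead of the pixel count, as the text notes.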
In addition, for example, the first calculating part 6c compares the type of external obstruction detected from the reference face image F with the type of external obstruction detected from the frame images obtained successively by the first image acquiring unit 6a, to determine the change in external obstructions relative to the reference state. Specifically, for example, the first calculating part 6c learns in advance a large number of face images wearing sunglasses and, likewise, a large number of face images wearing masks, so that external obstructions such as sunglasses and masks can be detected in the face region detected by the face detection processing, and determines the change in external obstructions relative to the reference state from the change in the type of the detected external obstruction.
Then, the first calculating part 6c calculates the privacy class based on the determined change in external obstructions relative to the reference state. For example, the first calculating part 6c refers to the obstruction table ST shown in Fig. 9A and calculates the privacy class corresponding to the type of the external obstruction. In the obstruction table ST, the types of external obstructions are stored in association with privacy classes.
In addition, the method for the change of shelter outside above-mentioned determination is an example, however it is not limited to this, Neng Goushi
It is local, arbitrarily changed, it is, for example, possible to use the technology of known target identification is recognized with facial constituting parts not
With object and be detected. as outside shelter.
The types of external obstruction to be detected are also merely examples and are not limited to these; for example, a cap, a scarf or the like that covers the hair may also be detected.
Furthermore, a higher privacy class may be calculated in the case where a plurality of external obstructions are detected.
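The lookup in the obstruction ("shelter") table ST, together with the raising of the privacy class when several external obstructions are detected at once, might look like the following sketch (the table contents and the numeric levels are invented for illustration; the patent does not disclose concrete values):

```python
# Hypothetical contents of the obstruction table ST (Fig. 9A): each type of
# external obstruction is stored in association with a privacy class.
OBSTRUCTION_TABLE_ST = {
    "sunglasses": 40,   # eyes hidden
    "mask": 50,         # mouth and nose hidden
    "cap": 20,          # hair hidden
    "scarf": 20,
}

def obstruction_privacy_level(detected: list) -> int:
    """Look up the privacy class for the detected external obstructions.
    When a plurality of obstructions are detected, a higher class is
    returned (here: the sum of the table entries, capped at 100)."""
    return min(100, sum(OBSTRUCTION_TABLE_ST.get(kind, 0) for kind in detected))
```

Summing and capping is only one plausible way to make multiple obstructions raise the class; any monotone combination would satisfy the description.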
In addition, for example, the first calculating part 6c compares the color of the face of the person detected from the reference face image F with the color of the face of the person detected from the frame images obtained successively by the first image acquiring unit 6a, to determine the change in the color of the person's face relative to the reference state. Specifically, the first calculating part 6c determines the average RGB values of the skin-color region of the face region detected by the face detection processing, excluding constituent parts such as the eyes and the mouth whose color differs from the skin color, and sums the absolute values of the differences of the respective R, G and B values between the face of the person detected from the reference face image F and the face of the person detected from the frame images obtained successively by the first image acquiring unit 6a, thereby determining the change in the color of the person's face.
Then, the first calculating part 6c calculates the privacy class based on the determined change in the color of the person's face relative to the reference state. For example, as shown in Fig. 9B, the first calculating part 6c calculates the privacy class with the following correlation: the smaller the difference between the face color of the person detected from the reference face image F and the face color of the person included in the frame image, the lower the privacy class; the larger the difference, the higher the privacy class.
In addition, the method for the change of the facial color of above-mentioned determination personage is an example, however it is not limited to this, example
If suitably, arbitrarily being changed as following, i.e. in the facial color for determining the personage that two field picture is included
, have more than defined value compared with the average RGB value from the facial color of the benchmark face-image F personages detected
The ratio of the area in the region of the color of difference, and privacy class is calculated according to the size of the ratio of the area.
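The summing of per-channel absolute differences described above can be sketched as follows (the normalization to a 0-100 score is an assumption; the disclosure only states that a larger color difference yields a higher privacy class):

```python
def skin_color_change_score(ref_rgb, cur_rgb, max_score: float = 100.0) -> float:
    """Sum of the absolute per-channel differences between the average RGB
    of the skin-color region in the reference face image F and in the
    current frame, scaled to a 0..max_score privacy score. The largest
    possible sum of |dR| + |dG| + |dB| for 8-bit channels is 3 * 255."""
    diff = sum(abs(r - c) for r, c in zip(ref_rgb, cur_rgb))
    return max_score * diff / (3 * 255)
```

An unchanged skin tone scores 0; face paint that inverts every channel would score the assumed maximum of 100.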
The first calculating part 6c calculates the privacy class successively for each frame image of the live view image captured successively by the image pickup part 3 in the automatic shooting processing and obtained successively by the first image acquiring unit 6a, and outputs the calculated privacy classes successively to the first determination unit 6d.
The first calculating part 6c need only calculate the privacy class on the basis of at least one of the above-described changes relative to the reference state, the changes including: the relative change of the whole face of the person and the constituent parts of the face; the change in the ratio by which the face of the person or the constituent parts of the face are obstructed; the change in the size of the face of the person included in the frame image or in the imaging distance to the face of the person; the change in external obstructions; and the change in the color of the person's face. In the case where a plurality of these changes are used as the basis, the first calculating part 6c may evaluate each item individually and perform an overall evaluation from the results to calculate the privacy class.
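One simple way to realize "evaluate each item individually, then perform an overall evaluation" is to average the per-item scores (illustrative only; the actual overall-evaluation rule is not specified in the disclosure):

```python
def overall_privacy_level(item_scores: dict) -> float:
    """Each evaluation item (face size, obstruction ratio, face color, ...)
    is assumed to have been scored individually on a 0..100 scale; the
    overall privacy class is taken here as the mean of those scores."""
    if not item_scores:
        raise ValueError("at least one evaluation item is required")
    return sum(item_scores.values()) / len(item_scores)
```

A weighted mean or a maximum over items would equally satisfy the text; the mean is just the simplest overall evaluation.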
The calculation of the privacy class is conditional on the body of the person being within the frame image; when the body of the person is outside the frame image, no privacy class is calculated and the automatic shooting processing is not performed.
The first determination unit (identifying unit) 6d determines whether the privacy class calculated by the first calculating part 6c is higher than a prescribed judgment value.
That is, in the automatic shooting processing the first determination unit 6d determines, for each live view image captured successively by the image pickup part 3, whether the privacy class calculated successively by the first calculating part 6c is higher than the judgment value (whether a prescribed condition is satisfied). Specifically, the first determination unit 6d first obtains from the memory 2, as the judgment value, the desired privacy class specified by the user and stored in the memory 2 in advance. Then, the first determination unit 6d successively obtains the privacy classes calculated successively by the first calculating part 6c, and determines whether each obtained privacy class is higher than the judgment value.
As for the judgment value, for example, a value obtained empirically from a plurality of privacy classes calculated in the privacy class calculation processing may be set in advance as a default value.
The shooting control part (control unit) 6e controls the image pickup part 3 so as to execute prescribed processing related to an image.
That is, in the case where the first determination unit 6d determines that the privacy class is higher than the prescribed judgment value, the shooting control part 6e controls the image pickup part 3 so as to shoot a recording image. Specifically, based on the determination result of the first determination unit 6d in the automatic shooting processing, the shooting control part 6e outputs a shooting instruction for a recording image to the image pickup part 3, taking as a trigger the change of the privacy class calculated successively by the first calculating part 6c from a state at or below the judgment value to a state higher than the judgment value, and causes the image pickup part 3 to shoot the recording image.
Alternatively, the shooting control part 6e may, for example, cause the image pickup part 3 to shoot a recording image taking as a trigger the change of the privacy class calculated successively by the first calculating part 6c from a state higher than the judgment value to a state at or below the judgment value. That is, for example, when the privacy class of the live view image has become too high, the user performs an action that lowers the privacy class of the image (for example, an action of bringing the direction of the face closer to the front), and in the course of the privacy class thus decreasing, the shooting control part 6e can cause the image pickup part 3 to shoot the recording image based on the determination result of the first determination unit 6d.
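The two trigger variants (privacy class rising above, or falling back to at-or-below, the judgment value) amount to edge detection on a threshold comparison. A minimal sketch, with all names hypothetical:

```python
class ShootingTrigger:
    """Fires once each time the privacy class crosses the judgment value in
    the configured direction: 'rising' (at-or-below -> above, the default
    automatic-shooting trigger) or 'falling' (above -> at-or-below, the
    alternative trigger while the user lowers the privacy class)."""

    def __init__(self, judgment_value: float, direction: str = "rising"):
        self.judgment_value = judgment_value
        self.direction = direction
        self.prev_above = None   # unknown until the first frame is seen

    def update(self, privacy_class: float) -> bool:
        above = privacy_class > self.judgment_value
        fired = (self.prev_above is not None and
                 above != self.prev_above and
                 above == (self.direction == "rising"))
        self.prev_above = above
        return fired
```

Feeding the per-frame privacy classes to `update` returns True exactly on the frame where the configured crossing occurs, which is where the shooting instruction would be output.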
In addition, the shooting control part 6e controls the image pickup part 3 so as to shoot a recording image in accordance with a shooting instruction operation by the user. Specifically, in the manual shooting processing, the shooting control part 6e outputs a shooting instruction for a recording image to the image pickup part 3, taking as a trigger a shooting instruction operation performed by the user via the operation inputting part 11 described later, and causes the image pickup part 3 to shoot the recording image.
The image processing part 7 encodes, in a prescribed compressed format (for example, the JPEG format), the image data of the still image generated by the signal processing part 4, and generates recording image data of the still image.
In addition, the image processing part 7 decodes, by the corresponding prescribed encoding method, the image data of the still image to be displayed that is read from the memory 2 or the image record portion 8, and outputs it to the display part 9. At this time, the image processing part 7 may, for example, resize the image data to a given size (for example, VGA or full-HD size) based on the display resolution of the display panel 9b described later, and output it to the display part 9.
The image record portion 8 is constituted by, for example, a nonvolatile memory (flash memory), and records the recording image data of still images encoded in the prescribed compressed format by the image processing part 7. Specifically, whether in the automatic shooting processing or in the manual shooting processing, the image record portion (recording unit) 8 obtains the recording image shot under the control of the shooting control part 6e as a privacy image P, obtains the privacy class calculated in the privacy class calculation processing before the privacy image P was shot, and records the privacy class as Exif (Exchangeable Image File Format) information in association with the image data of the privacy image P.
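The association of a privacy class with the image data, in the manner of an Exif field, can be sketched as follows (this models only the association; the real byte-level Exif encoding and tag numbering are not reproduced, and the tag name is hypothetical):

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyImageRecord:
    """Sketch of a recorded privacy image P: the JPEG-encoded image data
    together with the privacy class stored as a tag-style metadata entry,
    analogous to an Exif field."""
    jpeg_data: bytes
    exif: dict = field(default_factory=dict)

def record_privacy_image(jpeg_data: bytes, privacy_class: int) -> PrivacyImageRecord:
    """Associate the privacy class with the image data at recording time,
    as the image record portion 8 does in steps S12 / S18."""
    rec = PrivacyImageRecord(jpeg_data)
    rec.exif["PrivacyLevel"] = privacy_class   # hypothetical tag name
    return rec
```

In a real implementation the value would be serialized into an actual Exif tag (for example a maker note or user comment) of the JPEG file rather than kept in a side dictionary.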
The image record portion 8 may also be configured, for example, such that a recording medium (not shown) is freely attachable and detachable, and such that reading of data from and writing of data to the attached recording medium are controlled.
The display part 9 includes a display control unit 9a and a display panel 9b.
The display control unit 9a performs control such that a prescribed image is displayed in the display area of the display panel 9b based on image data of a given size read from the memory 2 or the image record portion 8 and decoded by the image processing part 7. Specifically, the display control unit 9a includes a VRAM (Video Random Access Memory), a VRAM controller, a digital video encoder and the like. The digital video encoder reads, via the VRAM controller and at a prescribed playback frame rate, the luminance signal Y and the color-difference signals Cb and Cr that have been decoded by the image processing part 7 and stored in the VRAM (not shown), generates a video signal based on these data, and outputs it to the display panel 9b.
The display panel 9b displays, in its display area, the image captured by the image pickup part 3 and the like based on the video signal from the display control unit 9a. Specifically, in the still image shooting mode, the display panel 9b displays the live view image while successively updating, at a prescribed frame rate, the plurality of frame images generated by the image pickup part 3 shooting the subject.
In addition, the display panel 9b displays the images recorded in the image record portion 8 in the playback mode. At this time, the display control unit 9a causes the display panel 9b to display the privacy images P (see Fig. 10) recorded in the image record portion 8, classified or ordered on the basis of the privacy classes associated with them. For example, as shown in Fig. 10, the display control unit 9a classifies the plurality of privacy images P recorded in the image record portion 8 into image groups by prescribed privacy classes (for example, 10, 30, 50, 70, 100, and so on), and causes the display panel 9b to display them by classified image group. At this time, for each image group corresponding to a privacy class, an arbitrary privacy image P may be displayed as a thumbnail representative image Ps.
Alternatively, the display control unit 9a may, for example, rearrange the plurality of privacy images P recorded in the image record portion 8 according to the associated privacy classes, and cause the display panel 9b to display the privacy images P in the rearranged order.
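The classification into image groups per prescribed privacy class and the alternative rearranged display order might be sketched as follows (the bin boundaries follow the 10/30/50/70/100 example from the description of Fig. 10; the inclusive-upper-bound rule is an assumption):

```python
def classify_by_privacy_class(images, thresholds=(10, 30, 50, 70, 100)):
    """Classify recorded privacy images into image groups by prescribed
    privacy classes. Each image is a (name, privacy_class) pair and falls
    into the first group whose threshold it does not exceed."""
    groups = {t: [] for t in thresholds}
    for name, level in images:
        for t in thresholds:
            if level <= t:
                groups[t].append(name)
                break
    return groups

def sort_by_privacy_class(images):
    """Alternative display order: rearrange the images by their associated
    privacy class, ascending."""
    return sorted(images, key=lambda pair: pair[1])
```

The first image appended to each non-empty group could serve as the thumbnail representative image Ps for that group.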
Examples of the display panel 9b include a liquid crystal display panel and an organic EL (Electro-Luminescence) display panel, but these are examples and the display panel is not limited to them.
The communication control unit 10 transmits and receives data via a communication antenna 10a and a communication network (not shown). That is, the communication antenna 10a is an antenna capable of transmitting and receiving data in a prescribed communication scheme used in communication between the image processing apparatus 100 and a wireless base station (not shown), for example the W-CDMA (Wideband Code Division Multiple Access) scheme, the CDMA2000 scheme or the GSM (Global System for Mobile Communications; registered trademark) scheme. The communication control unit 10 transmits and receives data to and from the wireless base station via the communication antenna 10a over a communication channel set in the prescribed communication scheme, in accordance with the communication protocol corresponding to that scheme. Specifically, the communication control unit (transmitting unit) 10 transmits the privacy image P shot under the control of the shooting control part 6e to an external record server (prescribed external device) S via the communication antenna 10a.
The record server S is, for example, a server constituting a cloud, and has the function of opening a Web (World Wide Web) page (for example, an image publishing page) on the Internet as a Web server. The record server S receives, for example, various images transmitted from communication terminals such as the image processing apparatus 100, and publishes them as content on the Web page. The content published on the Web page opened by the record server S thereby becomes readable by users of communication terminals that can access the Web page via the communication network.
The record server S may have any configuration as long as it is constituted by a computer connectable to the communication network, and a detailed description thereof is omitted.
The communication network is, for example, a communication network that connects the image processing apparatus 100 to external devices such as the record server S via the wireless base station, a gateway server (not shown) and the like. The communication network is constructed using dedicated lines or existing general public lines, and various line forms such as a WAN (Wide Area Network) and a LAN (Local Area Network) can be applied. The communication network also includes, for example, various networks such as a telephone line network, an ISDN line network, dedicated lines, a mobile communication network, communication satellite links, a CATV line network and an IP network, as well as VoIP (Voice over Internet Protocol) gateways, Internet service providers and the like.
The above-described configuration of the communication control unit 10 is one example and is not limited to this; it can be changed as appropriate. For example, although not illustrated, a wireless LAN module may be mounted so that the communication network can be accessed via an access point.
The operation inputting part 11 is used for prescribed operations of the image processing apparatus 100. Specifically, the operation inputting part 11 includes an operating section (not shown) having a shutter release button for instructing the shooting of the subject, a selection decision button for instructing the selection of shooting modes, functions and the like, a zoom button for instructing the adjustment of the zoom amount, and so on, and outputs prescribed operation signals to the central control unit 1 according to the operation of each button of the operating section.
<Automatic Shooting Processing>
Next, the automatic shooting processing performed by the image processing apparatus 100 is described with reference to Fig. 2.
Fig. 2 is a flow chart showing an example of the operations involved in the automatic shooting processing.
As shown in Fig. 2, during the shooting of the live view image of the subject by the image pickup part 3, the signal processing part 4 applies various kinds of image signal processing to the analog-value signal of the frame image forwarded from the electronic image pickup part 3b, and generates digital-value image data (step S1). The generated image data is output by the signal processing part 4 to the memory 2, and the memory 2 temporarily stores the input image data.
The first image acquiring unit 6a of the operation control part 6 reads from the memory 2 and obtains the image data of a frame image of the live view image (step S2).
Next, the detection process part 6b performs line-of-sight detection processing on the frame image to be processed that was obtained by the first image acquiring unit 6a (step S3), and determines whether the line of sight of the person serving as the subject faces the front (step S4). Here, when it is determined that the line of sight does not face the front (step S4; No), the first image acquiring unit 6a reads from the memory 2 and obtains the image data of a new frame image of the live view image (step S5), and the processing returns to step S3. Then, in step S3, the detection process part 6b performs line-of-sight detection processing on the new frame image obtained by the first image acquiring unit 6a in substantially the same manner as described above (step S3).
On the other hand, when it is determined in step S4 that the line of sight faces the front (step S4; Yes), the operation control part 6 transfers to the standby state of automatic shooting (step S6). That is, as long as the line of sight of the person serving as the subject does not face the front (does not gaze at the camera), the standby state of automatic shooting is not entered, so that, for example, unintended automatic shooting in a state where the person is not gazing at the camera can be prevented.
Thereafter, the detection process part 6b performs detection processing on the frame images obtained successively by the first image acquiring unit 6a, and based on the detection results of the detection process part 6b, the first calculating part 6c defines as the reference state a state in which the face of the person can be detected and the line of sight of the person faces the front direction (prescribed direction) (step S7). The first calculating part 6c outputs the frame image of this reference state to the memory 2, where it is temporarily stored as the reference face image F. The temporarily stored reference face image F is thus in a state in which the face faces the front direction (prescribed direction).
Next, the first image acquiring unit 6a reads from the memory 2 and obtains the image data of a new frame image of the live view image (step S8), and the operation control part 6 performs the privacy class calculation processing for calculating the privacy class of the new frame image (see Figs. 3 and 4) (step S9; described in detail below).
Next, the first determination unit 6d of the operation control part 6 determines whether the privacy class calculated in the privacy class calculation processing of step S9 is higher than the judgment value set in the judgment value setting processing (step S10). Specifically, the first determination unit 6d reads from the memory 2 and obtains the judgment value, and determines whether the privacy class calculated in the privacy class calculation processing is higher than the judgment value.
Here, when it is determined that the privacy class is not higher than the judgment value (step S10; No), the operation control part 6 returns the processing to step S8 and executes the processing from step S8 onward. That is, in step S8 the first image acquiring unit 6a obtains the image data of a new frame image, and in step S9 the privacy class calculation processing is performed.
On the other hand, when it is determined in step S10 that the privacy class is higher than the judgment value (step S10; Yes), the shooting control part 6e controls the image pickup part 3 so as to shoot a recording image of the subject (step S11). Specifically, for example, the shooting control part 6e sets a timer for automatically shooting the subject after a prescribed time, causes the image pickup part 3 to shoot the subject when the prescribed time set by the timer has elapsed, and causes the signal processing part 4 to generate image data. Then, the image processing part 7 encodes the image data generated by the signal processing part 4 in the prescribed compressed format (for example, the JPEG format), and generates the image data of the recording image.
Thereafter, the image record portion 8 obtains the recording image from the image processing part 7 as a privacy image P, obtains the privacy class calculated in the privacy class calculation processing, and records the privacy class as Exif information in association with the image data of the privacy image P (step S12).
The automatic shooting processing thus ends.
Although one privacy image P is shot on the basis of the privacy class in the above automatic shooting processing, this is one example and is not limiting; for example, a plurality of privacy images P whose privacy classes are higher than the prescribed judgment value may be shot and recorded in the image record portion 8, so that the user can select a desired privacy image P from among the recorded privacy images P.
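The overall flow of steps S1 to S12 can be condensed into the following sketch, with the hardware and the detection processing replaced by caller-supplied stand-ins (`frames`, `calc_privacy`, `sight_is_front` and `shoot` are all hypothetical substitutes for the apparatus components):

```python
def automatic_shooting(frames, judgment_value, calc_privacy, sight_is_front,
                       shoot):
    """Condensed sketch of the Fig. 2 flow: wait until the subject's line of
    sight faces the front (steps S3-S6), treat that frame as the reference
    state / reference face image F (S7), then calculate the privacy class of
    each later frame (S8-S9) and shoot as soon as it exceeds the judgment
    value (S10-S11)."""
    frames = iter(frames)
    reference = None
    for frame in frames:                 # S3-S6: standby until front sight
        if sight_is_front(frame):
            reference = frame            # S7: reference face image F
            break
    if reference is None:
        return None                      # subject never gazed at the camera
    for frame in frames:                 # S8-S10: successive privacy classes
        if calc_privacy(reference, frame) > judgment_value:
            return shoot(frame)          # S11: shoot the recording image
    return None
```

The timer delay of step S11 and the Exif recording of step S12 are deliberately left out of this sketch.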
<Privacy Class Calculation Processing>
Next, the privacy class calculation processing performed by the image processing apparatus 100 is described with reference to Figs. 3 and 4.
Figs. 3 and 4 are flow charts showing an example of the operations involved in the privacy class calculation processing.
As shown in Fig. 3, the detection process part 6b first performs face detection processing on the frame image to be processed (for example, the new frame image obtained in step S8 of Fig. 2) (step S21), and determines whether a face region containing the face of the person serving as the subject has been detected (step S22). Here, if the person serving as the subject is within the frame image but it is determined that no face region has been detected (step S22; No), the person can be considered, for example, to have rotated the whole body by approximately 180° about the yaw axis so that the face faces rearward, and the first calculating part 6c therefore calculates the privacy class as the highest value (step S23). The privacy class calculation processing thus ends.
On the other hand, when it is determined in step S22 that a face region has been detected (step S22; Yes), the first calculating part 6c detects worn articles (external obstructions) such as sunglasses and masks in the face region detected from the frame image to be processed (step S24), and determines whether a worn article (external obstruction) has been detected (step S25). Here, when it is determined that a worn article (external obstruction) has been detected (step S25; Yes), the first calculating part 6c refers to the obstruction table ST (Fig. 9A), and evaluates and determines a score for calculating the privacy class according to the type of the worn article (external obstruction) (step S26). On the other hand, when it is determined that no worn article (external obstruction) has been detected (step S25; No), the processing of step S26 is skipped.
Next, the first calculating part 6c performs skin color detection processing (step S27), in which it determines the color of the skin-color region in the face region detected from the frame image to be processed, obtains the reference face image F from the memory 2, likewise determines the color of the skin-color region in the face region of the reference face image F, and calculates the difference between the determined skin colors. Then, it is determined whether a skin color whose calculated difference is at or above a prescribed value, that is, a skin color different from the usual one, has been detected (step S28). Here, when it is determined that a skin color different from the usual one has been detected (step S28; Yes), a score for calculating the privacy class corresponding to the color of the skin (the calculated skin color difference) is evaluated and determined (step S29). On the other hand, when it is determined that no skin color different from the usual one has been detected (step S28; No), the processing of step S29 is skipped.
Next, the first calculating part 6c determines the number of pixels constituting the face region detected from the frame image to be processed as the size of the face (step S30). The first calculating part 6c also obtains the reference face image F from the memory 2 and likewise determines the number of pixels constituting the face region of the person's face in the reference face image F. Then, the first calculating part 6c determines, from the change in the number of pixels constituting the face region, the change relative to the reference state in the size of the face of the person included in the frame image, and evaluates and determines a score for calculating the privacy class according to the determined change in the size of the face (step S31).
Next, the first calculating part 6c compares the shape of the face detection frame corresponding to the face region of the person included in the reference face image F with the shape of the face detection frame corresponding to the face region of the person included in the new frame image, and calculates the central axis and the rotation angle of the rotation of the person's face from the change in the shape of the face detection frame (step S32).
Next, the first calculating part 6c determines whether the direction of the face of the person detected from the frame image to be processed is the front (step S33). When it is determined in step S33 that the direction of the face is not the front (step S33; No), the first calculating part 6c determines the change amount or change rate, relative to the reference state, of the rotation angle of the person's face about the prescribed axis, and evaluates and determines a score for calculating the privacy class according to the determined change in the rotation angle of the face (step S34). On the other hand, when it is determined in step S33 that the direction of the face is the front (step S33; Yes), the first calculating part 6c skips the processing of step S34.
Moving to Fig. 4, the detection process part 6b detects constituent parts of the face such as the eyes and the mouth in the face region detected by the face detection processing from the frame image to be processed (step S35). The detection process part 6b also detects the constituent parts of the face such as the eyes and the mouth in the face region of the reference face image F. Then, the first calculating part 6c evaluates and determines a score for calculating the privacy class according to the change in the number of constituent parts of the face, that is, according to the change relative to the reference state in the ratio by which the constituent parts of the person's face are obstructed (step S36).
Next, the detection process part 6b determines whether eyes have been detected in the frame image to be processed (step S37).
Here, when it is determined that eyes have been detected (step S37; Yes), the detection process part 6b performs line-of-sight detection processing on the frame image to be processed (step S38), and determines whether the line of sight of the person serving as the subject faces the front (step S39). When it is determined in step S39 that the line of sight does not face the front (step S39; No), the first calculating part 6c determines the change relative to the reference state in the line of sight of the person serving as the subject, and evaluates and determines a score for calculating the privacy class according to the determined change in the person's line of sight (step S40). On the other hand, when it is determined in step S39 that the line of sight faces the front (step S39; Yes), the first calculating part 6c skips the processing of step S40.
When it is determined in step S37 that no eyes have been detected (step S37; No), the processing of steps S38 to S40 is skipped.
Next, based on the result of the score evaluation corresponding to the worn article in step S26, the result of the score evaluation corresponding to the color of the skin in step S29, the result of the score evaluation corresponding to the change in the size of the face in step S31, the result of the score evaluation corresponding to the change in the rotation angle of the face in step S34, the result of the score evaluation corresponding to the change in the ratio by which the constituent parts of the face are obstructed in step S36, and the result of the score evaluation corresponding to the change in the person's line of sight in step S40, the first calculating part 6c calculates the privacy class using a prescribed conversion formula (step S41). That is, the first calculating part 6c calculates the privacy class such that a frame image in which the change of the person relative to the reference state is relatively small yields a relatively lower privacy class, while a frame image in which the change of the person relative to the reference state is relatively large yields a relatively higher privacy class.
The conversion formula for calculating the privacy class performs an overall evaluation of all the evaluation items to calculate the privacy class. For example, in the case where a score evaluation could not be performed for an evaluation item that should be given priority (for example, the direction of the face), the privacy class may be calculated from the results of the score evaluations of the other evaluation items.
The privacy class calculation processing thus ends.
<Manual Shooting Processing>
Next, the manual shooting processing performed by the image processing apparatus 100 is described with reference to Fig. 5.
Fig. 5 is a flow chart showing an example of the operations involved in the manual shooting processing.
As shown in Fig. 5, during the shooting of the live view image of the subject by the image pickup part 3, the signal processing part 4 applies various kinds of image signal processing to the analog-value signal of the frame image forwarded from the electronic image pickup part 3b, and generates digital-value image data (step S13). The generated image data is output by the signal processing part 4 to the memory 2, and the memory 2 temporarily stores the input image data.
Next, the central control unit 1 determines whether a shooting instruction (shooting instruction operation) has been given by pressing the shutter release button of the operation inputting part 11 (step S14).
When it is determined that a shooting instruction operation has been performed (step S14; Yes), the shooting control part 6e controls the image pickup part 3 so as to shoot a recording image of the subject (step S15). Specifically, for example, the shooting control part 6e causes the image pickup part 3 to shoot the subject and causes the signal processing part 4 to generate image data. Then, the image processing part 7 encodes the image data generated by the signal processing part 4 in the prescribed compressed format (for example, the JPEG format), and generates the image data of the recording image.
Next, the first image acquiring unit 6a of operation control part 6 reads among memory 2 and obtains what is preserved temporarily
The view data (step S16) for the two field picture that live view image is related to, operation control part 6 carries out above-mentioned privacy class and calculated
Handle (reference picture 3 and Fig. 4) (step 17).
Hereafter, image record portion 8 obtains recording image from image processing part 7 and is used as privacy image P, and obtains
The privacy class that calculates in privacy class calculating processing, and using privacy class as Exif information with privacy image P's
View data is set up and is accordingly recorded (step S18).
Thus, manual shooting processing is terminated.
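The association performed in step S18 can be sketched as a small data structure. A plain dictionary stands in for the Exif tag set here, and the tag name "PrivacyLevel" is an assumption made for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class PrivacyImage:
    """Minimal stand-in for a recorded privacy image P: the encoded
    image data plus an Exif-like tag dictionary."""
    jpeg_bytes: bytes
    exif: dict = field(default_factory=dict)

def record_with_privacy_level(jpeg_bytes, privacy_level):
    """Record the image together with its privacy level, as step S18
    associates the calculated level with the image data as Exif
    information."""
    return PrivacyImage(jpeg_bytes, {"PrivacyLevel": privacy_level})
```

Keeping the level inside the image's own metadata is what later allows the recording image and its privacy level to be displayed or transmitted together.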
As described above, the image processing apparatus 100 of Embodiment 1 calculates the privacy level of the face of a person included in a frame image captured by the imaging unit 3 (one example of the kind of image), the privacy level representing the degree of difficulty of identifying the face as that of a specific person (one example of a criterion for determining whether the privacy level satisfies a predetermined condition), and, when it determines that the calculated privacy level is higher than a predetermined decision value, controls the imaging unit 3 to capture a recording image (privacy image P) (one example of executing control of predetermined processing relating to an image). Thus, for example, without applying image processing such as mosaic processing or masking processing to the image to be disclosed or recorded, the appearance is not degraded by a locally mosaicked or masked region, and a more natural privacy image P corresponding to the desired privacy level can be obtained.
Moreover, the recording image (privacy image P) captured with reference to the privacy level can be transmitted to a predetermined external apparatus such as the recording server S, so that the image can be disclosed as content on a Web page.
Although in Embodiment 1 above the imaging unit 3 is controlled to capture the recording image (privacy image P) when it is determined that the calculated privacy level is higher than the predetermined decision value, conversely the privacy image P may normally be captured, and capture of the privacy image P may be suppressed when it is not determined that the calculated privacy level is higher than the predetermined decision value (another example of executing control of predetermined processing relating to an image). An effect similar to that of Embodiment 1 above can thereby be obtained.
In addition, the privacy level is calculated successively for each frame image captured successively by the imaging unit 3, and, based on the successive determination results of whether the privacy level is higher than the predetermined decision value, the recording image is captured at the moment the privacy level changes from a state at or below the predetermined decision value to a state above it. Thus, for example, the user can act during capture of the live view image so as to change the privacy level, thereby adjusting the privacy level of each frame image, and by causing a transition to the state in which the privacy level is higher than the predetermined decision value, an image corresponding to the desired privacy level can be obtained.
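The capture trigger just described is an edge detection on the sequence of per-frame privacy levels. A minimal sketch, with illustrative values:

```python
def capture_indices(levels, decision_value):
    """Return the indices at which a recording image would be captured:
    the frames where the privacy level crosses from at-or-below the
    decision value to above it."""
    shots, prev_above = [], False
    for i, level in enumerate(levels):
        above = level > decision_value
        if above and not prev_above:
            shots.append(i)  # transition below -> above: capture here
        prev_above = above
    return shots
```

Note that only the upward crossing triggers a capture, so holding the privacy level above the decision value does not produce repeated shots.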
Furthermore, the privacy level can be calculated based on the relative change of the face of the person, or of a part constituting the face, with respect to a reference state in which the face of the person included in the frame image can be recognized as the face of a specific person (for example, a state in which the face of the person can be detected and the face and line of sight of the person are in a predetermined direction). For example, the privacy level can be calculated based on the amount or rate of change of the rotation angle of the face of the person about a predetermined axis, or on the amount or rate of change of the orientation of the image processing apparatus 100 relative to the direction of the face of the person. That is, by using the relative change of the face of the person, or of a part constituting the face, with respect to the reference state, there is no need, for example, to predetermine a rotation angle of the face for each privacy level and rotate the face to that angle, or to adjust the orientation of the image processing apparatus 100; ease of use is thereby improved, and the modes of expression of the privacy image P obtained with reference to the privacy level can be made rich and varied.
Further, the privacy level can be calculated based on the change in the proportion by which the face of the person in the frame image, or a part constituting the face, is hidden; based on the change in the size of the face of the person included in the frame image or in the shooting distance to the face; based on the change of an external obstruction that hides the face of the person or a constituent part of the face; or based on the change of the color of the face of the person. The criteria for calculating the privacy level can therefore be increased, and the modes of expression of the privacy image P obtained with reference to the privacy level can be made still richer.
In addition, by recording the privacy image P in association with its privacy level, the display panel 9b can be made to display privacy images P classified or ordered by their associated privacy levels, so that even when many privacy images P are recorded, the user can easily select a desired privacy image P, improving ease of use.
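One way the display panel could order recorded privacy images by their associated level is a simple sort on the stored metadata. The (name, level) pair representation is an assumption for this sketch:

```python
def display_order(privacy_images):
    """privacy_images: (name, privacy_level) pairs as they might be
    recorded.  Returns the names ordered by descending associated
    privacy level, one way the display panel could classify and order
    them for selection."""
    ranked = sorted(privacy_images, key=lambda pair: pair[1], reverse=True)
    return [name for name, level in ranked]
```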
In the manual shooting processing, control is performed such that the privacy level, which represents the degree of difficulty of identifying the face of the person included in the frame image as the face of a specific person, is calculated, and the image captured in accordance with the user's shooting instruction is recorded as a recording image in association with the calculated privacy level (one example of executing control of predetermined processing relating to an image). Thus, when a recording image is called up for display, its associated privacy level can be displayed together with it, or the recording image associated with the privacy level can be output to an external apparatus. Further, when the user refers to the privacy level associated with the displayed recording image, or the external apparatus refers to the privacy level associated with the output recording image (one example of determining whether the privacy level satisfies a predetermined condition), and it is determined that the privacy level is higher than the predetermined decision value, it is possible to decide whether to make the image an object of disclosure, or to perform the disclosure processing.
In addition, as one example of executing control of predetermined processing relating to an image, the calculated privacy level may be displayed together with the captured image during the playback display in which the image captured in accordance with the user's shooting instruction is displayed for a predetermined time, so that the user can decide whether to make the image an object of disclosure or recording, and processing such as disclosure or recording of the captured image may be executed in accordance with a predetermined operation during the playback display.
In the above embodiment, the decision value of the privacy level may also be set automatically.
Hereinafter, the decision value setting processing performed by the image processing apparatus 100 will be described with reference to Fig. 11.
Fig. 11 is a flowchart showing an example of the operation of the decision value setting processing.
The decision value setting processing is performed before the automatic shooting processing described above, in a state in which a mode for setting the decision value used in the automatic shooting processing has been selected.
As shown in Fig. 11, while a live view image of the subject is being captured by the imaging unit 3, the signal processing unit 4 performs various kinds of image signal processing on the analog-value signal of each frame image of the live view image transferred from the electronic imaging unit 3b, and generates digital-value image data (step S51). The generated image data is output by the signal processing unit 4 to the memory 2, and the memory 2 temporarily stores the input image data.
The first image acquisition unit 6a of the operation control unit 6 reads out and acquires, from the memory 2, the image data of a frame image of the live view image (step S52).
Next, the detection processing unit 6b performs face detection processing on the frame image acquired by the first image acquisition unit 6a to detect a face region including the face of the person as the subject, and performs line-of-sight detection processing to detect the line of sight of the person as the subject (step S53).
Next, based on the results of the face detection processing and the line-of-sight detection processing performed by the detection processing unit 6b, the first calculation unit 6c determines whether a reference state, in which the face of the person can be detected and the face and line of sight of the person face the front, has been identified (step S54).
When it is determined in step S54 that the reference state has not been identified (step S54; NO), the first image acquisition unit 6a reads out and acquires, from the memory 2, the image data of a new frame image of the live view image (step S55), and the processing returns to step S53. Then, in step S53, the detection processing unit 6b performs face detection processing and line-of-sight detection processing on the new frame image acquired by the first image acquisition unit 6a, in substantially the same manner as described above (step S53).
On the other hand, when it is determined in step S54 that the reference state has been identified (step S54; YES), the first image acquisition unit 6a reads out and acquires, from the memory 2, the image data of a new frame image of the live view image (step S56), and the operation control unit 6 performs the privacy level calculation processing (see Figs. 3 and 4) for calculating the privacy level of the new frame image (step S57). The image data of the frame image in the reference state (the reference face image F) may be temporarily held in the memory 2.
Next, the operation control unit 6 determines whether the privacy level calculated in the privacy level calculation processing is to be used as the decision value for judging the privacy level in the automatic shooting processing (step S58). Specifically, for example, the display control unit 9a causes the display panel 9b to display a confirmation screen (not shown) for the privacy level calculated in the privacy level calculation processing. Then, after the privacy level has been confirmed by the user, the operation control unit 6 determines whether the calculated privacy level is to be used as the decision value, according to whether an instruction to use the calculated privacy level as the decision value has been input by a predetermined operation of the operation input unit 11.
Here, when it is determined that the calculated privacy level is not to be used as the decision value (step S58; NO), the operation control unit 6 returns the processing to step S56 and executes the subsequent steps. That is, in step S56 the first image acquisition unit 6a acquires the image data of a new frame image, and in step S57 the privacy level calculation processing is performed.
On the other hand, when it is determined in step S58 that the calculated privacy level is to be used as the decision value (step S58; YES), the operation control unit 6 sets the calculated privacy level as the decision value (step S59). Specifically, the first calculation unit 6c of the operation control unit 6 outputs the calculated privacy level to the memory 2, and the memory 2 temporarily stores the input privacy level as the decision value.
The decision value setting processing thus ends.
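The Fig. 11 loop can be sketched as follows: skip frames until the reference state is detected (steps S52 to S55), then compute the privacy level of each new frame (steps S56 and S57) until the user confirms one, which is adopted as the decision value (steps S58 and S59). The callback signatures are assumptions made for this sketch:

```python
def set_decision_value(frames, in_reference_state, privacy_level, user_confirms):
    """Sketch of the decision value setting processing.

    frames             -- iterable of frame images
    in_reference_state -- predicate: frame shows the reference state?
    privacy_level      -- computes the privacy level of a frame
    user_confirms      -- predicate: user accepts this level?
    """
    it = iter(frames)
    for frame in it:                 # S52-S55: wait for the reference state
        if in_reference_state(frame):
            break
    else:
        return None                  # reference state never identified
    for frame in it:                 # S56-S57: level of each new frame
        level = privacy_level(frame)
        if user_confirms(level):     # S58: confirmation screen
            return level             # S59: adopted as the decision value
    return None
```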
[Embodiment 2]
Hereinafter, the image processing apparatus 200 of Embodiment 2 will be described with reference to Fig. 12.
Fig. 12 is a block diagram showing the schematic configuration of an image processing apparatus 200 to which Embodiment 2 of the present invention is applied.
As shown in Fig. 12, the image processing apparatus 200 of the present embodiment includes a central control unit 1, a memory 2, an operation control unit 206, an image processing unit 7, an image recording unit 8, a display unit 9, a communication control unit 10, and an operation input unit 11.
The central control unit 1, the memory 2, the operation control unit 206, the image processing unit 7, the image recording unit 8, the display unit 9, and the communication control unit 10 are connected via a bus 12.
Except for the points described in detail below, the image processing apparatus 200 of Embodiment 2 has the same configuration as the image processing apparatus 100 of Embodiment 1 above, and detailed description is omitted.
The operation control unit 206 includes a second image acquisition unit 206a, a detection processing unit 6b, a second calculation unit 206c, a second determination unit 206d, and an acquisition control unit 206f.
The second image acquisition unit 206a acquires recording images from the image recording unit 8.
Specifically, the second image acquisition unit 206a acquires, as the processing object of the image acquisition processing (described later), a recording image recorded in the image recording unit 8. When a plurality of recording images are recorded in the image recording unit 8, for example, all of the recording images may be treated as processing objects of the image acquisition processing, or only recording images designated by the user through a predetermined operation of the operation input unit 11 may be treated as processing objects of the image acquisition processing.
The second calculation unit (calculation unit) 206c calculates the privacy level.
That is, based on the results of detection by the detection processing unit 6b on the recording image acquired by the second image acquisition unit 206a, the second calculation unit 206c sets, as an imaginary reference state, a state in which the face of the person included in the recording image could be identified as the face of a specific person, for example, a state in which the face of the person can be detected and the face and line of sight of the person face the front (a predetermined direction). The second calculation unit 206c then identifies the relative change of the whole face of the person, or of a constituent part of the face, with respect to this imaginary reference state, and calculates the privacy level.
That is, for the recording image acquired by the second image acquisition unit 206a, the second calculation unit 206c assumes a state in which the orientation of the face of the person is frontal and the line of sight is frontal, and sets this state as the imaginary reference state. The second calculation unit 206c then identifies the imaginary change of the rotation angle, about a predetermined axis (for example, the yaw axis or the pitch axis; see Fig. 6), of the whole face of the person, or of a constituent part of the face, detected from the recording image relative to the set imaginary reference state. The second calculation unit 206c also identifies the imaginary change of the line of sight of the person detected from the recording image relative to the set imaginary reference state.
The second calculation unit 206c then calculates the privacy level based on the identified imaginary change of the rotation angle of the face of the person about the predetermined axis relative to the imaginary reference state, and on the imaginary change of the line of sight of the person as the subject relative to the imaginary reference state.
The second calculation unit 206c also calculates the privacy level based on the proportion by which the face of the person, or a part constituting the face, is hidden.
That is, the second calculation unit 206c identifies the number of constituent parts of the face (for example, the eyes and mouth) of the person detected from the recording image acquired by the second image acquisition unit 206a. The second calculation unit 206c then calculates the privacy level based on, for example, the proportion of the constituent parts of the face of the person that are hidden, relative to the number of constituent parts (for example, "3" in the case of the eyes and the mouth) that could be identified if the person as the subject did not hide the whole face, or constituent parts of the face (for example, the eyes and mouth), with his or her own hand, hair, an external obstruction (for example, sunglasses or a mask), or the like.
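The proportion of hidden constituent parts described above can be turned into a score along these lines. The part names and the linear mapping are illustrative assumptions:

```python
def occlusion_score(detected_parts, expected_parts=("left_eye", "right_eye", "mouth")):
    """Score from the proportion of normally-detectable facial parts
    (here the eyes and mouth, so 3 in total) that are hidden by a hand,
    hair, sunglasses, a mask, or the like."""
    found = len(set(detected_parts) & set(expected_parts))
    hidden_ratio = 1.0 - found / len(expected_parts)
    return round(hidden_ratio, 2)
```

A fully visible face (all three parts detected) contributes nothing, while a fully hidden face contributes the maximum.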
The second calculation unit 206c also calculates the privacy level based on the size of the face of the person included in the recording image, or on the shooting distance to the face of the person.
That is, the second calculation unit 206c determines, for example, the number of pixels of the face region that constitutes the face of the person detected from the recording image acquired by the second image acquisition unit 206a as the size of the face of the person included in the recording image. The second calculation unit 206c then calculates the privacy level based on the identified size of the face of the person included in the recording image. The second calculation unit 206c may also acquire, for example, the focal length of the imaging unit 3 and the like from the Exif information associated with the image data of the recording image, convert it into the shooting distance to the face of the person, and calculate the privacy level.
The second calculation unit 206c also calculates the privacy level based on the change of an external obstruction relative to the reference state.
That is, the second calculation unit 206c compares the kind of external obstruction detected from the reference face image F with the kind of external obstruction detected from the recording image acquired by the second image acquisition unit 206a, to identify the change of the external obstruction relative to the reference state. The second calculation unit 206c then calculates the privacy level based on the identified change of the external obstruction relative to the reference state.
The specific method by which the second calculation unit 206c calculates the privacy level based on the change of the external obstruction relative to the reference state is substantially the same as the method used by the first calculation unit 6c of Embodiment 1 above, and detailed description is omitted here.
The second calculation unit 206c also calculates the privacy level based on the change of the color of the face of the person relative to the reference state.
That is, the second calculation unit 206c compares the color of the face of the person detected from the reference face image F with the color of the face of the person detected from the recording image acquired by the second image acquisition unit 206a, to identify the change of the color of the face of the person relative to the reference state. The second calculation unit 206c then calculates the privacy level based on the identified change of the color of the face of the person relative to the reference state.
The specific method by which the second calculation unit 206c calculates the privacy level based on the change of the color of the face of the person relative to the reference state is substantially the same as the method used by the first calculation unit 6c of Embodiment 1 above, and detailed description is omitted here.
The second determination unit (determination unit) 206d determines whether the privacy level calculated by the second calculation unit 206c is higher than a predetermined decision value.
That is, in the image acquisition processing, the second determination unit 206d determines whether the privacy level of the recording image calculated by the second calculation unit 206c is higher than the decision value. Specifically, the second determination unit 206d acquires, from the memory 2, a desired privacy level stored in the memory 2 as the decision value, and determines whether the privacy level calculated by the second calculation unit 206c is higher than the decision value.
The acquisition control unit 206f acquires an image on which predetermined processing is to be performed.
That is, the acquisition control unit 206f acquires, as a privacy image P on which predetermined processing (for example, transmission processing) is to be performed, a recording image whose privacy level has been determined by the second determination unit 206d to be higher than the decision value.
<Image acquisition process>
Next, the image acquisition processing performed by the image processing apparatus 200 will be described with reference to Fig. 13.
Fig. 13 is a flowchart showing an example of the operation of the image acquisition processing.
As shown in Fig. 13, the second image acquisition unit 206a reads out the image data of any one recording image from the image recording unit 8 and acquires it as the processing object of the image acquisition processing (step S61).
Next, the operation control unit 206 performs the privacy level calculation processing (see Figs. 14 and 15) for calculating the privacy level of the acquired recording image (step S62; described in detail later).
Next, the second determination unit 206d of the operation control unit 206 determines whether the privacy level calculated in the privacy level calculation processing of step S62 is higher than the decision value (step S63). Specifically, the second determination unit 206d reads out and acquires the decision value from the memory 2, and determines whether the privacy level calculated in the privacy level calculation processing is higher than the decision value.
Here, when it is determined that the privacy level is higher than the decision value (step S63; YES), the acquisition control unit 206f acquires the recording image as the processing object as a privacy image P to be transmitted to the recording server S (step S64).
Thereafter, the operation control unit 206 determines whether all of the recording images recorded in the image recording unit 8 have been processed as processing objects of the image acquisition processing (step S65).
When it is determined in step S63 that the privacy level is not higher than the decision value (step S63; NO), the operation control unit 206 skips the processing of step S64 and likewise determines whether all of the recording images have been processed as processing objects of the image acquisition processing.
When it is determined in step S65 that not all of the recording images have been processed as processing objects of the image acquisition processing (step S65; NO), the second image acquisition unit 206a reads out the image data of a new recording image from the image recording unit 8, acquires it as the processing object of the image acquisition processing (step S66), and the processing returns to step S62. Then, in step S62, the operation control unit 206 performs, in substantially the same manner as described above, the privacy level calculation processing for calculating the privacy level of the newly acquired recording image (step S62).
On the other hand, when it is determined in step S65 that all of the recording images have been processed as processing objects of the image acquisition processing (step S65; YES), the image acquisition processing ends.
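Stripped of the per-image bookkeeping, the Fig. 13 loop is a filter over the recorded images: compute each one's privacy level (step S62) and keep as privacy images P only those whose level exceeds the decision value (steps S63 and S64). In this sketch `privacy_level` stands in for the full calculation of Figs. 14 and 15:

```python
def select_privacy_images(recorded, decision_value, privacy_level):
    """Return the recorded images to be acquired as privacy images P
    for transmission: those whose calculated privacy level is higher
    than the decision value."""
    return [image for image in recorded if privacy_level(image) > decision_value]
```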
<Privacy class calculating is handled>
Next, the privacy level calculation processing performed by the image processing apparatus 200 will be described with reference to Figs. 14 and 15.
Figs. 14 and 15 are flowcharts showing an example of the operation of the privacy level calculation processing.
As shown in Fig. 14, first, the detection processing unit 6b performs face detection processing on the recording image as the processing object (for example, the recording image acquired in step S61 of Fig. 13) (step S71), and determines whether a face region including the face of a person has been detected (step S72).
Here, when it is determined that a face region has been detected (step S72; YES), the second calculation unit 206c detects a worn article (an external obstruction) such as sunglasses or a mask in the face region detected from the recording image as the processing object (step S73), and determines whether a worn article (an external obstruction) has been detected (step S74). Here, when it is determined that a worn article (an external obstruction) has been detected (step S74; YES), a score for calculating the privacy level is evaluated and determined according to the kind of worn article (external obstruction), with reference to the obstruction table ST (Fig. 9A) (step S75). On the other hand, when it is determined that no worn article (external obstruction) has been detected (step S74; NO), the processing of step S75 is skipped.
Next, the second calculation unit 206c measures the color of the skin color region in the face region detected from the recording image as the processing object, acquires the reference face image F from the memory 2, likewise measures the color of the skin color region in the face region of the reference face image F, and performs skin color detection processing for calculating the difference between the measured skin colors (step S76). It is then determined whether a skin color differing from the usual one, with a calculated difference equal to or greater than a predetermined value, has been detected (step S77). Here, when it is determined that a skin color differing from the usual one has been detected (step S77; YES), a score for calculating the privacy level corresponding to the skin color (the calculated skin color difference) is evaluated and determined (step S78). On the other hand, when it is determined that no skin color differing from the usual one has been detected (step S77; NO), the processing of step S78 is skipped.
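The skin color comparison of steps S76 to S78 can be sketched as follows. The mean-RGB representation, the threshold value, and the fixed score earned by an atypical skin color are all illustrative assumptions:

```python
def skin_color_score(reference_rgb, current_rgb, threshold=40):
    """Compare the mean skin-tone color measured in the reference face
    image F with that of the processing-object image; a total channel
    difference at or above the threshold counts as a skin color that
    differs from the usual one and earns a score."""
    difference = sum(abs(a - b) for a, b in zip(reference_rgb, current_rgb))
    return 0.5 if difference >= threshold else 0.0
```

Face paint or heavy make-up would drive the channel difference past the threshold, while ordinary lighting variation would not.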
Next, the second calculation unit 206c determines the number of pixels constituting the face region detected from the recording image as the processing object as the size of the face (step S79). The second calculation unit 206c then evaluates and determines a score for calculating the privacy level according to the identified size of the face (step S80).
On the other hand, when it is determined that no face region has been detected (step S72; NO), the privacy level calculation processing ends.
Next, the second calculation unit 206c sets, as an imaginary reference state, a state in which the face of the person included in the recording image could be identified as the face of a specific person, for example, a state in which the face of the person can be detected and the face and line of sight of the person face the front (a predetermined direction) (step S81). Thereafter, the second calculation unit 206c calculates the central axis and the rotation angle of the rotation of the detected face of the person relative to the set imaginary reference state (step S82).
Next, the second calculation unit 206c determines whether the direction of the face of the person detected from the recording image as the processing object is frontal (step S83).
When it is determined in step S83 that the direction of the face is not frontal (step S83; NO), the second calculation unit 206c identifies the amount or rate of change of the rotation angle of the face of the person about the predetermined axis relative to the imaginary reference state, and evaluates and determines a score for calculating the privacy level according to the identified imaginary change of the rotation angle of the face (step S84).
On the other hand, when it is determined in step S83 that the direction of the face is frontal (step S83; YES), the second calculation unit 206c skips the processing of step S84.
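The rotation angle scoring of step S84 can be sketched as a deviation from the frontal direction about the yaw and pitch axes (Fig. 6): the larger the rotation away from the imaginary reference state, the higher the contribution. The cap at 90 degrees and the linear mapping are assumptions for this sketch:

```python
def rotation_angle_score(yaw_degrees, pitch_degrees):
    """Score the deviation of the face from the frontal direction about
    the yaw and pitch axes, capped at 90 degrees and mapped linearly
    onto [0, 1]."""
    deviation = min(max(abs(yaw_degrees), abs(pitch_degrees)), 90.0)
    return round(deviation / 90.0, 2)
```

A frontal face contributes nothing, a profile (90 degrees of yaw) or more contributes the maximum.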
Moving to Fig. 15, the detection processing unit 6b detects the constituent parts of the face, such as the eyes and mouth, in the face region detected in the face detection processing of the recording image as the processing object (step S85). Then, according to the number of constituent parts of the face that have been detected, that is, the proportion of the constituent parts of the face of the person that are hidden relative to the number of constituent parts (for example, "3" in the case of the eyes and the mouth) that could be identified if the whole face of the person, or constituent parts of the face (for example, the eyes and mouth), were not hidden, the second calculation unit 206c evaluates and determines a score for calculating the privacy level (step S86).
Next, the detection processing unit 6b determines whether eyes have been detected in the recording image as the processing object (step S87).
Here, when it is determined that eyes have been detected (step S87; YES), the detection processing unit 6b performs line-of-sight detection processing on the recording image as the processing object (step S88), and determines whether the line of sight of the person as the subject is frontal (step S89).
When it is determined in step S89 that the line of sight is not frontal (step S89; NO), the second calculation unit 206c identifies the change of the line of sight of the person relative to the imaginary reference state, and evaluates and determines a score for calculating the privacy level according to the identified imaginary change of the line of sight of the person (step S90).
On the other hand, when it is determined in step S89 that the line of sight is frontal (step S89; YES), the second calculation unit 206c skips the processing of step S90.
When it is determined in step S87 that no eyes have been detected (step S87; NO), the processing of steps S88 to S90 is skipped.
Next, the second calculation unit 206c calculates the privacy level based on the result of the score evaluation corresponding to the worn article in step S75, the result of the score evaluation corresponding to the color of the skin in step S78, the result of the score evaluation corresponding to the size of the face in step S80, the result of the score evaluation corresponding to the change of the rotation angle of the face relative to the imaginary reference state in step S84, the result of the score evaluation corresponding to the proportion by which the constituent parts of the face are hidden in step S86, and the result of the score evaluation corresponding to the change of the line of sight of the person relative to the imaginary reference state in step S90 (step S91).
The privacy level calculation processing thus ends.
As described above, according to the image processing apparatus 200 of Embodiment 2, the privacy level of the face of a person included in a recorded image (an example of the type of image) is calculated (an example of judging whether the privacy level satisfies a prescribed condition), where the privacy level represents the difficulty of identifying the face as that of a specific person; when the calculated privacy level is determined to be higher than a prescribed decision value, the image is acquired as an image to be transmitted to a prescribed external device (an example of controlling execution of prescribed processing relating to the image). Thus, in the same manner as Embodiment 1 above, image processing such as mosaic processing or masking processing need not be applied to an image to be disclosed or recorded, so the appearance is not degraded by locally applied mosaics or masks, and a more natural privacy image P corresponding to the desired privacy level can be obtained.
Furthermore, the privacy level can be calculated based on the relative change of the person's face, or of the parts constituting that face, with respect to a hypothetical normal state in which the face of the person included in the recorded image can be recognized as that of a specific person (for example, a state in which the person's face can be detected and the face and line of sight point in a prescribed direction). Therefore, substantially in the same manner as Embodiment 1 above, by using the relative change of the person's face or its constituent parts with respect to the hypothetical normal state, ease of use can be improved and the privacy images P obtained on the basis of the privacy level can be given varied modes of expression.
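One such relative change, the facial rotation angle with respect to the hypothetical (frontal) normal state evaluated in step S84, could be scored along the following lines. The linear mapping and the 90-degree normalization are assumptions for illustration; the description only states that the deviation from the normal state is used.

```python
# Illustrative sketch: score the change of the face rotation angle relative
# to a hypothetical normal state (frontal face, 0 degrees). The larger the
# deviation, the harder it is to identify the person, so the higher the
# privacy score. The linear mapping and 90-degree cap are assumptions.
def rotation_score(angle_deg, max_angle_deg=90.0):
    """Return a score in [0.0, 1.0] for a face rotated angle_deg from frontal."""
    deviation = abs(angle_deg)
    return min(deviation / max_angle_deg, 1.0)
```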
Further, the privacy level can be calculated based on the ratio by which the person's face, or the parts constituting it, is occluded in the recorded image; based on the size of the person's face included in the recorded image or the shooting distance to the face; based on a change of an external obstruction that occludes the person's face or its constituent parts; or based on a change of the facial color of the person. The criteria for calculating the privacy level can therefore be increased, making the modes of expression of the privacy images P obtained on the basis of the privacy level still more varied.
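Two of the additional criteria named above, the occlusion ratio of facial constituent parts and the apparent face size (a proxy for shooting distance), can be sketched as follows. Both mappings are hypothetical illustrations, not the patented scoring rules; the 25% frame-ratio threshold in particular is an assumption.

```python
# Illustrative sketches of two additional privacy-level criteria.
# The mappings to [0.0, 1.0] scores are assumptions for illustration.

def occlusion_score(blocked_parts, total_parts):
    """Fraction of facial constituent parts (eyes, nose, mouth, ...) hidden."""
    return blocked_parts / total_parts if total_parts else 0.0

def face_size_score(face_px, frame_px, large_face_ratio=0.25):
    """A face occupying less of the frame (i.e. shot from farther away) is
    harder to identify; score 1.0 for a vanishing face, 0.0 once the face
    fills large_face_ratio of the frame (hypothetical threshold)."""
    ratio = face_px / frame_px
    return max(0.0, min(1.0, 1.0 - ratio / large_face_ratio))
```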
In addition, by recording each privacy image P in association with its privacy level, the display panel 9b can display the privacy images P classified or ordered on the basis of the associated privacy levels. Even when many privacy images P are recorded, the user can easily select a desired privacy image P, which improves ease of use.
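The display control just described amounts to sorting (or grouping) the recorded images by their associated privacy level. A minimal sketch, assuming a simple (image id, privacy level) record structure that is not specified in the description:

```python
# Minimal sketch of the display control: recorded privacy images are kept
# together with their privacy level and shown ordered by that level.
# The (image_id, privacy_level) pair structure is an assumption.
def sort_by_privacy_level(records, descending=True):
    """records: list of (image_id, privacy_level) pairs."""
    return sorted(records, key=lambda r: r[1], reverse=descending)
```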
The present invention is not limited to Embodiments 1 and 2 described above; various improvements and design changes can be made without departing from the spirit of the present invention.
For example, although in Embodiments 1 and 2 above the privacy image P is transmitted to the external recording server S and disclosed, this is merely an example and the invention is not limited to it. For instance, the transmitted privacy image P may only be displayed, without being recorded, in an external device of a disclosure service, with the displayed privacy images P deleted in turn; alternatively, the image processing apparatus 100 of Embodiment 1 or the image processing apparatus 200 of Embodiment 2 may be provided with a server function so that the privacy images P are read by accessing the image processing apparatus 100 or 200 from an external terminal. In this case, for example, whether each privacy image P is disclosed can be set automatically according to its associated privacy level.
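The automatic disclosure setting mentioned above can be sketched as a simple threshold test. The concrete decision value of 0.5 is an illustrative assumption; the description only states that disclosure is set automatically from the associated privacy level.

```python
# Hedged sketch of the variation above: whether each privacy image P is
# made public is set automatically from its associated privacy level.
# The decision value 0.5 is an illustrative assumption.
def auto_set_public(privacy_level, decision_value=0.5):
    """Disclose the image only when identification is judged difficult
    enough, i.e. the privacy level exceeds the decision value."""
    return privacy_level > decision_value
```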
In Embodiments 1 and 2 above, the size of the face of the person included in a privacy image P may also be recorded as Exif information in association with the image data of the privacy image P, and the display control unit 9a may cause the display panel 9b to display the privacy images P recorded in the image recording unit 8 classified or ordered on the basis of the associated face sizes.
The configurations of the image processing apparatuses 100 and 200 illustrated in Embodiments 1 and 2 above are merely examples, and the invention is not limited to them. For example, although the image processing apparatus 100 is equipped with the imaging unit 3, it is not limited to this configuration; the apparatus may omit the imaging unit and instead be communicably connected to an external imaging unit whose shooting it controls.
In addition, although Embodiment 1 above adopts a configuration in which the functions of the determination unit and the control unit are realized by driving the first determination unit 6d and the imaging control unit 6e under the control of the central control unit 1, the invention is not limited to this; a configuration may be adopted in which these functions are realized by the central control unit 1 executing a prescribed program or the like.
That is, a program including a determination processing routine and a control processing routine is recorded in a program memory (not illustrated). The determination processing routine may cause the CPU of the central control unit 1 to function as a unit that judges whether the privacy level satisfies a prescribed condition, where the privacy level represents the difficulty of identifying the face of a person included in an image as that of a specific person. Further, the control processing routine may cause the CPU of the central control unit 1 to function as a unit that, when the privacy level is determined to satisfy the prescribed condition, controls execution of prescribed processing relating to the image (privacy image P).
Similarly, although Embodiment 2 above adopts a configuration in which the functions of the calculation unit and the control unit are realized by driving the second calculation unit 206c and the acquisition control unit 206f under the control of the central control unit 1, the invention is not limited to this; a configuration may be adopted in which these functions are realized by the central control unit 1 executing a prescribed program or the like.
That is, a program including a calculation processing routine and a control processing routine is recorded in a program memory (not illustrated). The calculation processing routine may cause the CPU of the central control unit 1 to function as a unit that calculates the privacy level, where the privacy level represents the difficulty of identifying the face of a person included in an image as that of a specific person. Further, the control processing routine may cause the CPU of the central control unit 1 to function as a unit that uses the calculated privacy level to control execution of prescribed processing relating to the image (privacy image P).
As computer-readable media storing the programs for executing the above processes, in addition to ROM, hard disks, and the like, portable recording media such as nonvolatile memories (for example, flash memories) and CD-ROMs can be applied. As a medium that provides program data via a prescribed communication line, a carrier wave can also be applied.
Although several embodiments of the present invention have been described, the scope of the present invention is not limited to the above embodiments and includes the scope of the invention recited in the claims and its equivalents.
Claims (18)
1. An image processing apparatus comprising a processor, wherein the processor:
judges whether a privacy level, which represents the difficulty of identifying a face of a person included in an image as a face of a specific person, satisfies a prescribed condition; and
controls execution of prescribed processing relating to the image according to the result of judging whether the privacy level satisfies the prescribed condition.
2. The image processing apparatus according to claim 1, wherein the processor:
further calculates the privacy level representing the difficulty of identifying the face of the person included in the image as the face of the specific person; and
judges whether the calculated privacy level is higher than a prescribed decision value.
3. The image processing apparatus according to claim 2, wherein the processor calculates the privacy level based on a relative change of the face of the person, or of a part constituting the person's face, with respect to a prescribed normal state in which the face of the person included in the image can be recognized as that of the specific person.
4. The image processing apparatus according to claim 3, wherein the processor calculates the privacy level based on a variation amount or variation rate of a rotation angle of the face of the person about a prescribed axis.
5. The image processing apparatus according to claim 3, wherein the processor calculates the privacy level based on the relative change of the face of the person, or of the part constituting the person's face, with respect to the prescribed normal state, the prescribed normal state including a state in which the face of the person can be detected and the face and line of sight of the person point in a prescribed direction.
6. The image processing apparatus according to claim 3, wherein the processor calculates the privacy level based on a variation amount or variation rate of the orientation of the image processing apparatus relative to the direction of the face of the person.
7. The image processing apparatus according to claim 3, wherein the processor calculates the privacy level based on a change of the ratio by which the face of the person, or the part constituting the person's face, is occluded.
8. The image processing apparatus according to claim 3, wherein the processor calculates the privacy level based on a change of the size of the face of the person included in the image or of the shooting distance to the face of the person.
9. The image processing apparatus according to claim 3, wherein the processor calculates the privacy level based on a change of an external obstruction that occludes the face of the person included in the image or a constituent part of the person's face.
10. The image processing apparatus according to claim 3, wherein the processor calculates the privacy level based on a change of the facial color of the person included in the image.
11. The image processing apparatus according to claim 2, wherein the processor:
calculates the privacy level of an image captured by an imaging unit; and
when the calculated privacy level is determined to be higher than the prescribed decision value, performs control causing the imaging unit to capture an image for recording.
12. The image processing apparatus according to claim 11, wherein the processor:
calculates the privacy level sequentially for each of images captured sequentially by the imaging unit;
sequentially judges whether each sequentially calculated privacy level is higher than the prescribed decision value; and
based on the sequential judgment results, performs the control for capturing the image for recording, triggered by a change of state from the privacy level being at or below the prescribed decision value to the privacy level being higher than the prescribed decision value.
13. The image processing apparatus according to claim 11, wherein the processor:
records the captured image for recording in a recording unit in association with the calculated privacy level; and
performs control causing the images recorded in the recording unit to be displayed on a display unit classified or ordered on the basis of the privacy levels associated with the images.
14. The image processing apparatus according to claim 2, wherein the processor:
calculates the privacy level of an image recorded in a recording unit; and
performs control to acquire an image whose calculated privacy level is determined to be higher than the prescribed decision value as an image to be transmitted to a prescribed external device.
15. An image processing apparatus comprising a processor, wherein the processor:
calculates a privacy level representing the difficulty of identifying a face of a person included in an image as a face of a specific person; and
controls execution of prescribed processing relating to the image using the calculated privacy level.
16. The image processing apparatus according to claim 15, wherein the image is an image captured by an imaging unit, and the processor performs control to record the calculated privacy level in a recording unit in association with the image captured by the imaging unit as an image for recording.
17. An image processing method using an image processing apparatus, the method comprising:
judging whether a privacy level, which represents the difficulty of identifying a face of a person included in an image as a face of a specific person, satisfies a prescribed condition; and
controlling execution of prescribed processing relating to the image according to the result of judging whether the privacy level satisfies the prescribed condition.
18. An image processing method using an image processing apparatus, the method comprising:
calculating a privacy level representing the difficulty of identifying a face of a person included in an image as a face of a specific person; and
controlling execution of prescribed processing relating to the image using the calculated privacy level.
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015234701 | 2015-12-01 | ||
JP2015-234701 | 2015-12-01 | ||
JP2016-118545 | 2016-06-15 | ||
JP2016118545A JP6206542B2 (en) | 2015-12-01 | 2016-06-15 | Image processing apparatus, image processing method, and program |
Publications (2)
Publication Number | Publication Date |
---|---|
CN107038362A true CN107038362A (en) | 2017-08-11 |
CN107038362B CN107038362B (en) | 2020-11-17 |
Family
ID=59061085
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN201610908844.5A Active CN107038362B (en) | 2015-12-01 | 2016-10-18 | Image processing apparatus, image processing method, and computer-readable recording medium |
Country Status (2)
Country | Link |
---|---|
JP (1) | JP6206542B2 (en) |
CN (1) | CN107038362B (en) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109035167A (en) * | 2018-07-17 | 2018-12-18 | 北京新唐思创教育科技有限公司 | Method, apparatus, equipment and the medium that multiple faces in image are handled |
CN110049233A (en) * | 2018-01-16 | 2019-07-23 | 佳能株式会社 | Image processing equipment, image processing system and image processing method |
CN110188589A (en) * | 2018-02-23 | 2019-08-30 | 拉碧斯半导体株式会社 | Operate decision maker and operation determination method |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN109711297A (en) * | 2018-12-14 | 2019-05-03 | 深圳壹账通智能科技有限公司 | Risk Identification Method, device, computer equipment and storage medium based on facial picture |
US11430088B2 (en) * | 2019-12-23 | 2022-08-30 | Samsung Electronics Co., Ltd. | Method and apparatus for data anonymization |
CN116206558B (en) * | 2023-05-06 | 2023-08-04 | 惠科股份有限公司 | Display panel control method and display device |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN101140620A (en) * | 2007-10-16 | 2008-03-12 | 上海博航信息科技有限公司 | Human face recognition system |
US8254647B1 (en) * | 2012-04-16 | 2012-08-28 | Google Inc. | Facial image quality assessment |
US20140037155A1 (en) * | 2011-02-14 | 2014-02-06 | Neti Solucoes Tecnologicas Ltda | Validation system for register confirmation and/or access authorization for natural persons using biometric facial recognitionvalidation system for register confirmation and/or access authorization for natural persons using biometric facial recognition |
US9036875B2 (en) * | 2013-02-06 | 2015-05-19 | Kabushiki Kaisha Toshiba | Traffic control apparatus, method thereof, and program therefor |
CN104718742A (en) * | 2013-10-16 | 2015-06-17 | 奥林巴斯映像株式会社 | Display device, image generation device, display method and program |
Family Cites Families (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP4036051B2 (en) * | 2002-07-30 | 2008-01-23 | オムロン株式会社 | Face matching device and face matching method |
JP2014067131A (en) * | 2012-09-25 | 2014-04-17 | Zenrin Datacom Co Ltd | Image processing apparatus, image processing system, image processing method, and computer program |
2016
- 2016-06-15 JP JP2016118545A patent/JP6206542B2/en active Active
- 2016-10-18 CN CN201610908844.5A patent/CN107038362B/en active Active
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN110049233A (en) * | 2018-01-16 | 2019-07-23 | 佳能株式会社 | Image processing equipment, image processing system and image processing method |
US11064092B2 (en) | 2018-01-16 | 2021-07-13 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method for detecting a predetermined object in a captured image |
CN110049233B (en) * | 2018-01-16 | 2021-09-14 | 佳能株式会社 | Image processing apparatus, image processing system, and image processing method |
CN110188589A (en) * | 2018-02-23 | 2019-08-30 | 拉碧斯半导体株式会社 | Operate decision maker and operation determination method |
CN109035167A (en) * | 2018-07-17 | 2018-12-18 | 北京新唐思创教育科技有限公司 | Method, apparatus, equipment and the medium that multiple faces in image are handled |
CN109035167B (en) * | 2018-07-17 | 2021-05-18 | 北京新唐思创教育科技有限公司 | Method, device, equipment and medium for processing multiple faces in image |
Also Published As
Publication number | Publication date |
---|---|
JP2017108374A (en) | 2017-06-15 |
JP6206542B2 (en) | 2017-10-04 |
CN107038362B (en) | 2020-11-17 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107038362A (en) | Image processing apparatus and image processing method | |
CN107431760B (en) | The image processing method and storage medium of photographic device, photographic device | |
CN105609035B (en) | Image display device and method | |
US10546185B2 (en) | Image processing apparatus for performing image processing according to privacy level | |
CN110191291B (en) | Image processing method and device based on multi-frame images | |
CN106797453B (en) | Image processing apparatus, photographic device, image processing method and image processing program | |
CN109068058B (en) | Shooting control method and device in super night scene mode and electronic equipment | |
CN104995912B (en) | Camera head, image processing apparatus and image processing method | |
CN110072052B (en) | Image processing method and device based on multi-frame image and electronic equipment | |
CN101827214B (en) | Image processor and recording medium | |
CN107862653B (en) | Image display method, image display device, storage medium and electronic equipment | |
CN106165409B (en) | Image processing apparatus, photographic device, image processing method and program | |
CN103716529B (en) | Threshold value setting device, object detection device, threshold setting method | |
CN106664366A (en) | Image processing device, image capturing apparatus, image processing method, and program | |
CN107911625A (en) | Light measuring method, device, readable storage medium storing program for executing and computer equipment | |
CN113411498B (en) | Image shooting method, mobile terminal and storage medium | |
CN107424117B (en) | Image beautifying method and device, computer readable storage medium and computer equipment | |
CN105493493A (en) | Imaging device, imaging method, and image processing device | |
CN109712177A (en) | Image processing method, device, electronic equipment and computer readable storage medium | |
CN112771612A (en) | Method and device for shooting image | |
CN105453540A (en) | Image processing device, imaging device, image processing method, and program | |
CN106815803A (en) | The processing method and processing device of picture | |
CN110336945A (en) | A kind of intelligence assisted tomography patterning process and system | |
US20160292842A1 (en) | Method and Apparatus for Enhanced Digital Imaging | |
CN110677592B (en) | Subject focusing method and device, computer equipment and storage medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||
GR01 | Patent grant | ||