Description of Embodiments
The technical solutions in the embodiments of the present invention are described below clearly and completely with reference to the accompanying drawings. The described embodiments are merely some, rather than all, of the embodiments of the present invention. All other embodiments obtained by persons of ordinary skill in the art based on the embodiments of the present invention without creative effort shall fall within the protection scope of the present invention.
An embodiment of the present invention provides a face image processing method which, as shown in Figure 1, includes the following steps:
101. Obtain, through an image acquiring device, a first face image from a downward-looking angle and a second face image from an upward-looking angle.
102. Project the pixels of the first face image and the second face image onto a display plane according to a preset correspondence.
In the face image processing method provided by this embodiment, the pixels of the first face image obtained from the downward-looking angle and of the second face image obtained from the upward-looking angle are projected onto the display plane, so that the projected image on the display plane is a blend of the first face image and the second face image. The display angle of the original images can thus be changed, and the face image is displayed at a predetermined angle.
As an improvement on the foregoing embodiment, an embodiment of the present invention provides another face image processing method which, as shown in Figure 2, includes the following steps:
201. Obtain, through an image acquiring device such as a camera, a first face image from a downward-looking angle and a second face image from an upward-looking angle.
A first image acquiring device arranged in the middle of the front top edge of a terminal device may obtain the first face image from the downward-looking angle, and a second image acquiring device arranged in the middle of the front bottom edge of the terminal device may obtain the second face image from the upward-looking angle.
In one implementation of this embodiment, the first face image and the second face image may each be either a whole image of the face or a partial image of the face. For example, the first face image may be an image of the upper half of the face, and the second face image an image of the lower half of the face.
In another implementation of this embodiment, the first face image may be the face portion of the image acquired by the first image acquiring device from the downward-looking angle, and the second face image may be the face portion of the image acquired by the second image acquiring device from the upward-looking angle.
In one implementation of this step, two image acquiring devices may be used: one arranged in the middle of the front top edge of the terminal device, and the other in the middle of the front bottom edge. Taking a mobile phone as an example, one camera may be arranged in the middle of the front top edge of the phone and another in the middle of the front bottom edge. The viewfinder range of each camera may be set according to actual conditions, for example according to either of the following schemes:
First, the camera in the middle of the front top edge of the phone captures an image of the upper half of the face, and the camera in the middle of the front bottom edge captures an image of the lower half of the face.
Second, both cameras capture whole images of the face.
In one implementation of this embodiment, the downward-looking angle at which the first image acquiring device obtains its image may be equal to the upward-looking angle at which the second image acquiring device obtains its image.
202. To make the face image processing method of this embodiment more targeted, after the first image acquiring device acquires an image from the downward-looking angle, image recognition technology such as face recognition is used to extract the face portion from the acquired image as the first face image to be processed; likewise, after the second image acquiring device acquires an image from the upward-looking angle, image recognition technology such as face recognition is used to extract the face portion from the acquired image as the second face image to be processed.
203. Calculate the cosine of a first angle a between the plane of the first face image and the display plane, and the cosine of a second angle b between the plane of the second face image and the display plane.
Assume here that when the user uses a terminal device adopting the face image processing method of this embodiment, the line from the nose of the face to the center of the display plane of the terminal device is always perpendicular to the display plane.
As shown in Figure 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face at the time the first image acquiring device obtains the image. Clearly, the plane of the first face image represented by AC is perpendicular to the shooting light path GD between the first image acquiring device and the face.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane represented by AB.
Therefore, the angle between the plane of the first face image represented by AC and the image display plane represented by AB, i.e. the first angle a between the plane of the first face image and the display plane, equals the angle a0 between GE and the shooting light path GD between the first image acquiring device and the face. Hence cos(a) = cos(a0).
Usually, when the terminal device is manufactured, the angle between the shooting light path of the first image acquiring device and the display plane is predetermined. Its complement, the angle a0, can then be determined, from which the value of the angle a and hence the value of cos(a) can be calculated.
Similarly, the cosine cos(b) of the second angle b between the plane of the second face image and the display plane can be calculated from the predetermined angle between the shooting light path of the second image acquiring device and the display plane.
In another implementation of this step, the angle between the shooting light path of the first image acquiring device and the display plane is not predetermined at manufacture; instead, image recognition technology such as facial-feature recognition makes the camera track and shoot a certain feature of the face. While the user is using the device, the angle between the shooting light path of the first image acquiring device and the display plane varies with the distance between the face and the display plane. In this case, cos(a) is calculated as follows:
First, the distance between the face and the display plane needs to be determined; the distance between the face plane and the image display plane can be obtained through a distance sensor.
Second, as shown in Figure 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face at the time the first image acquiring device obtains the image. Clearly, no matter how the shooting light path between the first image acquiring device and the face changes, the plane of the first face image represented by AC remains perpendicular to the shooting light path GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane represented by AB.
Therefore, the angle between the plane of the first face image represented by AC and the image display plane represented by AB, i.e. the first angle a between the plane of the first face image and the display plane, equals the angle a0 between GE and the shooting light path GD between the first image acquiring device and the face. Hence cos(a) = cos(a0).
That is, cos(a) = cos(a0) = m/√(m² + n²), where m is the distance between the face plane and the image display plane, and n is the distance between the center of the first image acquiring device and the center of the image display plane.
Similarly, the cosine of the second angle b between the plane of the second face image and the display plane equals the cosine of the angle b0 between GF and GE: cos(b) = cos(b0) = m/√(m² + n²), where m is the distance between the face plane and the image display plane, and n is the distance between the center of the second image acquiring device and the center of the image display plane.
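The cosine calculation above can be sketched numerically. Under the geometry of Figure 3, the tangent of a0 is n/m, so the cosine follows directly; the function and variable names below are illustrative, not part of the original method:

```python
import math

def included_angle_cosine(m: float, n: float) -> float:
    """Cosine of the angle a0 between the display-plane normal GE and the
    shooting light path GD, assuming tan(a0) = n / m as in Figure 3.

    m: distance between the face plane and the image display plane
       (e.g. obtained from a distance sensor)
    n: distance between the center of the image acquiring device and the
       center of the image display plane
    """
    return m / math.sqrt(m * m + n * n)

# Example: face 30 cm from the display, camera 6 cm from the display center.
cos_a = included_angle_cosine(30.0, 6.0)  # first (top) camera
cos_b = included_angle_cosine(30.0, 6.0)  # second (bottom) camera, same offset
```

When the face is directly in front of the camera (n = 0), the cosine is 1 and the image plane is parallel to the display plane, as expected.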
204. Project the first face image onto the display plane according to the first included-angle cosine cos(a), and project the second face image onto the display plane according to the second included-angle cosine cos(b).
First, calculate the coordinates of the projection of each pixel of the first face image onto the display plane according to the relations x0 = x1/cos(a), y0 = y1/cos(a), and project the pixels of the first face image onto the display plane accordingly.
Then calculate the coordinates of the projection of each pixel of the second face image onto the display plane according to the relations x0 = x2/cos(b), y0 = y2/cos(b), and project the pixels of the second face image onto the display plane accordingly.
Here the display plane is parallel to the face plane; (x1, y1) are the coordinates of a pixel in the first face image, (x2, y2) the coordinates of a pixel in the second face image, and (x0, y0) the coordinates of a point on the display plane; cos(a) is the cosine of the angle between the plane of the first face image and the display plane, and cos(b) is the cosine of the angle between the plane of the second face image and the display plane.
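A minimal sketch of this projection step, assuming each pixel coordinate is divided by the corresponding cosine and rounded to the nearest display coordinate (the names and the rounding choice are illustrative, not prescribed by the original):

```python
def project_pixels(pixels, cos_angle):
    """Map pixel coordinates from a tilted image plane onto the display plane
    using the relations x0 = x / cos_angle, y0 = y / cos_angle.

    pixels: iterable of (x, y) coordinates in the face image
    cos_angle: cosine of the angle between the image plane and the display plane
    returns: dict mapping each display-plane coordinate (rounded to integers)
             to the source pixel coordinate that projected onto it
    """
    projected = {}
    for (x, y) in pixels:
        x0 = round(x / cos_angle)
        y0 = round(y / cos_angle)
        projected[(x0, y0)] = (x, y)
    return projected

# Example: with cos(a) = 0.5, the pixel (10, 20) lands at display point (20, 40).
first = project_pixels([(0, 0), (10, 20)], 0.5)
```

Because cos(a) is at most 1, the projection stretches the image, which compensates for the foreshortening introduced by the tilted shooting angle.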
In this embodiment, projecting the pixels of the first face image onto the display plane and projecting the pixels of the second face image onto the display plane need not follow any particular temporal order.
In one implementation of this embodiment, to prevent the same coordinate point on the display plane from receiving repeated projections, the pixels of the first face image may be projected only onto coordinate positions of the display plane onto which no pixel of the second face image has been projected; alternatively, the pixels of the second face image may be projected only onto coordinate positions of the display plane onto which no pixel of the first face image has been projected.
In this implementation, after either the first or the second face image has been projected onto the display plane, the other face image is projected next; not all of its pixels are projected, but only those corresponding to coordinate points that have not yet received a projection. This prevents the same coordinate point on the display plane from receiving repeated projections.
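This duplicate-avoidance rule can be sketched as follows: after the first image has been projected, the second image contributes only to display coordinates that are not yet occupied. The data structures here are a simplified illustration, not prescribed by the original method:

```python
def merge_projections(first_projection, second_pixels, cos_b):
    """Project the second face image onto the display plane, skipping any
    coordinate point already occupied by the first image's projection.

    first_projection: dict {(x0, y0): source pixel} from the first face image
    second_pixels: iterable of (x, y) coordinates in the second face image
    cos_b: cosine of the angle between the second image plane and the display
    """
    merged = dict(first_projection)
    for (x, y) in second_pixels:
        coord = (round(x / cos_b), round(y / cos_b))
        if coord not in merged:  # only fill coordinate points not yet projected to
            merged[coord] = (x, y)
    return merged

display = merge_projections({(20, 40): (10, 20)}, [(10, 20), (5, 10)], 0.5)
# (10, 20) from the second image maps to (20, 40), already occupied, so it is
# skipped; (5, 10) maps to the free point (10, 20) and is added.
```

The same routine works with the images swapped, since the embodiment states that the two projections may occur in either order.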
In the face image processing method provided by this embodiment, the pixels of the first face image obtained from the downward-looking angle and of the second face image obtained from the upward-looking angle are projected onto the display plane, so that the projected image on the display plane is a blend of the first face image and the second face image. The display angle of the original images can thus be changed, and the face image is displayed at a predetermined angle.
An embodiment of the present invention provides a face image processing apparatus which, as shown in Figure 4, includes a first acquiring unit 41 and a projecting unit 42.
The first acquiring unit 41 obtains a first face image from a downward-looking angle and a second face image from an upward-looking angle. The projecting unit 42 projects the pixels of the first face image and the second face image onto a display plane according to a preset correspondence.
In the face image processing apparatus provided by this embodiment, the pixels of the first face image obtained from the downward-looking angle and of the second face image obtained from the upward-looking angle are projected onto the display plane, so that the projected image on the display plane is a blend of the first face image and the second face image. The display angle of the original images can thus be changed, and the face image is displayed at a predetermined angle.
An embodiment of the present invention provides another face image processing apparatus which, as shown in Figure 5, includes a first acquiring unit 51, a second acquiring unit 52, and a projecting unit 53.
The projecting unit 53 includes an acquisition module 531 and a projection module 532.
The first acquiring unit 51 obtains a first face image from a downward-looking angle and a second face image from an upward-looking angle. The second acquiring unit 52 obtains the distance between the face plane and the image display plane through a distance sensor. The projecting unit 53 projects the pixels of the first face image and the second face image onto the display plane according to a preset correspondence. Specifically, the acquisition module 531 first obtains a first included-angle cosine between the plane of the first face image and the display plane, and a second included-angle cosine between the plane of the second face image and the display plane; the projection module 532 then projects the first face image onto the display plane according to the first included-angle cosine, and projects the second face image onto the display plane according to the second included-angle cosine.
In the face image processing apparatus provided by this embodiment, the pixels of the first face image obtained from the downward-looking angle and of the second face image obtained from the upward-looking angle are projected onto the display plane, so that the projected image on the display plane is a blend of the first face image and the second face image. The display angle of the original images can thus be changed, and the face image is displayed at a predetermined angle.
In one implementation of this embodiment, the first acquiring unit 51 may use a first image acquiring device arranged in the middle of the front top edge of the terminal device to obtain the first face image from the downward-looking angle, and a second image acquiring device arranged in the middle of the front bottom edge of the terminal device to obtain the second face image from the upward-looking angle.
In one implementation of this embodiment, the first face image and the second face image acquired by the first acquiring unit 51 may each be either a whole image of the face or a partial image of the face.
In another implementation of this embodiment, the first face image acquired by the first acquiring unit 51 may be the face portion of the image acquired by the first image acquiring device from the downward-looking angle, and the second face image may be the face portion of the image acquired by the second image acquiring device from the upward-looking angle.
In one implementation, the first acquiring unit 51 includes two image acquiring devices: one arranged in the middle of the front top edge of the terminal device, and the other in the middle of the front bottom edge. Taking a mobile phone as an example, one camera may be arranged in the middle of the front top edge of the phone and another in the middle of the front bottom edge. The viewfinder range of each camera may be set according to actual conditions, for example according to either of the following schemes:
First, the camera in the middle of the front top edge of the phone captures an image of the upper half of the face, and the camera in the middle of the front bottom edge captures an image of the lower half of the face.
Second, both cameras capture whole images of the face.
In one implementation of this embodiment, the downward-looking angle at which the first image acquiring device obtains its image may be equal to the upward-looking angle at which the second image acquiring device obtains its image.
To make the image acquisition by the first acquiring unit 51 of the face image processing apparatus more targeted, after the first image acquiring device acquires an image from the downward-looking angle, image recognition technology such as face recognition is used to extract the face portion from the acquired image as the first face image to be processed; likewise, after the second image acquiring device acquires an image from the upward-looking angle, image recognition technology such as face recognition is used to extract the face portion from the acquired image as the second face image to be processed.
In one implementation of this embodiment, the acquisition module 531 may obtain the first included-angle cosine between the plane of the first face image and the display plane, and the second included-angle cosine between the plane of the second face image and the display plane, as follows:
Assume here that when the user uses a terminal device adopting the face image processing apparatus of this embodiment, the line from the nose of the face to the center of the display plane of the terminal device is always perpendicular to the display plane.
As shown in Figure 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face at the time the first image acquiring device obtains the image. Clearly, the plane of the first face image represented by AC is perpendicular to the shooting light path GD between the first image acquiring device and the face.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane represented by AB.
Therefore, the angle between the plane of the first face image represented by AC and the image display plane represented by AB, i.e. the first angle a between the plane of the first face image and the display plane, equals the angle a0 between GE and the shooting light path GD between the first image acquiring device and the face. Hence cos(a) = cos(a0).
Usually, when the terminal device is manufactured, the angle between the shooting light path of the first image acquiring device and the display plane is predetermined. Its complement, the angle a0, can then be determined, from which the value of the angle a and hence the value of cos(a) can be calculated.
Similarly, the value of cos(b) can be calculated.
In another implementation, the angle between the shooting light path of the first image acquiring device and the display plane is not predetermined at manufacture; instead, image recognition technology such as facial-feature recognition makes the camera track and shoot a certain feature of the face. While the user is using the device, the angle between the shooting light path of the first image acquiring device and the display plane varies with the distance between the face and the display plane. In this case, cos(a) is calculated as follows:
First, the distance between the face and the display plane needs to be determined; the distance between the face plane and the image display plane can be obtained through a distance sensor.
Second, as shown in Figure 3, AC represents the plane of the first face image, and GD represents the shooting light path between the first image acquiring device and the face at the time the first image acquiring device obtains the image. Clearly, no matter how the shooting light path between the first image acquiring device and the face changes, the plane of the first face image represented by AC remains perpendicular to the shooting light path GD.
AB represents the image display plane, and GE is perpendicular to the display plane of the terminal device; that is, GE is perpendicular to the image display plane represented by AB.
Therefore, the angle between the plane of the first face image represented by AC and the image display plane represented by AB, i.e. the first angle a between the plane of the first face image and the display plane, equals the angle a0 between GE and the shooting light path GD between the first image acquiring device and the face. Hence cos(a) = cos(a0).
That is, cos(a) = cos(a0) = m/√(m² + n²), where m is the distance between the face plane and the image display plane, and n is the distance between the center of the first image acquiring device and the center of the image display plane.
Similarly, it can be derived that cos(b) = cos(b0) = m/√(m² + n²), where m is the distance between the face plane and the image display plane, and n is the distance between the center of the second image acquiring device and the center of the image display plane.
In one implementation of this embodiment, the projection module 532 may project the first face image onto the display plane according to the first included-angle cosine, and the second face image onto the display plane according to the second included-angle cosine, as follows:
The first face image is projected onto the display plane according to the first included-angle cosine cos(a), and the second face image is projected onto the display plane according to the second included-angle cosine cos(b).
First, calculate the coordinates of the projection of each pixel of the first face image onto the display plane according to the relations x0 = x1/cos(a), y0 = y1/cos(a), and project the pixels of the first face image onto the display plane accordingly.
Then calculate the coordinates of the projection of each pixel of the second face image onto the display plane according to the relations x0 = x2/cos(b), y0 = y2/cos(b), and project the pixels of the second face image onto the display plane accordingly.
Here the display plane is parallel to the face plane; (x1, y1) are the coordinates of a pixel in the first face image, (x2, y2) the coordinates of a pixel in the second face image, and (x0, y0) the coordinates of a point on the display plane; cos(a) is the cosine of the angle between the plane of the first face image and the display plane, and cos(b) is the cosine of the angle between the plane of the second face image and the display plane.
In this embodiment, projecting the pixels of the first face image onto the display plane and projecting the pixels of the second face image onto the display plane need not follow any particular temporal order.
In one implementation of this embodiment, to prevent the same coordinate point on the display plane from receiving repeated projections, the pixels of the first face image may be projected only onto coordinate positions of the display plane onto which no pixel of the second face image has been projected; alternatively, the pixels of the second face image may be projected only onto coordinate positions of the display plane onto which no pixel of the first face image has been projected.
In this implementation, after either the first or the second face image has been projected onto the display plane, the other face image is projected next; not all of its pixels are projected, but only those corresponding to coordinate points that have not yet received a projection. This prevents the same coordinate point on the display plane from receiving repeated projections.
In the face image processing apparatus provided by this embodiment, the pixels of the first face image obtained from the downward-looking angle and of the second face image obtained from the upward-looking angle are projected onto the display plane, so that the projected image on the display plane is a blend of the first face image and the second face image. The display angle of the original images can thus be changed, and the face image is displayed at a predetermined angle.
The terminal device described in the embodiments of the present invention may be a mobile phone, a PDA, a computer, or similar equipment.
From the description of the foregoing embodiments, persons skilled in the art can clearly understand that the present invention may be implemented by software plus necessary general-purpose hardware, or by hardware alone, although in many cases the former is the better implementation. Based on such an understanding, the technical solution of the present invention, or the part contributing to the prior art, may be embodied in the form of a software product. The computer software product is stored in a readable storage medium, such as a floppy disk, hard disk, or optical disc of a computer, and includes several instructions that cause a computer device (which may be a personal computer, a server, a network device, or the like) to perform the methods described in the embodiments of the present invention.
The foregoing descriptions are merely embodiments of the present invention, but the protection scope of the present invention is not limited thereto. Any variation or replacement readily conceivable by a person skilled in the art within the technical scope disclosed by the present invention shall fall within the protection scope of the present invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.