CN108108668B - Age prediction method and device based on image - Google Patents


Info

Publication number
CN108108668B
Authority
CN
China
Prior art keywords
age
face
prediction
predicted
average
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201711249580.8A
Other languages
Chinese (zh)
Other versions
CN108108668A (en)
Inventor
张水发
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing Xiaomi Mobile Software Co Ltd
Original Assignee
Beijing Xiaomi Mobile Software Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Beijing Xiaomi Mobile Software Co Ltd filed Critical Beijing Xiaomi Mobile Software Co Ltd
Priority to CN201711249580.8A priority Critical patent/CN108108668B/en
Publication of CN108108668A publication Critical patent/CN108108668A/en
Application granted granted Critical
Publication of CN108108668B publication Critical patent/CN108108668B/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; body parts, e.g. hands
    • G06V40/16 - Human faces, e.g. facial parts, sketches or expressions
    • G06V40/178 - Human faces: estimating age from face image; using age information for improving recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • Image Analysis (AREA)

Abstract

The present disclosure relates to an image-based age prediction method and device. The method comprises the following steps: predicting the age of a face in a target image to obtain a current predicted age; acquiring an average predicted age corresponding to the face according to feature information of the face; and determining an age prediction result corresponding to the face in the target image according to the current predicted age and the average predicted age. By predicting the age of the face in the target image to obtain the current predicted age, obtaining the average predicted age corresponding to the face according to the feature information of the face, and determining the age prediction result corresponding to the face in the target image according to the current predicted age and the average predicted age, the method and the device can improve the accuracy and reliability of the age prediction.

Description

Age prediction method and device based on image
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to an age prediction method and apparatus based on an image.
Background
In the related art, image-based age prediction is affected by the shooting angle, illumination intensity, background, the prediction method, and the like, so the accuracy and reliability of the age prediction result are low.
Disclosure of Invention
To overcome the problems in the related art, the present disclosure provides an image-based age prediction method and apparatus.
According to a first aspect of the embodiments of the present disclosure, there is provided an image-based age prediction method, including:
predicting the age of the face in the target image to obtain the predicted age;
acquiring an average predicted age corresponding to the face according to the feature information of the face;
and determining an age prediction result corresponding to the face in the target image according to the current prediction age and the average prediction age.
In a possible implementation manner, determining an age prediction result corresponding to a face in the target image according to the current prediction age and the average prediction age includes:
and when the difference value between the current prediction age and the average prediction age meets the condition, taking the current prediction age as an age prediction result corresponding to the face in the target image.
In a possible implementation manner, determining an age prediction result corresponding to a face in the target image according to the current prediction age and the average prediction age includes:
and under the condition that the difference value between the current prediction age and the average prediction age does not meet the condition, determining an age prediction result corresponding to the face in the target image according to the current prediction age and the average prediction age.
In one possible implementation, the condition is:
the absolute value of the difference value between the current predicted age and the average predicted age is smaller than N times of the variance of the predicted age corresponding to the face, wherein N is a positive integer.
In one possible implementation, the method further includes:
and determining the predicted age variance corresponding to the face according to the predicted age corresponding to each image containing the face in the designated photo album and the average predicted age corresponding to the face.
In one possible implementation, the method further includes:
predicting the age of the face in each image containing the face in the designated photo album to obtain the predicted age corresponding to each image containing the face;
and determining the average predicted age corresponding to the face according to the predicted age corresponding to each image containing the face.
According to a second aspect of the embodiments of the present disclosure, there is provided an image-based age prediction apparatus including:
the first prediction module is used for predicting the age of the face in the target image to obtain the predicted age;
the acquisition module is used for acquiring the average predicted age corresponding to the face according to the feature information of the face;
and the first determining module is used for determining an age prediction result corresponding to the face in the target image according to the current prediction age and the average prediction age.
In one possible implementation manner, the first determining module includes:
and the first determining submodule is used for taking the current prediction age as an age prediction result corresponding to the face in the target image under the condition that the difference value between the current prediction age and the average prediction age meets the condition.
In one possible implementation manner, the first determining module includes:
and the second determining submodule is used for determining an age prediction result corresponding to the face in the target image according to the current prediction age and the average prediction age under the condition that the difference value between the current prediction age and the average prediction age does not meet the condition.
In one possible implementation, the condition is:
the absolute value of the difference value between the current predicted age and the average predicted age is smaller than N times of the variance of the predicted age corresponding to the face, wherein N is a positive integer.
In one possible implementation, the apparatus further includes:
and the second determining module is used for determining the predicted age variance corresponding to the face according to the predicted age corresponding to each image containing the face in the designated photo album and the average predicted age corresponding to the face.
In one possible implementation, the apparatus further includes:
the second prediction module is used for predicting the age of the face in each image containing the face in the designated photo album to obtain the predicted age corresponding to each image containing the face;
and the third determining module is used for determining the average predicted age corresponding to the face according to the predicted age corresponding to each image containing the face.
According to a third aspect of the embodiments of the present disclosure, there is provided an image-based age prediction apparatus comprising: a processor; a memory for storing processor-executable instructions; wherein the processor is configured to perform the above method.
According to a fourth aspect of embodiments of the present disclosure, there is provided a non-transitory computer-readable storage medium having instructions which, when executed by a processor, enable the processor to perform the above-described method.
The technical scheme provided by the embodiment of the disclosure can have the following beneficial effects: the method comprises the steps of obtaining the current predicted age by predicting the age of the face in the target image, obtaining the average predicted age corresponding to the face according to the feature information of the face, and determining the age prediction result corresponding to the face in the target image according to the current predicted age and the average predicted age, so that the accuracy and the reliability of the age prediction can be improved.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory only and are not restrictive of the disclosure.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this specification, illustrate embodiments consistent with the present disclosure and together with the description, serve to explain the principles of the disclosure.
Fig. 1 is a flowchart illustrating an image-based age prediction method according to an exemplary embodiment.
Fig. 2 is an exemplary flowchart illustrating a method of image-based age prediction according to an exemplary embodiment.
Fig. 3 is an exemplary flowchart illustrating a method of image-based age prediction according to an exemplary embodiment.
Fig. 4 is an exemplary flowchart illustrating a method of image-based age prediction according to an exemplary embodiment.
Fig. 5 is a block diagram illustrating an image-based age prediction apparatus according to an exemplary embodiment.
Fig. 6 is an exemplary block diagram illustrating an image-based age prediction apparatus according to an exemplary embodiment.
Fig. 7 is a block diagram illustrating an apparatus 800 for age prediction of an image according to an example embodiment.
Detailed Description
Reference will now be made in detail to the exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, like numbers in different drawings represent the same or similar elements unless otherwise indicated. The implementations described in the exemplary embodiments below are not intended to represent all implementations consistent with the present disclosure. Rather, they are merely examples of apparatus and methods consistent with certain aspects of the present disclosure, as detailed in the appended claims.
Fig. 1 is a flowchart illustrating an image-based age prediction method according to an exemplary embodiment. The method can be applied to terminal equipment. As shown in fig. 1, the method may include steps S11 through S13.
In step S11, the age of the face in the target image is predicted, and the current predicted age is obtained.
The target image may refer to an image for which age prediction is required.
In this embodiment, in the case where only one face is included in the target image, the age of that face can be predicted. In the case where the target image includes a plurality of faces, the age prediction may be performed on a face selected by the user in the target image, or on all faces in the target image. The method for predicting the ages of a plurality of faces is similar to the method for predicting the age of one face, so this embodiment only describes the method for predicting the age of one face.
In this embodiment, an age prediction method in the related art may be adopted to predict the age of the face in the target image, so as to obtain the current predicted age.
In step S12, an average predicted age corresponding to the face is obtained based on the feature information of the face.
In a possible implementation manner, the personal information corresponding to the face may be determined according to the feature information of the face, and an average predicted age corresponding to the personal information may be obtained. For example, if the feature information of the face matches the feature information of the face of person A, it may be determined that the face corresponds to person A, and an average predicted age corresponding to the face of person A may be obtained.
The feature information of the face may be Scale-Invariant Feature Transform (SIFT) feature information of the face.
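As a rough illustration of step S12 (a sketch, not the patent's implementation), the following snippet assumes each face has already been reduced to a fixed-length feature vector and that an average predicted age is stored per known person; the cosine-similarity matching, the threshold value, and the function names are assumptions made for the example.

```python
# Illustrative sketch only: match a face to a known person by feature similarity,
# then look up that person's stored average predicted age (step S12).
# Feature extraction (e.g. reducing SIFT descriptors to one vector) is assumed
# to happen elsewhere; the threshold and data layout are hypothetical choices.
import numpy as np

def match_person(face_feature, person_features, threshold=0.8):
    """Return the id of the best-matching person, or None if nothing is close enough."""
    best_id, best_sim = None, -1.0
    for person_id, feat in person_features.items():
        sim = float(np.dot(face_feature, feat) /
                    (np.linalg.norm(face_feature) * np.linalg.norm(feat) + 1e-12))
        if sim > best_sim:
            best_id, best_sim = person_id, sim
    return best_id if best_sim >= threshold else None

def get_average_predicted_age(face_feature, person_features, average_ages):
    """Look up the average predicted age for the matched person, if any."""
    person_id = match_person(face_feature, person_features)
    return None if person_id is None else average_ages[person_id]
```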
In step S13, an age prediction result corresponding to the face in the target image is determined based on the current prediction age and the average prediction age.
In a possible implementation manner, the current predicted age and an average value of the average predicted ages may be used as an age prediction result corresponding to the face in the target image.
In another possible implementation manner, determining an age prediction result corresponding to a face in a target image according to the current prediction age and the average prediction age may include: and under the condition that the difference value between the current prediction age and the average prediction age meets the condition, taking the current prediction age as the age prediction result corresponding to the face in the target image.
As an example of this implementation, the condition may be: the absolute value of the difference between the current predicted age and the average predicted age is smaller than N times the predicted age variance corresponding to the face, where N is a positive integer. For example, with N equal to 3, when $|a - \bar{a}| < 3\alpha$, it may be determined that the difference between the current predicted age and the average predicted age meets the condition, where $a$ denotes the current predicted age, $\bar{a}$ denotes the average predicted age, and $\alpha$ denotes the predicted age variance corresponding to the face.
In another possible implementation manner, determining an age prediction result corresponding to a face in a target image according to the current predicted age and the average predicted age may include: under the condition that the difference between the current predicted age and the average predicted age does not meet the condition, determining an age prediction result corresponding to the face in the target image according to the current predicted age and the average predicted age. For example, when $|a - \bar{a}| \geq N\alpha$, it may be determined that the difference between the current predicted age and the average predicted age does not meet the condition, where $a$ denotes the current predicted age, $\bar{a}$ denotes the average predicted age, and $\alpha$ denotes the predicted age variance corresponding to the face.
As an example of this implementation, the age prediction result corresponding to the face in the target image may be determined, according to the current predicted age and the average predicted age, as a combination of the current predicted age $a$ and the average predicted age $\bar{a}$ weighted by a positive number $M$. For example, $M$ equals 4.
According to this implementation, when the difference between the current predicted age and the average predicted age does not meet the condition, the age prediction result corresponding to the face in the target image is determined from both the current predicted age and the average predicted age, which avoids large deviations in the age prediction caused by the shooting angle of the face in the target image, the illumination intensity, the background, or the prediction method applied to the target image. Because the age prediction result of this embodiment is based not only on the average predicted age but also on the current predicted age, a more reliable age prediction result can be obtained even when the number of images used for calculating the average predicted age is small.
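The decision logic of step S13 can be sketched as follows (illustrative only; the patent text above does not reproduce its exact fallback formula, so the weighted average (a + M·ā)/(1 + M) used below, together with N = 3 and M = 4, is an assumed example of combining the two values):

```python
# Illustrative sketch of step S13: trust the current prediction when it is close
# to the stored average; otherwise fall back to a combination of both values.
# The specific fallback formula below is an assumption for illustration.
def age_prediction_result(current_age, avg_age, variance, n=3, m=4):
    if abs(current_age - avg_age) < n * variance:
        # Difference meets the condition: use the current predicted age as-is.
        return current_age
    # Difference does not meet the condition: combine current and average,
    # weighting the average more heavily to damp an outlier prediction.
    return (current_age + m * avg_age) / (1 + m)
```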
According to this embodiment, the age of the face in the target image is predicted to obtain the current predicted age, the average predicted age corresponding to the face is obtained according to the feature information of the face, and the age prediction result corresponding to the face in the target image is determined according to the current predicted age and the average predicted age, so that the accuracy and reliability of the age prediction can be improved.
Fig. 2 is an exemplary flowchart illustrating a method of image-based age prediction according to an exemplary embodiment. As shown in fig. 2, the method may include steps S21 through S25.
In step S21, the age of the face in the target image is predicted, and the current predicted age is obtained.
Wherein, for step S21, refer to the description above for step S11.
In step S22, an average predicted age corresponding to the face is obtained based on the feature information of the face.
Wherein, for step S22, refer to the description above for step S12.
In step S23, the predicted age variance corresponding to the face is determined based on the predicted age corresponding to each image including the face in the designated album and the average predicted age corresponding to the face.
In one possible implementation, the designated album may be a local album of the terminal device.
In another possible implementation manner, the designated album may be a cloud album corresponding to the target user.
In another possible implementation manner, the designated album may include a local album of the terminal device and a cloud album corresponding to the target user.
It should be noted that although the designated photo album is described above in the above three implementations, those skilled in the art will understand that the present disclosure should not be limited thereto. Those skilled in the art can set the designated photo album according to the requirements of the actual application scene and/or personal preferences.
In this embodiment, the predicted age variance corresponding to the face can be expressed as $\alpha = \frac{1}{n}\sum_{i=1}^{n}\left(a_i' - \bar{a}\right)^2$, where $a_i'$ represents the predicted age corresponding to the face in the $i$-th image in the image set corresponding to the face, $\bar{a}$ represents the average predicted age corresponding to the face, and $n$ represents the total number of images in the image set corresponding to the face.
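A minimal sketch of this computation, assuming the corrected per-image predicted ages are available as a plain list of numbers:

```python
# Illustrative sketch of step S23: predicted age variance over the images of one face.
def predicted_age_variance(corrected_ages):
    """corrected_ages: per-image predicted ages a_i' for one face (list of floats)."""
    n = len(corrected_ages)
    mean_age = sum(corrected_ages) / n
    return sum((a - mean_age) ** 2 for a in corrected_ages) / n
```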
In step S24, when the difference between the current predicted age and the average predicted age meets the condition, the current predicted age is taken as the age prediction result corresponding to the face in the target image. The condition is that the absolute value of the difference between the current predicted age and the average predicted age is smaller than N times the predicted age variance corresponding to the face, where N is a positive integer.
In step S25, when the difference between the current predicted age and the average predicted age does not satisfy the condition, the age prediction result corresponding to the face in the target image is determined based on the current predicted age and the average predicted age.
Fig. 3 is an exemplary flowchart illustrating a method of image-based age prediction according to an exemplary embodiment. As shown in fig. 3, the method may include steps S31 through S35.
In step S31, the age of the face in the target image is predicted, and the current predicted age is obtained.
Wherein, for step S31, refer to the description above for step S11.
In step S32, the age of the face in each image including the face in the designated album is predicted, and the predicted age corresponding to each image including the face is obtained.
In this embodiment, the faces in the designated album may be clustered to obtain an image set corresponding to each person. For example, the image set corresponding to person B includes image 1, image 2, image 3, image 4, and image 5. By predicting the age of the face in each image in the image set corresponding to person B, the predicted age $a_1$ corresponding to the face of person B in image 1, the predicted age $a_2$ corresponding to the face of person B in image 2, the predicted age $a_3$ corresponding to the face of person B in image 3, the predicted age $a_4$ corresponding to the face of person B in image 4, and the predicted age $a_5$ corresponding to the face of person B in image 5 can be obtained.
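As an illustration of how such per-person image sets might be formed (the patent does not prescribe a specific clustering method here), the sketch below groups album faces greedily by feature-vector similarity; the threshold and the running-mean cluster centers are assumptions of the example:

```python
# Illustrative sketch only: greedy grouping of album face images by person,
# based on similarity of per-face feature vectors.
import numpy as np

def cluster_faces(features, threshold=0.8):
    """features: list of 1-D feature vectors, one per face image in the album.
    Returns a list of clusters, each a list of image indices for one person."""
    clusters, centers = [], []
    for idx, feat in enumerate(features):
        feat = np.asarray(feat, dtype=float)
        feat = feat / (np.linalg.norm(feat) + 1e-12)
        best, best_sim = None, -1.0
        for c_idx, center in enumerate(centers):
            sim = float(np.dot(feat, center))
            if sim > best_sim:
                best, best_sim = c_idx, sim
        if best is not None and best_sim >= threshold:
            clusters[best].append(idx)
            # Move the cluster center toward the new member and renormalize.
            k = len(clusters[best])
            center = (centers[best] * (k - 1) + feat) / k
            centers[best] = center / (np.linalg.norm(center) + 1e-12)
        else:
            # No existing cluster is similar enough: start a new person.
            clusters.append([idx])
            centers.append(feat)
    return clusters
```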
In step S33, the average predicted age corresponding to the face is determined based on the predicted ages corresponding to the respective images including the face.
In a possible implementation manner, the corrected age corresponding to each image including the face may be determined according to the predicted age corresponding to that image and the difference between its shooting time and the current time. For example, if the difference between the shooting time of image 1 and the current time is $\Delta a_1$, the corrected age corresponding to person B in image 1 can be determined to be $a_1' = a_1 + \Delta a_1$; if the difference between the shooting time of image 2 and the current time is $\Delta a_2$, the corrected age corresponding to person B in image 2 is $a_2' = a_2 + \Delta a_2$; the corrected age corresponding to person B in image 3 is $a_3' = a_3 + \Delta a_3$; the corrected age corresponding to person B in image 4 is $a_4' = a_4 + \Delta a_4$; and the corrected age corresponding to person B in image 5 is $a_5' = a_5 + \Delta a_5$. The average value of the corrected ages corresponding to the images containing the face is then calculated to obtain the average predicted age corresponding to the face. For example, the average predicted age corresponding to the face of person B is $\bar{a} = \frac{1}{5}\sum_{i=1}^{5} a_i'$.
In this embodiment, the average predicted age corresponding to the face can be expressed as $\bar{a} = \frac{1}{n}\sum_{i=1}^{n}\left(a_i + \Delta a_i\right)$, where $a_i$ represents the predicted age corresponding to the face in the $i$-th image in the image set corresponding to the face, $\Delta a_i$ represents the difference between the shooting time of the $i$-th image in the image set corresponding to the face and the current time, and $n$ represents the total number of images in the image set corresponding to the face.
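A minimal sketch of this time-corrected averaging (steps S32 and S33), assuming the per-image predicted ages and shooting timestamps are available; converting elapsed time to years via 365.25 days per year is an assumption of the example:

```python
# Illustrative sketch: correct each per-image predicted age by the time elapsed
# since the photo was taken, then average the corrected ages (a_i' = a_i + Δa_i).
from datetime import datetime

def average_predicted_age(predicted_ages, shooting_times, now=None):
    """predicted_ages: list of per-image predicted ages a_i for one face.
    shooting_times: list of datetime objects giving when each image was taken."""
    now = now or datetime.now()
    corrected = []
    for age, taken in zip(predicted_ages, shooting_times):
        delta_years = (now - taken).days / 365.25  # Δa_i: elapsed time in years
        corrected.append(age + delta_years)
    return sum(corrected) / len(corrected)
```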
In step S34, an average predicted age corresponding to the face is obtained based on the feature information of the face.
Wherein, for step S34, refer to the description above for step S12.
In step S35, an age prediction result corresponding to the face in the target image is determined based on the current prediction age and the average prediction age.
Wherein, for step S35, refer to the description above for step S13.
Fig. 4 is an exemplary flowchart illustrating a method of image-based age prediction according to an exemplary embodiment. As shown in fig. 4, the method may include steps S41 through S47.
In step S41, the age of the face in the target image is predicted, and the current predicted age is obtained.
Wherein, for step S41, refer to the description above for step S11.
In step S42, the age of the face in each image including the face in the designated album is predicted, and the predicted age corresponding to each image including the face is obtained.
Wherein, for step S42, refer to the description above for step S32.
In step S43, the average predicted age corresponding to the face is determined based on the predicted ages corresponding to the respective images including the face.
Wherein, for step S43, refer to the description above for step S33.
In step S44, the predicted age variance corresponding to the face is determined based on the predicted age corresponding to each image including the face in the designated album and the average predicted age corresponding to the face.
Wherein, for step S44, refer to the description above for step S23.
In step S45, an average predicted age corresponding to the face is obtained based on the feature information of the face.
Wherein, for step S45, refer to the description above for step S12.
In step S46, when the difference between the current predicted age and the average predicted age meets the condition, the current predicted age is taken as the age prediction result corresponding to the face in the target image. The condition is that the absolute value of the difference between the current predicted age and the average predicted age is smaller than N times the predicted age variance corresponding to the face, where N is a positive integer.
Wherein, for step S46, refer to the description above for step S24.
In step S47, when the difference between the current predicted age and the average predicted age does not satisfy the condition, the age prediction result corresponding to the face in the target image is determined based on the current predicted age and the average predicted age.
Wherein, for step S47, refer to the description above for step S25.
Fig. 5 is a block diagram illustrating an image-based age prediction apparatus according to an exemplary embodiment. Referring to fig. 5, the apparatus includes a first prediction module 51, an acquisition module 52, and a first determination module 53.
The first prediction module 51 is configured to predict the age of the face in the target image and obtain the current predicted age.
The obtaining module 52 is configured to obtain an average predicted age corresponding to the face according to the feature information of the face.
The first determining module 53 is configured to determine an age prediction result corresponding to the face in the target image according to the current prediction age and the average prediction age.
Fig. 6 is an exemplary block diagram illustrating an image-based age prediction apparatus according to an exemplary embodiment. As shown in fig. 6:
in one possible implementation, the first determining module 53 includes a first determining submodule 531.
The first determining submodule 531 is configured to, in a case where a difference between the present predicted age and the average predicted age meets a condition, regard the present predicted age as an age prediction result corresponding to the face in the target image.
In one possible implementation, the first determination module 53 includes a second determination submodule 532.
The second determining submodule 532 is configured to determine an age prediction result corresponding to the face in the target image according to the current predicted age and the average predicted age in a case where a difference between the current predicted age and the average predicted age does not meet a condition.
In one possible implementation, the condition is: the absolute value of the difference value between the current predicted age and the average predicted age is smaller than N times of the variance of the predicted age corresponding to the face, wherein N is a positive integer.
In one possible implementation, the apparatus further includes a second determining module 54.
The second determining module 54 is configured to determine the predicted age variance corresponding to the face according to the predicted age corresponding to each image containing the face in the designated album and the average predicted age corresponding to the face.
In one possible implementation, the apparatus further includes a second prediction module 55 and a third determination module 56.
The second prediction module 55 is configured to predict the age of the face in each image containing the face in the designated album, and obtain the predicted age corresponding to each image containing the face.
The third determining module 56 is configured to determine an average predicted age corresponding to the face according to the predicted ages corresponding to the images containing the face.
With regard to the apparatus in the above-described embodiment, the specific manner in which each module performs the operation has been described in detail in the embodiment related to the method, and will not be elaborated here.
According to this embodiment, the age of the face in the target image is predicted to obtain the current predicted age, the average predicted age corresponding to the face is obtained according to the feature information of the face, and the age prediction result corresponding to the face in the target image is determined according to the current predicted age and the average predicted age, so that the accuracy and reliability of the age prediction can be improved.
Fig. 7 is a block diagram illustrating an apparatus 800 for age prediction of an image according to an example embodiment. For example, the apparatus 800 may be a mobile phone, a computer, a digital broadcast terminal, a messaging device, a game console, a tablet device, a medical device, an exercise device, a personal digital assistant, and the like.
Referring to fig. 7, the apparatus 800 may include one or more of the following components: processing component 802, memory 804, power component 806, multimedia component 808, audio component 810, input/output (I/O) interface 812, sensor component 814, and communication component 816.
The processing component 802 generally controls overall operation of the device 800, such as operations associated with display, telephone calls, data communications, camera operations, and recording operations. The processing components 802 may include one or more processors 820 to execute instructions to perform all or a portion of the steps of the methods described above. Further, the processing component 802 can include one or more modules that facilitate interaction between the processing component 802 and other components. For example, the processing component 802 can include a multimedia module to facilitate interaction between the multimedia component 808 and the processing component 802.
The memory 804 is configured to store various types of data to support operations at the apparatus 800. Examples of such data include instructions for any application or method operating on device 800, contact data, phonebook data, messages, pictures, videos, and so forth. The memory 804 may be implemented by any type or combination of volatile or non-volatile memory devices such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic or optical disks.
Power components 806 provide power to the various components of device 800. The power components 806 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for the apparatus 800.
The multimedia component 808 includes a screen that provides an output interface between the device 800 and a user. In some embodiments, the screen may include a Liquid Crystal Display (LCD) and a Touch Panel (TP). If the screen includes a touch panel, the screen may be implemented as a touch screen to receive an input signal from a user. The touch panel includes one or more touch sensors to sense touch, slide, and gestures on the touch panel. The touch sensor may not only sense the boundary of a touch or slide action, but also detect the duration and pressure associated with the touch or slide operation. In some embodiments, the multimedia component 808 includes a front facing camera and/or a rear facing camera. The front camera and/or the rear camera may receive external multimedia data when the device 800 is in an operating mode, such as a shooting mode or a video mode. Each front camera and rear camera may be a fixed optical lens system or have a focal length and optical zoom capability.
The audio component 810 is configured to output and/or input audio signals. For example, the audio component 810 includes a Microphone (MIC) configured to receive external audio signals when the apparatus 800 is in an operational mode, such as a call mode, a recording mode, and a voice recognition mode. The received audio signals may further be stored in the memory 804 or transmitted via the communication component 816. In some embodiments, audio component 810 also includes a speaker for outputting audio signals.
The I/O interface 812 provides an interface between the processing component 802 and peripheral interface modules, which may be keyboards, click wheels, buttons, etc. These buttons may include, but are not limited to: a home button, a volume button, a start button, and a lock button.
The sensor assembly 814 includes one or more sensors for providing status assessments of various aspects of the device 800. For example, the sensor assembly 814 may detect the open/closed status of the device 800 and the relative positioning of components, such as the display and keypad of the device 800; the sensor assembly 814 may also detect a change in the position of the device 800 or of a component of the device 800, the presence or absence of user contact with the device 800, the orientation or acceleration/deceleration of the device 800, and a change in the temperature of the device 800. The sensor assembly 814 may include a proximity sensor configured to detect the presence of a nearby object without any physical contact. The sensor assembly 814 may also include a light sensor, such as a CMOS or CCD image sensor, for use in imaging applications. In some embodiments, the sensor assembly 814 may also include an acceleration sensor, a gyroscope sensor, a magnetic sensor, a pressure sensor, or a temperature sensor.
The communication component 816 is configured to facilitate communications between the apparatus 800 and other devices in a wired or wireless manner. The device 800 may access a wireless network based on a communication standard, such as WiFi, 2G or 3G, or a combination thereof. In an exemplary embodiment, the communication component 816 receives a broadcast signal or broadcast related information from an external broadcast management system via a broadcast channel. In an exemplary embodiment, the communication component 816 further includes a Near Field Communication (NFC) module to facilitate short-range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, Ultra Wideband (UWB) technology, Bluetooth (BT) technology, and other technologies.
In an exemplary embodiment, the apparatus 800 may be implemented by one or more Application Specific Integrated Circuits (ASICs), Digital Signal Processors (DSPs), Digital Signal Processing Devices (DSPDs), Programmable Logic Devices (PLDs), Field Programmable Gate Arrays (FPGAs), controllers, micro-controllers, microprocessors or other electronic components for performing the above-described methods.
In an exemplary embodiment, a non-transitory computer-readable storage medium comprising instructions, such as the memory 804 comprising instructions, executable by the processor 820 of the device 800 to perform the above-described method is also provided. For example, the non-transitory computer readable storage medium may be a ROM, a Random Access Memory (RAM), a CD-ROM, a magnetic tape, a floppy disk, an optical data storage device, and the like.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure disclosed herein. This application is intended to cover any variations, uses, or adaptations of the disclosure following, in general, the principles of the disclosure and including such departures from the present disclosure as come within known or customary practice within the art to which the disclosure pertains. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the disclosure being indicated by the following claims.
It will be understood that the present disclosure is not limited to the precise arrangements described above and shown in the drawings and that various modifications and changes may be made without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (10)

1. An image-based age prediction method, comprising:
predicting the age of the face in the target image to obtain the predicted age;
acquiring an average predicted age corresponding to the face according to the feature information of the face, wherein the average predicted age corresponding to the face is obtained by predicting the age of the face in each image containing the face in a designated album, so as to obtain the predicted age corresponding to each image containing the face, and the average predicted age is determined according to the predicted age corresponding to each image containing the face;
determining an age prediction result corresponding to the face in the target image according to the current prediction age and the average prediction age;
wherein, the determining an age prediction result corresponding to the face in the target image according to the current prediction age and the average prediction age includes:
and when the difference value between the current prediction age and the average prediction age meets the condition, taking the current prediction age as an age prediction result corresponding to the face in the target image.
2. The method according to claim 1, wherein determining an age prediction result corresponding to the face in the target image according to the current prediction age and the average prediction age comprises:
and under the condition that the difference value between the current prediction age and the average prediction age does not meet the condition, determining an age prediction result corresponding to the face in the target image according to the current prediction age and the average prediction age.
3. The method according to claim 1 or 2, characterized in that the conditions are:
the absolute value of the difference value between the current predicted age and the average predicted age is smaller than N times of the variance of the predicted age corresponding to the face, wherein N is a positive integer.
4. The method of claim 3, further comprising:
and determining the predicted age variance corresponding to the face according to the predicted age corresponding to each image containing the face in the designated photo album and the average predicted age corresponding to the face.
5. An image-based age prediction apparatus, comprising:
the first prediction module is used for predicting the age of the face in the target image to obtain the predicted age;
the acquisition module is used for acquiring the average predicted age corresponding to the face according to the feature information of the face;
the first determining module is used for determining an age prediction result corresponding to the face in the target image according to the current prediction age and the average prediction age;
wherein the first determining module comprises:
the first determining submodule is used for taking the current prediction age as an age prediction result corresponding to the face in the target image under the condition that the difference value between the current prediction age and the average prediction age meets the condition;
the device further comprises: the second prediction module is used for predicting the age of the face in each image containing the face in the designated photo album to obtain the predicted age corresponding to each image containing the face;
and the third determining module is used for determining the average predicted age corresponding to the face according to the predicted age corresponding to each image containing the face.
6. The apparatus of claim 5, wherein the first determining module comprises:
and the second determining submodule is used for determining an age prediction result corresponding to the face in the target image according to the current prediction age and the average prediction age under the condition that the difference value between the current prediction age and the average prediction age does not meet the condition.
7. The apparatus according to claim 5 or 6, wherein the condition is:
the absolute value of the difference value between the current predicted age and the average predicted age is smaller than N times of the variance of the predicted age corresponding to the face, wherein N is a positive integer.
8. The apparatus of claim 7, further comprising:
and the second determining module is used for determining the predicted age variance corresponding to the face according to the predicted age corresponding to each image containing the face in the designated photo album and the average predicted age corresponding to the face.
9. An image-based age prediction apparatus, comprising:
a processor;
a memory for storing processor-executable instructions;
wherein the processor is configured to perform the method of any one of claims 1 to 4.
10. A non-transitory computer readable storage medium having instructions therein, which when executed by a processor, enable the processor to perform the method of any one of claims 1 to 4.
CN201711249580.8A 2017-12-01 2017-12-01 Age prediction method and device based on image Active CN108108668B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201711249580.8A CN108108668B (en) 2017-12-01 2017-12-01 Age prediction method and device based on image

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201711249580.8A CN108108668B (en) 2017-12-01 2017-12-01 Age prediction method and device based on image

Publications (2)

Publication Number Publication Date
CN108108668A (en) 2018-06-01
CN108108668B (en) 2021-04-27

Family

ID=62208906

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201711249580.8A Active CN108108668B (en) 2017-12-01 2017-12-01 Age prediction method and device based on image

Country Status (1)

Country Link
CN (1) CN108108668B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109034078B (en) * 2018-08-01 2023-07-14 腾讯科技(深圳)有限公司 Training method of age identification model, age identification method and related equipment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103886284A (en) * 2014-03-03 2014-06-25 小米科技有限责任公司 Character attribute information identification method and device and electronic device
CN106203306A (en) * 2016-06-30 2016-12-07 北京小米移动软件有限公司 The Forecasting Methodology at age, device and terminal

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101584575B (en) * 2009-06-19 2011-05-04 无锡骏聿科技有限公司 Age assessment method based on face recognition technology
JP6361387B2 (en) * 2014-09-05 2018-07-25 オムロン株式会社 Identification device and control method of identification device
JP2016177394A (en) * 2015-03-19 2016-10-06 カシオ計算機株式会社 Information processing apparatus, age estimation method, and program
CN105678253B (en) * 2016-01-04 2019-01-18 东南大学 Semi-supervised face age estimation device and semi-supervised face age estimation method

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103886284A (en) * 2014-03-03 2014-06-25 小米科技有限责任公司 Character attribute information identification method and device and electronic device
CN106203306A (en) * 2016-06-30 2016-12-07 北京小米移动软件有限公司 The Forecasting Methodology at age, device and terminal

Also Published As

Publication number Publication date
CN108108668A (en) 2018-06-01


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant