CN108881544B - Photographing method and mobile terminal - Google Patents

Photographing method and mobile terminal Download PDF

Info

Publication number
CN108881544B
CN108881544B CN201810697279.1A CN201810697279A CN108881544B CN 108881544 B CN108881544 B CN 108881544B CN 201810697279 A CN201810697279 A CN 201810697279A CN 108881544 B CN108881544 B CN 108881544B
Authority
CN
China
Prior art keywords
image
matching degree
target object
display
real
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201810697279.1A
Other languages
Chinese (zh)
Other versions
CN108881544A (en
Inventor
吴丽芳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Vivo Mobile Communication Co Ltd
Original Assignee
Vivo Mobile Communication Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Vivo Mobile Communication Co Ltd filed Critical Vivo Mobile Communication Co Ltd
Priority to CN201810697279.1A priority Critical patent/CN108881544B/en
Publication of CN108881544A publication Critical patent/CN108881544A/en
Application granted granted Critical
Publication of CN108881544B publication Critical patent/CN108881544B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Links

Images

Classifications

    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0264Details of the structure or mounting of specific components for a camera module assembly
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04MTELEPHONIC COMMUNICATION
    • H04M1/00Substation equipment, e.g. for use by subscribers
    • H04M1/02Constructional features of telephone sets
    • H04M1/0202Portable telephone sets, e.g. cordless phones, mobile phones or bar type handsets
    • H04M1/026Details of the structure or mounting of specific components
    • H04M1/0266Details of the structure or mounting of specific components for a display module assembly
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04NPICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N23/00Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N23/60Control of cameras or camera modules
    • H04N23/64Computer-aided capture of images, e.g. transfer from script file into camera, check of taken image quality, advice or proposal for image composition or decision on when to take image

Abstract

The embodiment of the invention provides a photographing method and a mobile terminal, wherein the method comprises the following steps: acquiring a reference image and a real-time image acquired by an image acquisition module; identifying one or more reference objects from the reference image and one or more target objects from the real-time image; determining a first image including at least one target object and a second image including a reference object corresponding to the target object; calculating the image matching degree of the first image and the second image; and when the image matching degree is greater than or equal to a preset matching degree threshold value, generating a photo according to the real-time image. Therefore, the shooting angle of the image acquisition module can be guided to a photographer or the posture or the position of the target object can be correctly adjusted by the photographer according to the image matching degree of the reference object in the reference image and the target object in the real-time image, and the shooting efficiency is improved.

Description

Photographing method and mobile terminal
Technical Field
The embodiment of the invention relates to the field of mobile terminal photographing, in particular to a photographing method and a mobile terminal.
Background
With the popularization of mobile terminals with a photographing function, people are in daily life, for example: occasions such as travel, dining, parties and the like can choose to use the mobile terminal to take pictures. However, when the mobile terminal is used to take a picture for others, the view finding composition of the photographer is different from the conception of the photographed person, the photographed picture is often not satisfactory for the photographed person, and the photographed person is difficult to make the photographer clear the conception of the photographer by only using the language, and needs to take a plurality of pictures, and then selects a picture meeting the requirements from the plurality of pictures, so that the photographing efficiency is low.
Disclosure of Invention
The embodiment of the invention provides a photographing method and a mobile terminal, and solves the problem of low photographing efficiency.
According to a first aspect of the embodiments of the present invention, there is provided a method for taking a picture, which is applied to a mobile terminal, where the mobile terminal includes: display screen and image acquisition module, the method includes: acquiring a reference image and a real-time image acquired by the image acquisition module; identifying one or more reference objects from the reference image and one or more target objects from the real-time image; determining a first image including at least one target object and a second image including a reference object corresponding to the at least one target object; calculating the image matching degree of the first image and the second image; and when the image matching degree of the first image and the second image is greater than or equal to a preset matching degree threshold value, generating a photo according to the real-time image.
According to a second aspect of the embodiments of the present invention, there is provided a mobile terminal, including a display screen and an image capturing module, the mobile terminal further including: the first acquisition module is used for acquiring a reference image and a real-time image acquired by the image acquisition module; an identification module for identifying one or more reference objects from the reference image and one or more target objects from the real-time image; a first determination module for determining a first image comprising at least one target object and a second image comprising a reference object corresponding to the at least one target object; the calculating module is used for calculating the image matching degree of the first image and the second image; and the first generation module is used for generating a photo according to the real-time image when the image matching degree of the first image and the second image is greater than or equal to a preset matching degree threshold value.
According to a third aspect of the embodiments of the present invention, there is provided another mobile terminal including a processor, a memory, and a computer program stored on the memory and operable on the processor, wherein the computer program, when executed by the processor, implements the steps of the method for photographing according to the first aspect.
Therefore, according to the image matching degree of the reference object in the reference image and the target object in the real-time image, a photographer can be guided to correctly adjust the shooting angle of the image acquisition module or the target object, for example, the target object is guided to adjust the posture or the target object is guided to adjust the position, and the like, so that the photographer can shoot an expected photo according to the reference image, and the shooting efficiency is improved; furthermore, the photographing can be automatically carried out according to the image matching degree between the reference object and the target object, so that the artificial subjective judgment is avoided, the photographing process is simplified, and the use experience of a user is improved.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings used in the description of the embodiments of the present invention will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art that other drawings can be obtained based on these drawings without creative efforts.
Fig. 1 is a schematic flow chart of a photographing method according to an embodiment of the present invention;
fig. 2 is a schematic view of a picture displayed in a display screen according to an embodiment of the present invention;
fig. 3 is a second schematic flowchart of a photographing method according to an embodiment of the present invention;
fig. 4 is a third schematic flowchart of a photographing method according to an embodiment of the present invention;
fig. 5 is a second schematic view of a display screen according to the embodiment of the present invention;
fig. 6a is a third schematic view of a picture displayed in the display screen according to the embodiment of the present invention;
FIG. 6b is a fourth schematic view of a display screen according to an embodiment of the present invention;
FIG. 6c is a fifth schematic view of a screen displayed on the display screen according to the embodiment of the present invention;
fig. 7 is a schematic structural diagram of a mobile terminal according to an embodiment of the present invention;
fig. 8 is a second schematic structural diagram of a mobile terminal according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
Referring to fig. 1, an embodiment of the present invention provides a method for taking a picture, where an execution subject of the method may be a mobile terminal, and the mobile terminal includes: the mobile terminal can be a mobile phone, a tablet computer, a notebook computer, a palm computer or wearable equipment and the like, and the method comprises the following specific steps:
step 101: acquiring a reference image and a real-time image acquired by an image acquisition module;
in the embodiment of the present invention, the reference image is used to guide the photographer to take a correct photo, the reference image may be an image selected by the photographer or the photographed person from an image library of the mobile terminal, or the reference image may also be an image determined by the photographer or the photographed person according to a preview image displayed in a display screen, and optionally, the preview image may not be stored on the mobile terminal, so that the storage space of the mobile terminal can be saved. It should be noted that, in the embodiment of the present invention, a manner of acquiring the reference image is not particularly limited.
In the embodiment of the invention, the real-time image is an image acquired by an image acquisition module in real time, and the image acquisition module can be a front camera or a rear camera of the mobile terminal.
Step 102: identifying one or more reference objects from the reference image and one or more target objects from the real-time image;
in the embodiment of the present invention, the reference object is an object that can serve as a reference for guiding a photographer to perform framing photography so that the photographer can take a picture that meets the requirements of the photographer or a photographed person based on the reference object, for example, a building, a vehicle, an animal, a plant, a person, or the like.
In the embodiment of the present invention, the target object refers to an object captured in a real-time image, and may be, for example, a building, a vehicle, an animal, a plant, a person, or the like.
Specifically, in step 102, one or more reference objects are identified from the reference image and one or more target objects are identified from the real-time image according to object types, wherein the object types include any one or more of the following combinations: static, dynamic, and portrait. Static means that the device is currently in a static state, and a static reference object or target object may be: buildings, stationary vehicles, stationary animals, stationary plants, and the like. Dynamic means that the current state is in motion, and the dynamic reference object or target object may be: fallen leaves, a running car, a running animal, etc.
In the embodiment of the present invention, the reference object and the target object may be recognized from the reference image and the real-time image respectively by using an existing image recognition algorithm, for example: the image recognition algorithm may be an existing template matching algorithm, but is not limited thereto.
Step 103: determining a first image including at least one target object and a second image including a reference object corresponding to the at least one target object;
in the embodiment of the present invention, the reference object corresponding to the target object may be determined in the following two ways:
the first mode is that a reference object corresponding to the target object is determined according to the image recognition results of the target object and the reference object, and the image recognition result can be obtained through an edge tracking algorithm.
The target object and the reference object corresponding to the target object have the same image recognition result, which may be that the target object and the reference object have the same outline (shape) or contain the same characters, for example: the reference image includes an AA store (reference object), the real-time image also includes an AA store (target object), and the AA store in the real-time image corresponds to the AA store in the reference image.
And secondly, determining the reference object corresponding to the target object according to the object types and the display information of the target object and the reference object.
Specifically, in step 103, the object type of each target object and each reference object is obtained; acquiring display information of the first image of each target object and the second image of each reference object in the second display area; and determining a first image of at least one target object and a second image of a reference object corresponding to the at least one target object according to the object type and the display information.
In an embodiment of the present invention, the first image of the target object is a real-time image of the target object or a contour image of the target object acquired by the image acquisition module.
In an embodiment of the present invention, the second image of the reference object may be a silhouette image of the reference object or a semi-transparent image of the reference object.
Referring to fig. 2, a first image 201 of the target object and a second image 202 of the reference object are simultaneously displayed in a display screen 203, the second image 202 being an outline image of the reference object, and optionally the second image 202 is displayed in the form of a dotted line in the display screen 203. Further, the second image 202 may be displayed in a predetermined display period and a predetermined blinking manner. The display form of the second image 202 is not particularly limited in the embodiment of the present invention.
It should be noted that fig. 2 only illustrates the case of one target object and one reference object, and the cases of multiple target objects and multiple reference objects are similar to this, and are not described again here.
Similarly, not all target objects have corresponding reference objects, for example: if there is a running puppy in the real-time image and there is no such puppy in the reference image, there is no corresponding reference object if the target object is such a running puppy. The mobile terminal may select another target object, and acquire a first image of the other target object and a second image of a reference object corresponding to the other target object. That is, in this case, the degree of image matching between the other target object and the corresponding reference object is not affected.
Step 104: calculating the image matching degree of the first image and the second image;
in the embodiment of the present invention, an image matching degree of a first image of at least one target object and a second image of a reference object corresponding to the target object may be calculated by an existing image matching algorithm, where the image matching is used to indicate similarity or consistency between the first image and the second image, for example: the image matching algorithm may be an edge tracking algorithm, or the image matching algorithm may be an image feature based matching algorithm, although not limited thereto.
Step 105: and when the image matching degree of the first image and the second image is greater than or equal to a preset matching degree threshold value, generating a photo according to the real-time image.
Taking the example that the reference image includes three reference objects, and the real-time image includes target objects respectively corresponding to the three reference objects, the image matching degree between the first image of at least one target object and the second image of the corresponding reference object in step 105 is greater than or equal to the preset matching degree threshold, which can be understood as:
(1) the image matching degree of the first image of any one target object and the second image of the corresponding reference object is greater than or equal to a preset matching degree threshold value;
(2) the matching degree of the first images of any two target objects and the second images of the corresponding reference objects is greater than or equal to a preset matching degree threshold value;
therefore, even if the image matching degree of a part of target objects and the reference objects cannot meet the requirement, automatic photographing can be finished.
(3) The image matching degree of the first images of all the target objects and the second images of the corresponding reference objects is larger than or equal to a preset matching degree threshold value.
The preset matching degree threshold value is as follows: the preset matching degree threshold corresponding to the object type of the target object or the reference object is, for example: the threshold value of the matching degree of the static target object or the reference object is 80% to 100%, the threshold value of the matching degree of the dynamic target object or the reference object is 30% to 50%, and the threshold value of the matching degree of the target object or the reference object of the portrait is 70% to 80%, which is not limited to this.
When the object type of the target object is dynamic, for example: for example, a running dog, a running car, or the like, for which there may be a corresponding reference object in the reference image, or there may be no corresponding reference object in the reference image, and therefore, the value of the matching threshold corresponding to the dynamic target object or reference object is low, for example: 30 to 50 percent.
When the object type of the target object is static, for example: buildings, stationary plants, and the like, for which there is generally a reference object corresponding to one of the objects in the reference image, the matching threshold value corresponding to the static object or reference object is high, for example: 80 to 100 percent.
When the type of the target object is a portrait, because there may be differences in body types or different postures, or it is often only necessary to ensure that the position where the portrait is located is consistent with the requirement of the person being photographed when photographing and framing, the value of the threshold value of the matching degree corresponding to the target object or the reference object of the portrait is moderate, for example: 60 to 80 percent.
Therefore, according to the image matching degree of the reference object in the reference image and the target object in the real-time image, a photographer can be guided to correctly adjust the shooting angle of the image acquisition module or the target object, for example, the target object is guided to adjust the posture or the target object is guided to adjust the position, and the like, so that the photographer can shoot an expected photo according to the reference image, and the shooting efficiency is improved; furthermore, the photographing can be automatically carried out according to the image matching degree between the reference object and the target object, so that the artificial subjective judgment is avoided, the photographing process is simplified, and the use experience of a user is improved.
Referring to fig. 3, an embodiment of the present invention provides another photographing method, where an execution subject of the method may be a mobile terminal, and the mobile terminal includes: the mobile terminal can be a mobile phone, a tablet computer, a notebook computer, a palm computer or wearable equipment and the like, and the method comprises the following specific steps:
step 301: acquiring a real-time image acquired by an image acquisition module, and then executing step 306; executing step 302 or step 304 while executing step 301, or executing step 302 or step 304 after executing step 301, or executing step 302 or step 304 before executing step 301;
step 302: receiving a first operation, and then executing step 303;
step 303: in response to the first operation, selecting an image as a reference image in an image library of the mobile terminal, and then executing step 306;
in this embodiment of the present invention, the first operation may be an operation of calling a local picture of the mobile terminal, and a manner of the first operation is not specifically limited in this embodiment of the present invention.
Step 304: receiving a second operation, and then executing step 305;
step 305: in response to the second operation, determining the preview image displayed in the display screen as a reference image, and then performing step 306;
in the embodiment of the present invention, the second operation may be an operation of acquiring a preview image displayed in the current display screen, where the preview image is not stored on the mobile terminal, so that the storage space of the mobile terminal can be saved. It should be noted that, the embodiment of the present invention does not specifically limit the second operation manner.
Step 306: identifying one or more reference objects from the reference image and one or more target objects from the real-time image;
step 307: determining a first image including at least one target object and a second image including a reference object corresponding to the target object;
step 308: calculating the image matching degree of the first image and the second image;
step 309: and when the image matching degree of the first image and the second image is greater than or equal to the matching degree threshold value, generating a photo according to the real-time image.
It should be noted that, the descriptions of step 306 to step 309 in fig. 3 may refer to step 102 to step 105 in fig. 1, and are not repeated herein.
Therefore, according to the image matching degree of the reference object in the reference image and the target object in the real-time image, a photographer can be guided to correctly adjust the shooting angle of the image acquisition module or the target object, for example, the target object is guided to adjust the posture or the target object is guided to adjust the position, and the like, so that the photographer can shoot an expected photo according to the reference image, and the shooting efficiency is improved; furthermore, the photographing can be automatically carried out according to the image matching degree between the reference object and the target object, so that the artificial subjective judgment is avoided, the photographing process is simplified, and the use experience of a user is improved.
Referring to fig. 4, an embodiment of the present invention provides another photographing method, where an execution subject of the method may be a mobile terminal, and the mobile terminal includes: display screen and image acquisition module, this display screen includes: the mobile terminal can be a mobile phone, a tablet computer, a notebook computer, a palm computer or wearable equipment and the like, and the method comprises the following specific steps:
step 401: acquiring a reference image and a real-time image acquired by an image acquisition module, and then executing step 402;
step 402: identifying one or more reference objects from the reference image and one or more target objects from the real-time image, performing step 403;
step 403: displaying the reference image in the first display area, and executing step 404;
in this embodiment of the present invention, the first display area may be displayed in a display screen in the form of a floating window, and the display form of the first display area is not specifically limited in this embodiment of the present invention.
Step 404: displaying a second image of at least one reference object in a second display area according to the reference image displayed in the first display area, and executing step 405;
in the implementation of the present invention, the second image of the reference object may be a contour image of the reference object, or may be a semi-transparent image of the reference object, and a display form of the second image of the object reference object in the embodiment of the present invention is not particularly limited. It should be noted that the display position of the second image of the reference object in the second display region may be determined based on the display position of the reference object in the first display region.
Step 405: displaying the first image of at least one target object in the real-time image in the second display area while displaying the second image, and then executing step 406;
referring to fig. 5, the display screen includes: the first display area 501 and the second display area 502, and optionally, the first display area 501 is located at the upper right corner of the second display area 502, but it is understood that the position relationship between the first display area and the second display area is not specifically limited in the embodiment of the present invention.
A reference image is displayed in the first display area 501, the reference image including: and a reference object 503, a second image 504 of the reference object 503 is displayed in the second display area 502, and the display position of the second image 504 in the second display area can be determined according to the display position of the reference object 503 in the first display area. For example: the second image 504 is a contour image of the reference object 503, optionally, the second image 504 is displayed in the form of a dotted line segment, and the first image 505 of the target object is also displayed in the second display area 502. It should be noted that, for clarity of description, only one reference object and one target object are shown in the drawings, and the same applies to the case of having multiple reference objects and multiple target objects, and details are not repeated here.
Step 406: acquiring the object type of each target object and each reference object, and then executing step 407;
in the embodiment of the present invention, the object type includes any one or more of the following combinations: static, dynamic, and portrait. Static means that the device is currently in a static state, and a static reference object or target object may be: buildings, stationary vehicles, stationary animals, stationary plants, and the like. Dynamic means that the current state is in motion, and the dynamic reference object or target object may be: fallen leaves, a running car, a running animal, etc.
Step 407: determining display information of the first image including each target object and the second image including each reference object in the second display region, and then performing step 408;
in the embodiment of the present invention, the display information may include attribute information (e.g., a name of the target object or the reference object) or a display position of the first image of the target object or the second image of the reference object.
With continued reference to fig. 5, the mobile terminal determines that the attribute information of the first image 505 and the second image 504 in the figure are both "houses", the display position of the first image 505 is the first position, and the display position of the second image 504 is the second position.
Step 408: determining a first image including at least one target object and a second image including a reference object corresponding to the target object according to the object type and the display information, and then performing step 409;
with continued reference to fig. 5, if the object type of the target object or the reference object is static, the attribute information of both the first image 505 and the second image 504 is "house" according to the display information of the first image 505 and the second image 504, and the first display position of the first image 505 and the second display position of the second image 504 are relatively close to each other, it can be determined that the reference object and the target object are corresponding to each other.
Step 409: calculating the image matching degree of the first image and the second image; when the image matching degree of the first image of the target object and the corresponding second image is greater than the preset matching degree threshold, executing step 410; when the image matching degree of the first image of the target object and the corresponding second image is smaller than a preset matching degree threshold value, executing step 411, or executing step 414, or executing step 416;
step 410, generating a photo according to the real-time image, and then ending the process;
continuing to refer to fig. 5, taking the object type of the target object as a static state as an example, the image matching degree of the first image 505 and the second image 504 is calculated, and the obtained image matching degree is 85%. And if the threshold value of the matching degree is 80%, the image matching degree of the first image 505 and the second image 504 exceeds the preset threshold value of the matching degree, and the mobile terminal automatically takes a picture.
Step 411: determining a non-matching region and/or a matching region of the first image and the second image; then, step 412 is performed;
in the embodiment of the present invention, the unmatched area is an area where the first image of the target object and the second image of the reference object do not coincide.
Referring to fig. 6a, a first reference object 603 and a second reference object 604 are displayed in the first display area 601; a first image 605 of the first target object, a first image 606 of the second target object, a second image of the first reference object 603, and a second image 607 of the second reference object 604 are displayed in the second display area 602 (because the object types of the first reference object and the first target object are static, the second image of the first reference object may completely coincide with the first image 605 of the first target object, and is therefore not shown in the drawing).
Since the object types of the second target object and the second reference object 604 are portraits and their "hand gestures" differ, the unmatched region may be the region where the "hand gesture" is located in the first image 606 of the second target object, i.e., region 6061; correspondingly, the matching region may be the region where the "hand gesture" is located in the second image 607 of the second reference object 604, i.e., region 6071.
Step 412: generating prompt information of the unmatched area and/or the matched area according to the unmatched area and/or the matched area; step 413 is then performed;
in the embodiment of the present invention, the mobile terminal may generate different prompt information for the unmatched area and/or the matched area, for example: displaying the matched area in green and the unmatched area in red. The content of the prompt information is not specifically limited in the embodiment of the present invention.
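One way to render such a prompt — illustrative only, since the patent leaves the prompt content open — is to paint the matched region green and the unmatched region red directly onto the preview frame:

```python
def annotate_prompt(image, matched_regions, unmatched_regions):
    """Step 412 sketch: paint matched regions green and unmatched regions
    red on an RGB image stored as a nested list, image[y][x] = (r, g, b).
    Regions are (x1, y1, x2, y2) rectangles; the colours and region format
    are assumptions."""
    GREEN, RED = (0, 255, 0), (255, 0, 0)

    def paint(region, colour):
        x1, y1, x2, y2 = region
        for y in range(y1, y2):
            for x in range(x1, x2):
                image[y][x] = colour

    for r in matched_regions:
        paint(r, GREEN)
    for r in unmatched_regions:
        paint(r, RED)  # e.g. the "hand gesture" region 6061
    return image
```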
Step 413: the prompt is output and the process may then end.
Step 414: receiving a photographing operation; step 415 is then performed;
in the embodiment of the present invention, this addresses the case where many target objects of the dynamic type exist in the real-time image, for example: the photographed person stands under a cherry tree while blossoms drift down continuously and several pedestrians pass by. In this case, the image matching degree between the first image and the second image may never reach the matching degree threshold even though the actual framing composition already meets the photographed person's requirement; the mobile terminal can then execute the photographing operation directly upon receiving a photographing operation.
Step 415: in response to the photographing operation, a photograph is generated from the real-time image, and then the flow may end.
Step 416: reducing or enlarging the second image according to a predetermined ratio to obtain a third image, and then performing step 417;
in the embodiment of the present invention, the predetermined ratio may be set to 10% or 20%, etc., i.e., the second image is reduced or enlarged by 10% or 20% at each step, although the ratio is not limited thereto.
In the embodiment of the present invention, this addresses some special cases where the target object is a portrait, for example: the photographed person in the reference image is a tall adult, while the photographed person in the actual shot may be a short child; even if the child stands in the same position as the person in the reference image, the image matching degree may remain below the preset matching degree threshold.
In view of the above, the mobile terminal may employ an image scaling algorithm to reduce or enlarge the second image according to a predetermined ratio, so as to improve the image matching degree of the portrait, for example, the image scaling algorithm may be a linear interpolation algorithm, but is not limited thereto.
Referring to fig. 6b, since the reference object 611 is a tall adult and the target object is a small child, the shapes of the first image 612 of the target object and the second image 613 of the reference object 611 differ considerably, resulting in an image matching degree lower than the preset matching degree threshold.
At this time, the mobile terminal may employ an existing image scaling algorithm to reduce the second image 613 to a third image 614 according to the predetermined ratio, as shown in fig. 6c; as long as the display positions of the first image 612 and the third image 614 are close to or coincide with each other, a high image matching degree can be obtained. It will be appreciated that the case where the second image needs to be enlarged is similar and is not described again here.
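Steps 416–417 can be sketched as an iterative rescaling loop. The sketch scales a bounding box rather than resampling pixels (a real implementation would resample the second image, e.g. by linear interpolation, as noted above); the 10% step and the retry limit are assumptions:

```python
def scale_box(box, factor):
    """Scale an axis-aligned box (x1, y1, x2, y2) about its centre."""
    x1, y1, x2, y2 = box
    cx, cy = (x1 + x2) / 2, (y1 + y2) / 2
    hw, hh = (x2 - x1) / 2 * factor, (y2 - y1) / 2 * factor
    return (cx - hw, cy - hh, cx + hw, cy + hh)

def match_with_rescaling(first, second, match_fn, threshold=0.8,
                         step=0.10, max_steps=8):
    """Steps 416/417 sketch: shrink, then enlarge, the second image's box
    in 10% steps until the matching degree reaches the threshold.
    `match_fn` is whatever matching-degree metric the terminal uses.
    Returns the third image's box on success, None otherwise."""
    for direction in (1 - step, 1 + step):  # try shrinking, then enlarging
        third = second
        for _ in range(max_steps):
            third = scale_box(third, direction)
            if match_fn(first, third) >= threshold:
                return third  # matching degree reached: take the photo
    return None  # fall back to prompting or a manual photographing operation
```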
Step 417: and when the image matching degree of the first image and the third image is greater than or equal to the preset matching degree threshold value, generating a photo according to the real-time image, and then ending the process.
In this way, the mobile terminal simultaneously displays, in the display screen, a first image of at least one target object and a second image of the reference object corresponding to the target object, calculates the image matching degree of the first image and the second image, and can automatically take a picture when the image matching degree is greater than or equal to the preset matching degree threshold. When the image matching degree is smaller than the preset matching degree threshold, the mobile terminal may output prompt information according to the matching area and/or the unmatched area between the first image and the second image, guiding the photographer to correctly adjust the shooting angle of the image acquisition module or the target object; or the mobile terminal may scale the second image according to the predetermined ratio, making the image matching more accurate; or the mobile terminal may receive a photographing operation from the photographer and take the picture directly, so that the photographing operation is not limited by the image matching result. This meets the requirements of the photographed person while simplifying the photographing process and improving the user experience.
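The overall decision flow summarised above (steps 409 through 417) reduces to a small dispatcher; the return labels are illustrative, not patent terminology:

```python
def capture_decision(match_degree, threshold=0.8, user_pressed=False):
    """High-level sketch of the flow: decide what the terminal does for
    one live frame given the current image matching degree."""
    if match_degree >= threshold:
        return "auto_capture"      # step 410: generate a photo automatically
    if user_pressed:
        return "manual_capture"    # steps 414-415: honour the user's shutter press
    return "prompt_or_rescale"     # steps 411-413 (prompt) or 416-417 (rescale)
```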
Referring to fig. 7, an embodiment of the present invention provides a mobile terminal 700, including a display screen and an image acquisition module, wherein the mobile terminal 700 further includes:
a first obtaining module 701, configured to obtain a reference image and a real-time image acquired by the image acquisition module;
an identification module 702 for identifying one or more reference objects from the reference image and one or more target objects from the real-time image;
a first determining module 703 for determining a first image comprising at least one target object and a second image comprising a reference object corresponding to the target object;
a calculating module 704, configured to calculate an image matching degree of the first image and the second image;
a first generating module 705, configured to generate a photo according to the real-time image when an image matching degree of the first image and the second image is greater than or equal to a preset matching degree threshold.
With continuing reference to fig. 7, in the embodiment of the present invention, optionally, the first obtaining module 701 includes:
a first receiving unit 7011, configured to receive a first operation;
a selecting unit 7012, configured to select, in response to the first operation, an image from an image library of the mobile terminal as a reference image;
and/or, a second receiving unit 7013, configured to receive the second operation;
a second determining unit 7014 is configured to determine, in response to the second operation, the preview image displayed in the display screen as a reference image.
With continued reference to fig. 7, in an embodiment of the present invention, optionally, the display screen includes: a first display area and a second display area, the mobile terminal 700 further comprising:
a first display module 706, configured to display the reference image in the first display area;
a second display module 707, configured to display a second image of at least one reference object in the second display area according to the reference image displayed in the first display area;
a third display module 708 configured to display the first image of the at least one target object in the real-time image in the second display area while displaying the second image.
With continuing reference to fig. 7, in an embodiment of the present invention, optionally, the first determining module 703 includes:
a first obtaining unit 7031 configured to obtain an object type of each target object and each reference object;
a second obtaining unit 7032, configured to determine display information of the first image of each target object and the second image of each reference object in the second display area;
a first determining unit 7033 is configured to determine, according to the object type and the display information, a first image including at least one target object and a second image including a reference object corresponding to the target object.
With continuing reference to fig. 7, in the embodiment of the present invention, optionally, the mobile terminal 700 further includes:
a second determining module 709, configured to determine a mismatch area and/or a matching area of the first image of the target object and the corresponding second image when an image matching degree of the first image of the target object and the corresponding second image is smaller than a preset matching degree threshold;
a third generating module 710, configured to generate prompt information of the unmatched area and/or the matched area according to the unmatched area and/or the matched area;
and the output module 711 is configured to output the prompt message.
With continuing reference to fig. 7, in the embodiment of the present invention, optionally, the mobile terminal 700 further includes:
the receiving module 712 is configured to receive a photographing operation when an image matching degree of a first image of a target object and a corresponding second image is smaller than a preset matching degree threshold;
a fourth generating module 713, configured to generate a photo according to the real-time image in response to the photographing operation.
With continuing reference to fig. 7, in the embodiment of the present invention, optionally, the mobile terminal 700 further includes:
the scaling module 714 is configured to, when the image matching degree between the first image of the target object and the corresponding second image is smaller than a preset matching degree threshold, scale down or enlarge the second image of the reference object to obtain a third image;
a second generating module 715, configured to generate a photo according to the real-time image when the image matching degree between the first image of the target object and the corresponding third image is greater than or equal to the preset matching degree threshold.
Therefore, according to the image matching degree between the reference object in the reference image and the target object in the real-time image, the photographer can be guided to correctly adjust the shooting angle of the image acquisition module or the target object (for example, guiding the target object to adjust its posture or position), so that the photographer can take the expected photo according to the reference image, improving photographing efficiency. Furthermore, the photo can be taken automatically according to the image matching degree between the reference object and the target object, avoiding subjective human judgment, simplifying the photographing process, and improving the user experience.
Fig. 8 is a schematic hardware structure diagram of a mobile terminal for implementing various embodiments of the present invention, and as shown in the figure, the mobile terminal 800 includes but is not limited to: a radio frequency unit 801, a network module 802, an audio output unit 803, an input unit 804, a sensor 805, a display unit 806, a user input unit 807, an interface unit 808, a memory 809, a processor 810, and a power supply 811. Those skilled in the art will appreciate that the mobile terminal architecture illustrated in fig. 8 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than those illustrated, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
Preferably, a computer program is stored on the memory 809 and executable on the processor 810; when executed by the processor 810, the computer program performs the following steps: acquiring a reference image and a real-time image acquired by the image acquisition module; identifying one or more reference objects from the reference image and one or more target objects from the real-time image; acquiring a first image of at least one target object and a second image of a reference object corresponding to the at least one target object; calculating the image matching degree of the first image and the second image; and when the image matching degree of the first image of the at least one target object and the second image of the corresponding reference object is greater than or equal to a preset matching degree threshold, generating a photo according to the real-time image.
Therefore, the mobile terminal obtains the reference image and the real-time image, identifies one or more reference objects from the reference image, identifies one or more target objects from the real-time image, calculates the image matching degree of the first image and the second image according to the first image of at least one target object and the second image of the reference object corresponding to the target object, and automatically takes a picture when the image matching degree is greater than or equal to a preset matching degree threshold value, so that the difficulty of taking pictures for other people is reduced, and the use experience of a user is improved.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 801 may be used for receiving and sending signals during a message sending and receiving process or a call process; specifically, it receives downlink data from a base station and forwards it to the processor 810 for processing, and transmits uplink data to the base station. In general, the radio frequency unit 801 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. Further, the radio frequency unit 801 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 802, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 803 may convert audio data received by the radio frequency unit 801 or the network module 802 or stored in the memory 809 into an audio signal and output as sound. Also, the audio output unit 803 may also provide audio output related to a specific function performed by the mobile terminal 800 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 803 includes a speaker, a buzzer, a receiver, and the like.
The input unit 804 is used for receiving an audio or video signal. The input Unit 804 may include a Graphics Processing Unit (GPU) 8041 and a microphone 8042; the graphics processor 8041 processes image data of a still picture or video obtained by an image capturing device (such as a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 806. The image frames processed by the graphics processor 8041 may be stored in the memory 809 (or other storage medium) or transmitted via the radio frequency unit 801 or the network module 802. The microphone 8042 can receive sound and process it into audio data. In the case of a phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station and output via the radio frequency unit 801.
The mobile terminal 800 also includes at least one sensor 805, such as light sensors, motion sensors, and other sensors. Specifically, the light sensor includes an ambient light sensor that can adjust the brightness of the display panel 8061 according to the brightness of ambient light, and a proximity sensor that can turn off the display panel 8061 and/or the backlight when the mobile terminal 800 moves to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 805 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
The display unit 806 is used to display information input by the user or information provided to the user. The Display unit 806 may include a Display panel 8061, and the Display panel 8061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED), or the like.
The user input unit 807 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 807 includes a touch panel 8071 and other input devices 8072. The touch panel 8071, also referred to as a touch screen, may collect touch operations by a user on or near the touch panel 8071 (e.g., operations by a user on or near the touch panel 8071 using a finger, a stylus, or any other suitable object or accessory). The touch panel 8071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the position of a user's touch, detects the signal produced by the touch operation, and transmits the signal to the touch controller; the touch controller receives touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 810, and receives and executes commands from the processor 810. In addition, the touch panel 8071 can be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 8071, the user input unit 807 can include other input devices 8072. In particular, other input devices 8072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 8071 can be overlaid on the display panel 8061, and when the touch panel 8071 detects a touch operation on or near the touch panel 8071, the touch operation is transmitted to the processor 810 to determine the type of the touch event, and then the processor 810 provides a corresponding visual output on the display panel 8061 according to the type of the touch event. Although in fig. 8, the touch panel 8071 and the display panel 8061 are two independent components to implement the input and output functions of the mobile terminal, in some embodiments, the touch panel 8071 and the display panel 8061 may be integrated to implement the input and output functions of the mobile terminal, which is not limited herein.
The interface unit 808 is an interface through which an external device is connected to the mobile terminal 800. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 808 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 800 or may be used to transmit data between the mobile terminal 800 and external devices.
The memory 809 may be used to store software programs as well as various data. The memory 809 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the cellular phone, and the like. Further, the memory 809 can include high speed random access memory, and can also include non-volatile memory, such as at least one magnetic disk storage device, flash memory device, or other volatile solid state storage device.
The processor 810 is the control center of the mobile terminal; it connects the various parts of the entire mobile terminal using various interfaces and lines, and performs the various functions of the mobile terminal and processes data by running or executing software programs and/or modules stored in the memory 809 and calling data stored in the memory 809, thereby monitoring the mobile terminal as a whole. The processor 810 may include one or more processing units; preferably, the processor 810 may integrate an application processor, which mainly handles the operating system, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into the processor 810.
The mobile terminal 800 may also include a power supply 811 (e.g., a battery) for powering the various components, and the power supply 811 may be logically coupled to the processor 810 via a power management system that may be used to manage charging, discharging, and power consumption.
In addition, the mobile terminal 800 includes some functional modules that are not shown, and thus, are not described in detail herein.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium; when executed by a processor, the computer program implements each process of the above photographing method embodiment and can achieve the same technical effect, which is not repeated here to avoid repetition. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk, or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a …" does not exclude the presence of other like elements in a process, method, article, or apparatus that comprises the element.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (6)

1. A photographing method is applied to a mobile terminal, and the mobile terminal comprises: display screen and image acquisition module, the display screen includes: a first display area and a second display area, the method comprising:
acquiring a reference image and a real-time image acquired by the image acquisition module;
identifying one or more reference objects from the reference image and one or more target objects from the real-time image;
displaying the reference image in the first display area;
displaying a second image of at least one reference object in the second display area according to the reference image displayed in the first display area;
displaying a first image of at least one target object in the real-time image in the second display area while displaying the second image;
acquiring the object type of each target object and each reference object;
acquiring display information of a first image including each target object and a second image including each reference object in the second display area, wherein the display information includes attribute information or a display position of the first image of the target object or the second image of the reference object;
determining a first image including at least one target object and a second image including a reference object corresponding to the at least one target object according to the object type and the display information;
calculating the image matching degree of the first image and the second image;
and when the image matching degree of the first image and the second image is greater than or equal to a preset matching degree threshold value, generating a photo according to the real-time image, wherein the preset matching degree threshold value corresponds to the object type.
2. The method according to claim 1, wherein when the image matching degree of the first image and the second image is less than a preset matching degree threshold, the method further comprises:
determining a non-matching region and/or a matching region of the first image and the second image;
generating prompt information of the unmatched area and/or the matched area according to the unmatched area and/or the matched area;
and outputting the prompt information.
3. The method according to claim 1, wherein when the image matching degree of the first image and the second image is less than a preset matching degree threshold, the method further comprises:
reducing or amplifying the second image according to a preset proportion to obtain a third image;
and when the image matching degree of the first image and the third image is greater than or equal to a preset matching degree threshold value, generating a photo according to the real-time image.
4. A mobile terminal, comprising: display screen and image acquisition module, the display screen includes: a first display area and a second display area, wherein the mobile terminal further comprises:
the first acquisition module is used for acquiring a reference image and a real-time image acquired by the image acquisition module;
an identification module for identifying one or more reference objects from the reference image and one or more target objects from the real-time image;
a first display module, configured to display the reference image in the first display area;
a second display module, configured to display a second image of at least one reference object in the second display area according to the reference image displayed in the first display area;
a third display module, configured to display the first image of the at least one target object in the real-time image in the second display area while displaying the second image;
a first determination module for determining a first image comprising at least one target object and a second image comprising a reference object corresponding to the at least one target object;
the first determining module includes:
a first acquisition unit configured to acquire an object type of each target object and each reference object;
a second acquisition unit configured to acquire display information of the first image including each target object and the second image including each reference object in the second display region, the display information including attribute information or a display position of the first image of the target object or the second image of the reference object;
a first determination unit configured to determine a first image including at least one target object and a second image including a reference object corresponding to the at least one target object, according to the object type and the display information;
the calculating module is used for calculating the image matching degree of the first image and the second image;
and the first generation module is used for generating a photo according to the real-time image when the image matching degree of the first image and the second image is greater than or equal to a preset matching degree threshold value, wherein the preset matching degree threshold value corresponds to the type of the object.
5. The mobile terminal of claim 4, wherein the mobile terminal further comprises:
the scaling module is used for reducing or amplifying the second image of the reference object according to a preset proportion to obtain a third image when the image matching degree of the first image of the target object and the corresponding second image is smaller than a preset matching degree threshold value;
and the second generation module is used for generating a photo according to the real-time image when the image matching degree of the first image of the target object and the corresponding third image is greater than or equal to a preset matching degree threshold value.
6. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on said memory and executable on said processor, said computer program, when executed by said processor, implementing the steps of the method of taking a picture as claimed in any one of claims 1 to 3.
CN201810697279.1A 2018-06-29 2018-06-29 Photographing method and mobile terminal Active CN108881544B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201810697279.1A CN108881544B (en) 2018-06-29 2018-06-29 Photographing method and mobile terminal

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201810697279.1A CN108881544B (en) 2018-06-29 2018-06-29 Photographing method and mobile terminal

Publications (2)

Publication Number Publication Date
CN108881544A CN108881544A (en) 2018-11-23
CN108881544B true CN108881544B (en) 2020-08-11

Family

ID=64297148

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201810697279.1A Active CN108881544B (en) 2018-06-29 2018-06-29 Photographing method and mobile terminal

Country Status (1)

Country Link
CN (1) CN108881544B (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109361794B (en) * 2018-11-19 2021-04-20 Oppo广东移动通信有限公司 Zoom control method and device of mobile terminal, storage medium and mobile terminal
CN109743504B (en) * 2019-01-22 2022-02-22 努比亚技术有限公司 Auxiliary photographing method, mobile terminal and storage medium
CN110933301A (en) * 2019-11-22 2020-03-27 广州色咖信息科技有限公司 Photographing method and device, photographing light control method and device, and photographing light distribution equipment
CN111064888A (en) * 2019-12-25 2020-04-24 维沃移动通信有限公司 Prompting method and electronic equipment
CN111327833B (en) * 2020-03-31 2021-06-01 厦门美图之家科技有限公司 Auxiliary shooting method and device, electronic equipment and readable storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012043447A (en) * 2001-11-05 2012-03-01 Nant Holdings Ip Llc Image capture and identification system and method
CN105554373A (en) * 2015-11-20 2016-05-04 宇龙计算机通信科技(深圳)有限公司 Photographing processing method and device and terminal
CN105847771A (en) * 2015-01-16 2016-08-10 联想(北京)有限公司 Image processing method and electronic device
WO2018084907A1 (en) * 2016-11-02 2018-05-11 Raytheon Company Flux rate unit cell focal plane array
CN108093171A (en) * 2017-11-30 2018-05-29 努比亚技术有限公司 A kind of photographic method, terminal and computer readable storage medium

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6739369B2 (en) * 2017-01-30 2020-08-12 アイホン株式会社 How to adjust the camera focus in the intercom device with camera
JP6800797B2 (en) * 2017-03-31 2020-12-16 キヤノン株式会社 Image pickup device, image processing device, control method and program of image pickup device
CN107920213A (en) * 2017-11-20 2018-04-17 深圳市堇茹互动娱乐有限公司 Image synthesizing method, terminal and computer-readable recording medium
CN108377332A (en) * 2018-02-27 2018-08-07 上海摩象网络科技有限公司 A kind of method and system using expression control smart machine shooting
CN108846377A (en) * 2018-06-29 2018-11-20 百度在线网络技术(北京)有限公司 Method and apparatus for shooting image

Also Published As

Publication number Publication date
CN108881544A (en) 2018-11-23

Similar Documents

Publication Publication Date Title
CN109361865B (en) Shooting method and terminal
CN110740259B (en) Video processing method and electronic equipment
CN108881544B (en) Photographing method and mobile terminal
CN109600550B (en) Shooting prompting method and terminal equipment
CN108989672B (en) Shooting method and mobile terminal
CN110365907B (en) Photographing method and device and electronic equipment
CN109474786B (en) Preview image generation method and terminal
CN108683850B (en) Shooting prompting method and mobile terminal
CN109005336B (en) Image shooting method and terminal equipment
CN110113528B (en) Parameter obtaining method and terminal equipment
CN107730460B (en) Image processing method and mobile terminal
CN107749046B (en) Image processing method and mobile terminal
CN108924412B (en) Shooting method and terminal equipment
CN110602389B (en) Display method and electronic equipment
CN109819168B (en) Camera starting method and mobile terminal
CN109241832B (en) Face living body detection method and terminal equipment
CN109618218B (en) Video processing method and mobile terminal
CN109448069B (en) Template generation method and mobile terminal
CN108984143B (en) Display control method and terminal equipment
CN109246351B (en) Composition method and terminal equipment
CN108881721B (en) Display method and terminal
CN108174110B (en) Photographing method and flexible screen terminal
CN111182211B (en) Shooting method, image processing method and electronic equipment
CN110908517B (en) Image editing method, image editing device, electronic equipment and medium
CN109472825B (en) Object searching method and terminal equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant