CN110740265B - Image processing method and terminal equipment - Google Patents


Info

Publication number: CN110740265B
Authority: CN (China)
Prior art keywords: touch, point, touch point, time, determining
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed)
Application number: CN201911056855.5A
Other languages: Chinese (zh)
Other versions: CN110740265A
Inventor: 林广超
Current assignee: Vivo Mobile Communication Co Ltd
Original assignee: Vivo Mobile Communication Co Ltd
Application filed by Vivo Mobile Communication Co Ltd
Priority to CN201911056855.5A
Publication of CN110740265A
Application granted; publication of CN110740265B

Classifications

    • H — ELECTRICITY
        • H04 — ELECTRIC COMMUNICATION TECHNIQUE
            • H04N — PICTORIAL COMMUNICATION, e.g. TELEVISION
                • H04N 23/00 — Cameras or camera modules comprising electronic image sensors; control thereof
                    • H04N 23/60 — Control of cameras or camera modules
                        • H04N 23/62 — Control of parameters via user interfaces
                        • H04N 23/67 — Focus control based on electronic image sensor signals
                    • H04N 23/70 — Circuitry for compensating brightness variation in the scene
                        • H04N 23/73 — Circuitry for compensating brightness variation in the scene by influencing the exposure time

Abstract

Embodiments of the present invention provide an image processing method and a terminal device. The method includes: receiving a multi-point touch operation performed by a user on a shooting interface; when the touch parameter of each touch point of the operation meets a preset condition, determining a touch area according to the position of each touch point, where the touch parameter includes at least one of touch time and touch position; and performing automatic exposure and automatic focusing according to the touch area. Focusing and exposure can thus be applied to the area the user requires, improving the user experience.

Description

Image processing method and terminal equipment
Technical Field
The present invention relates to the field of terminal devices, and in particular to an image processing method and a mobile terminal.
Background
Terminal devices are increasingly used for taking photographs. During shooting, the subject the user wants to capture may not be focused correctly, or the scene may be too bright or too dark; a typical terminal device lets the user change the focus area and adjust brightness by touching the screen. However, in the prior art, focusing on a touched area is only possible through single-point touch, so when a larger area needs to be in focus, focusing cannot be achieved there, and the focus position and brightness adjustment fail to meet the user's expectations.
Disclosure of Invention
An embodiment of the present invention provides an image processing method and a terminal device, to solve the problem of poor focusing and exposure performance in the prior art.
In order to solve this technical problem, the present invention is implemented as follows:
In a first aspect, an embodiment of the present invention provides an image processing method applied to a terminal device. The method includes: receiving a multi-point touch operation performed by a user on a shooting interface; when the touch parameter of each touch point of the multi-point touch operation meets a preset condition, determining a touch area according to the position of each touch point, where the touch parameter includes at least one of touch time and touch position; and performing automatic exposure and automatic focusing according to the touch area.
In a second aspect, an embodiment of the present invention provides a terminal device, including: a first receiving module, configured to receive a multi-point touch operation performed by a user on a shooting interface; a first determining module, configured to determine a touch area according to the position of each touch point when the touch parameter of each touch point of the multi-point touch operation meets a preset condition, where the touch parameter includes at least one of touch time and touch position; and a first processing module, configured to perform automatic exposure and automatic focusing according to the touch area.
In a third aspect, an embodiment of the present invention provides a mobile terminal, including a processor, a memory, and a computer program stored in the memory and executable on the processor; when executed by the processor, the computer program implements the steps of the image processing method.
In a fourth aspect, an embodiment of the present invention provides a computer-readable storage medium storing a computer program which, when executed by a processor, implements the steps of the image processing method.
In the embodiments of the present invention, a multi-point touch operation performed by a user on a shooting interface is received; when the touch parameter of each touch point of the operation meets a preset condition, a touch area is determined according to the position of each touch point, the touch parameter including at least one of touch time and touch position; and automatic exposure and automatic focusing are performed according to the touch area. Focusing and exposure can thus be applied to the area the user requires, improving the user experience.
Drawings
To describe the technical solutions of the embodiments of the present invention more clearly, the drawings required for the description of the embodiments are briefly introduced below. Obviously, the drawings described below illustrate only some embodiments of the present invention, and a person skilled in the art can derive other drawings from them without inventive effort.
Fig. 1 is a flowchart of the steps of an image processing method according to a first embodiment of the present invention;
Fig. 2 is a flowchart of the steps of an image processing method according to a second embodiment of the present invention;
Fig. 3 is a block diagram of a terminal device according to a third embodiment of the present invention;
Fig. 4 is a block diagram of a terminal device according to a fourth embodiment of the present invention;
Fig. 5 is a schematic diagram of the hardware structure of a mobile terminal according to a fifth embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are some, not all, embodiments of the present invention. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
It should be appreciated that reference throughout this specification to "one embodiment" or "an embodiment" means that a particular feature, structure or characteristic described in connection with the embodiment is included in at least one embodiment of the present invention. Thus, the appearances of the phrases "in one embodiment" or "in an embodiment" in various places throughout this specification are not necessarily all referring to the same embodiment. Furthermore, the particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
In various embodiments of the present invention, it should be understood that the sequence numbers of the following processes do not mean the execution sequence, and the execution sequence of each process should be determined by its function and inherent logic, and should not constitute any limitation to the implementation process of the embodiments of the present invention.
Example one
Referring to fig. 1, a flowchart illustrating steps of an image processing method according to a first embodiment of the present invention is shown.
The image processing method provided by the embodiment of the invention comprises the following steps:
step 101: and receiving multi-point touch operation of a user on the shooting interface.
After the automatic exposure and the automatic focusing are finished in the shooting interface, whether the shooting interface has touch operation at multiple positions at the same time is detected.
The user can perform two-point touch operation, three-point touch operation and four-point touch operation on the shooting interface.
It should be noted that the touch operation may be a click operation, a double-click operation, and the like, which is not limited in this embodiment of the present invention.
Step 102: and under the condition that the touch parameters of all touch points corresponding to the multi-point touch operation meet preset conditions, determining a touch area according to the positions of all touch points.
The touch parameter includes at least one of a touch time and a touch position.
Determine the time difference between the first and the last touch point and check whether it is smaller than a preset time difference. If so, examine the coordinates of the touch points: for every pair of touch points, compute a first difference of the horizontal coordinates and a second difference of the vertical coordinates, and check whether the first difference is greater than a first preset difference and the second difference is greater than a second preset difference. If both conditions hold, determine the touch area from the touch points.
For example, when there are two touch points, determine the touch time difference between the first and the second touch point. If this difference is smaller than the preset time difference, the two points are considered to have been touched simultaneously; if it is greater than or equal to the preset time difference, the first or the second touch point is treated as an accidental touch.
It should be noted that a person skilled in the art may set the preset time difference according to the actual situation, for example 0.1 s, 0.2 s, or 0.3 s; the embodiments of the present invention do not limit this.
When the touch time difference is smaller than the preset time difference, let the coordinates of the first touch point be (Xa, Ya) and those of the second touch point be (Xb, Yb). Check whether |Xa − Xb| > Δx, where Δx is the first preset difference and |Xa − Xb| is the difference of the horizontal coordinates of the two points, and whether |Ya − Yb| > Δy, where Δy is the second preset difference and |Ya − Yb| is the difference of their vertical coordinates. When |Xa − Xb| > Δx and |Ya − Yb| > Δy, determine the touch area from the first and second touch points.
A person skilled in the art may set the first and second preset differences according to the actual situation, for example 0.3 cm, 0.4 cm, or 0.5 cm; the embodiments of the present invention do not limit this.
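As an illustration only (not the claimed implementation), the two checks above — a shared time window for all touches plus a minimum pairwise separation — can be sketched in Python; the constant values and the function name are hypothetical stand-ins for the preset differences:

```python
from itertools import combinations

# Hypothetical preset values; the patent leaves them to the implementer.
PRESET_TIME_DIFF = 0.2   # seconds
DELTA_X = 0.4            # cm, first preset difference
DELTA_Y = 0.4            # cm, second preset difference

def is_valid_multi_touch(points):
    """points: list of (t, x, y) tuples ordered by touch time.

    Returns True when the first and last touches fall within the preset
    time window and every pair of points is separated by more than
    DELTA_X horizontally and DELTA_Y vertically.
    """
    if len(points) < 2:
        return False
    # Time window check: first vs. last touch point.
    if points[-1][0] - points[0][0] >= PRESET_TIME_DIFF:
        return False
    # Pairwise separation check on both axes.
    for (_, xa, ya), (_, xb, yb) in combinations(points, 2):
        if abs(xa - xb) <= DELTA_X or abs(ya - yb) <= DELTA_Y:
            return False
    return True
```

For instance, two touches 0.1 s apart at well-separated positions would pass, while touches 0.5 s apart would be rejected as not simultaneous.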
Step 103: perform automatic exposure and automatic focusing according to the touch area.
Automatic exposure and automatic focusing are performed on the image or portrait within the touch area.
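One natural way to derive the processing region — sketched here as an assumption, since the patent does not prescribe the exact shape — is the bounding rectangle of the validated touch points, which would then serve as the auto-exposure/auto-focus window:

```python
def touch_region(points):
    """Return the bounding rectangle (left, top, right, bottom) spanned
    by the validated touch points (x, y); this rectangle would then be
    handed to the camera pipeline as the exposure/focus window."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (min(xs), min(ys), max(xs), max(ys))
```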
In this embodiment of the present invention, a multi-point touch operation performed by a user on a shooting interface is received; when the touch parameter of each touch point of the operation meets a preset condition, a touch area is determined according to the position of each touch point, the touch parameter including at least one of touch time and touch position; and automatic exposure and automatic focusing are performed according to the touch area. Focusing and exposure can thus be applied to the area the user requires, improving the user experience.
Example two
Referring to fig. 2, a flowchart illustrating steps of an image processing method according to a second embodiment of the present invention is shown.
The image processing method provided by the embodiment of the invention comprises the following steps:
step 201: and receiving multi-point touch operation of a user on the shooting interface.
The user can perform two-point touch operation, three-point touch operation and four-point touch operation on the shooting interface.
It should be noted that the touch operation may be a click operation, a double-click operation, and the like, which is not limited in this embodiment of the present invention.
Step 202: when the number of the touch points of the multi-point touch operation is N, determining first touch time and second touch time of a first touch point and an Nth touch point.
Wherein N is a positive integer greater than 1.
Determine the touch time of each touch point, and from these the first touch time of the first touch point and the second touch time of the N-th touch point. The touch times are used to determine whether an accidental touch point exists.
When the time difference between the first touch time and the second touch time is greater than or equal to the preset time difference, determine the third touch time of the fourth touch point and the fourth touch time of the second touch point. If the difference between the first and third touch times, or between the fourth and second touch times, is greater than the preset time difference, treat the operation as having four valid touch points. Then determine the touch coordinates of the first, second, third, and fourth touch points, and check whether every pair of touch coordinates simultaneously satisfies: the distance in the abscissa direction is greater than a first preset distance, and the distance in the ordinate direction is greater than a second preset distance. If so, generate a first touch area from the first, second, third, and fourth touch points.
In other words, when the time difference between the first and second touch times is greater than or equal to the preset time difference, one of the N touch points may be an accidental touch, and which one it is must be determined. When there are four valid touch points, the accidental point is identified from the coordinate positions of the four points: after determining the touch coordinates of the first, second, third, and fourth touch points, if every pair of touch coordinates simultaneously satisfies that the difference in the abscissa direction is greater than the first preset distance and the difference in the ordinate direction is greater than the second preset distance, the N-th touch point is judged to be the accidental touch, and a quadrilateral touch area is generated from the first, second, third, and fourth touch points.
Conversely, after determining the touch coordinates of the first, second, third, and fourth touch points, if for some pair the difference in the abscissa direction is greater than the first preset distance but the difference in the ordinate direction is smaller than the second preset distance, the N-th touch point is not the accidental touch. In that case, determine the touch coordinates of the second, third, fourth, and N-th touch points; if every pair of these coordinates simultaneously satisfies that the abscissa difference is greater than the first preset distance and the ordinate difference is greater than the second preset distance, generate a second touch area from the second, third, fourth, and N-th touch points.
If both groups of touch points pass the check, either group may be selected to generate the touch area. Preferably, the areas of the two candidate touch areas are compared and the larger is taken as the final result.
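The preferred tie-breaking rule — compare the two candidate areas and keep the larger — might look like the following sketch. Using bounding-rectangle areas is a simplifying assumption here; the patent describes quadrilateral touch areas but leaves the area computation open:

```python
def rect_area(points):
    """Area of the axis-aligned bounding rectangle of a point group."""
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    return (max(xs) - min(xs)) * (max(ys) - min(ys))

def pick_touch_area(group_a, group_b):
    """When both four-point groups pass the spacing check, keep the
    group spanning the larger region, per the preferred rule above."""
    return group_a if rect_area(group_a) >= rect_area(group_b) else group_b
```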
Step 203: when the time difference between the first touch time and the second touch time is smaller than the preset time difference, determine the touch coordinates of every pair of touch points.
For example, when there are two touch points, determine the touch time difference between the first and the second touch point. If this difference is smaller than the preset time difference, the two points are considered to have been touched simultaneously; if it is greater than or equal to the preset time difference, the first or the second touch point is treated as an accidental touch.
It should be noted that a person skilled in the art may set the preset time difference according to the actual situation, for example 0.1 s, 0.2 s, or 0.3 s; the embodiments of the present invention do not limit this.
Step 204: if the coordinates of every pair of touch points satisfy that the difference in the abscissa direction is greater than the first preset distance and the difference in the ordinate direction is greater than the second preset distance, determine the touch area from the touch points.
When the touch time difference is smaller than the preset time difference, let the coordinates of the first touch point be (Xa, Ya) and those of the second touch point be (Xb, Yb). Check whether |Xa − Xb| > Δx, where Δx is the first preset difference and |Xa − Xb| is the difference of the horizontal coordinates of the two points, and whether |Ya − Yb| > Δy, where Δy is the second preset difference and |Ya − Yb| is the difference of their vertical coordinates. When |Xa − Xb| > Δx and |Ya − Yb| > Δy, determine the touch area from the first and second touch points.
A person skilled in the art may set the first and second preset differences according to the actual situation, for example 0.3 cm, 0.4 cm, or 0.5 cm; the embodiments of the present invention do not limit this.
Step 205: when a face image exists in the touch area, determine whether the touch area is at an edge of the shooting interface.
Optionally, when a face image exists in the touch area, automatic exposure and automatic focusing may be applied to the face image directly.
Step 206: when the touch area is not at an edge of the shooting interface, perform automatic exposure and automatic focusing on the face image.
Optionally, when a face image exists in the touch area, before automatic exposure and focusing are applied to it, the device determines whether the touch area is at an edge of the shooting interface, and decides accordingly whether to process the face image. For example, when a passerby's face appears in the frame, it is usually located at an edge of the shooting interface, since the passerby is not the shooting subject. Whether the face in the touch area belongs to a passerby can therefore be judged from whether the touch area is at the edge: if it is not, the face is taken to be a non-passerby, and automatic exposure and focusing are applied to it. Optionally, when a face image exists in the touch area and the touch area is at an edge of the shooting interface, the area ratio of the face image within the touch area is evaluated: if the ratio is greater than a preset threshold, automatic exposure and focusing are applied to the face image; if it is smaller than or equal to the threshold, automatic exposure and focusing are applied to the image area of the shooting interface outside the touch area.
Whether a face image exists in the touch area is identified by image recognition. If the face image is at an edge of the shooting interface, the area ratio of the face image within the touch area must be determined. If this ratio is smaller than or equal to Δs, the face image need not be automatically exposed and focused, and automatic exposure and focusing are applied to the image area of the shooting interface outside the touch area. When the ratio is greater than Δs, automatic exposure and focusing are applied to the face image.
In some shooting scenes, the subject may be placed at the edge of the frame for reasons of composition, but in that case its image is usually not very small. Therefore, when the touch area lies at the edge of the frame, the area ratio of the face image within the touch area decides whether automatic exposure and focusing are applied to it. This effectively prevents passersby entering the frame from being treated as the subject, while avoiding mistaken rejection of an intentionally edge-composed subject.
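Steps 205–206 and the edge-case rule above can be sketched as follows; the edge margin, the threshold Δs, and the function name are hypothetical values and labels, since the patent leaves the concrete numbers to the implementer:

```python
EDGE_MARGIN = 0.1     # fraction of the frame treated as "edge" (hypothetical)
AREA_THRESHOLD = 0.3  # Δs, preset face-to-region area ratio (hypothetical)

def should_process_face(region, face_area, frame_w, frame_h):
    """region: (left, top, right, bottom) touch area in frame coordinates.

    Returns True when the face in the touch area should receive automatic
    exposure and focusing: either the area sits away from the frame edge,
    or the face fills more than AREA_THRESHOLD of the touch area."""
    left, top, right, bottom = region
    at_edge = (left < frame_w * EDGE_MARGIN
               or top < frame_h * EDGE_MARGIN
               or right > frame_w * (1 - EDGE_MARGIN)
               or bottom > frame_h * (1 - EDGE_MARGIN))
    if not at_edge:
        return True  # centered subject: always process the face
    # At the edge: only a sufficiently large face is treated as the subject.
    touch_area = (right - left) * (bottom - top)
    return face_area / touch_area > AREA_THRESHOLD
```

A large face at the edge (e.g. half the touch area) passes, while a small edge face — likely a passerby — is rejected.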
In this embodiment of the present invention, a multi-point touch operation performed by a user on a shooting interface is received; when the touch parameter of each touch point of the operation meets a preset condition, a touch area is determined according to the position of each touch point, the touch parameter including at least one of touch time and touch position; and automatic exposure and automatic focusing are performed according to the touch area. Focusing and exposure can thus be applied to the area the user requires, improving the user experience.
Having described the image processing method of the embodiments of the present invention, the terminal device of the embodiments is described below with reference to the accompanying drawings.
EXAMPLE III
Referring to fig. 3, a block diagram of a terminal device according to a third embodiment of the present invention is shown.
The terminal device provided by the embodiment of the invention comprises: the first receiving module 301 is configured to receive a multi-touch operation of a user on a shooting interface; a first determining module 302, configured to determine a touch area according to a position of each touch point when a touch parameter of each touch point corresponding to the multi-point touch operation satisfies a preset condition, where the touch parameter includes at least one of touch time and touch position; the first processing module 303 is configured to perform automatic exposure and automatic focusing processing according to the touch area.
In this embodiment of the present invention, a multi-point touch operation performed by a user on a shooting interface is received; when the touch parameter of each touch point of the operation meets a preset condition, a touch area is determined according to the position of each touch point, the touch parameter including at least one of touch time and touch position; and automatic exposure and automatic focusing are performed according to the touch area. Focusing and exposure can thus be applied to the area the user requires, improving the user experience.
Example four
Referring to fig. 4, a block diagram of a terminal device according to a fourth embodiment of the present invention is shown.
The terminal device provided by the embodiment of the invention comprises: a first receiving module 401, configured to receive a multi-touch operation of a user on a shooting interface; a first determining module 402, configured to determine a touch area according to a position of each touch point when a touch parameter of each touch point corresponding to the multi-point touch operation satisfies a preset condition, where the touch parameter includes at least one of touch time and touch position; the first processing module 403 is configured to perform automatic exposure and automatic focusing according to the touch area.
Preferably, the first determining module 402 comprises: the first determining sub-module 4021 is configured to determine a first touch time and a second touch time of a first touch point and an nth touch point when the number of touch points of the multi-point touch operation is N, where N is a positive integer greater than 1; the second determining sub-module 4022 is configured to determine touch coordinates of every two touch points when a time difference between the first touch time and the second touch time is smaller than a preset time difference; a third determining sub-module 4023, configured to, if the coordinates between each two touch points satisfy: and determining a touch area according to each touch point if the numerical difference in the abscissa direction is greater than a first preset distance and the numerical difference in the ordinate direction is greater than a second preset distance.
Preferably, the terminal device further includes: a second determining module 404, configured to determine, when the number of touch points of the multi-point touch operation is five and after the first determining submodule 4021 determines the first touch time and the second touch time of the first touch point and the Nth touch point, a third touch time of the fourth touch point and a fourth touch time of the second touch point when the time difference between the first touch time and the second touch time is greater than or equal to a preset time difference; a third determining module 405, configured to determine that the touch points are four touch points when the time difference between the first touch time and the third touch time is greater than the preset time difference or the difference between the fourth touch time and the second touch time is greater than the preset time difference; a fourth determining module 406, configured to determine touch coordinates of the first touch point, the second touch point, the third touch point, and the fourth touch point; a fifth determining module 407, configured to determine whether every two touch coordinates simultaneously satisfy: the numerical difference in the abscissa direction is greater than a first preset distance and the numerical difference in the ordinate direction is greater than a second preset distance; a sixth determining module 408, configured to generate, if so, a first touch area according to the first touch point, the second touch point, the third touch point, and the fourth touch point; a seventh determining module 409, configured to determine, if not, touch coordinates of the second touch point, the third touch point, the fourth touch point, and the fifth touch point; an eighth determining module 410, configured to generate, when every two of these touch coordinates simultaneously satisfy that the numerical difference in the abscissa direction is greater than the first preset distance and the numerical difference in the ordinate direction is greater than the second preset distance, a second touch area according to the second touch point, the third touch point, the fourth touch point, and the fifth touch point.
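The five-point fallback performed by modules 404-410 can be sketched as follows. This is an illustrative Python sketch, not the patented implementation; the threshold values and all function names are assumptions, and touch points are plain `(x, y, t)` tuples in touch order.

```python
# Illustrative thresholds; the patent leaves concrete values open.
MAX_TIME_DIFF = 0.2          # preset time difference, seconds
MIN_DX, MIN_DY = 50.0, 50.0  # first / second preset distances, pixels

def _spread_ok(pts):
    # Every two points must differ by more than MIN_DX on x AND MIN_DY on y.
    return all(abs(a[0] - b[0]) > MIN_DX and abs(a[1] - b[1]) > MIN_DY
               for i, a in enumerate(pts) for b in pts[i + 1:])

def _bbox(pts):
    xs, ys = [p[0] for p in pts], [p[1] for p in pts]
    return (min(xs), min(ys), max(xs), max(ys))

def five_point_area(points):
    """points: five (x, y, t) tuples in touch order.

    If the first and fifth touches are not near-simultaneous, try to form
    a first touch area from points 1-4, then a second one from points 2-5.
    """
    p1, p2, p3, p4, p5 = points
    if abs(p5[2] - p1[2]) < MAX_TIME_DIFF:
        # All five landed together: handled by the main (N-point) path.
        return _bbox(points) if _spread_ok(points) else None
    if abs(p4[2] - p1[2]) > MAX_TIME_DIFF or abs(p5[2] - p2[2]) > MAX_TIME_DIFF:
        if _spread_ok([p1, p2, p3, p4]):
            return _bbox([p1, p2, p3, p4])   # first touch area
        if _spread_ok([p2, p3, p4, p5]):
            return _bbox([p2, p3, p4, p5])   # second touch area
    return None
```

As in the description, the sketch first tries to form the first touch area from touch points one through four, and only falls back to points two through five when the first four are not sufficiently spread apart.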
Preferably, the terminal device further includes: a ninth determining module 411, configured to determine, after the first determining module 402 determines the touch area according to the position of each touch point when the touch parameter of each touch point corresponding to the multi-point touch operation satisfies the preset condition, whether the touch area is at an edge position of the shooting interface when a face image exists in the touch area; the first processing module 403 is specifically configured to: perform automatic exposure and automatic focusing processing on the face image when the touch area is not at the edge position of the shooting interface.
Preferably, the terminal device further includes: a tenth determining module 412, configured to determine, after the first determining module 402 determines the touch area according to the position of each touch point when the touch parameter of each touch point corresponding to the multi-point touch operation satisfies the preset condition, whether the touch area is at an edge position of the shooting interface when a face image exists in the touch area; a judging module 413, configured to determine the area ratio of the face image in the touch area if the touch area is at the edge position of the shooting interface; a second processing module 414, configured to perform automatic exposure and automatic focusing processing on the face image if the area ratio of the face image in the touch area is greater than a preset threshold; and a third processing module 415, configured to perform automatic exposure and automatic focusing processing on the image area other than the touch area in the shooting interface if the area ratio of the face image in the touch area is less than or equal to the preset threshold.
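The edge-position handling of modules 411-415 amounts to a small decision rule, sketched below with hypothetical numbers: the edge margin, the area-ratio threshold, and every name here are illustrative assumptions, with rectangles given as `(x0, y0, x1, y1)`.

```python
def _rect_area(r):
    x0, y0, x1, y1 = r
    return max(0.0, x1 - x0) * max(0.0, y1 - y0)

def _intersection(a, b):
    return (max(a[0], b[0]), max(a[1], b[1]), min(a[2], b[2]), min(a[3], b[3]))

def choose_metering_target(touch_area, face_box, frame_w, frame_h,
                           edge_margin=40, ratio_threshold=0.5):
    """Return which region should get auto exposure / auto focus.

    "Edge position" is modelled here as the touch area reaching within
    edge_margin pixels of the frame border.
    """
    x0, y0, x1, y1 = touch_area
    at_edge = (x0 <= edge_margin or y0 <= edge_margin or
               x1 >= frame_w - edge_margin or y1 >= frame_h - edge_margin)
    if not at_edge:
        return "face"                      # AE/AF on the face image
    ratio = _rect_area(_intersection(face_box, touch_area)) / _rect_area(touch_area)
    # At the edge: meter the face only if it dominates the touch area.
    return "face" if ratio > ratio_threshold else "outside_touch_area"
```

A face that occupies only a sliver of an edge touch area is treated as incidental, and metering moves to the image outside the touch area, matching the behavior of the third processing module 415.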
The terminal device provided in the embodiment of the present invention can implement each process implemented by the mobile terminal in the method embodiments of fig. 1 and fig. 2, and is not described herein again to avoid repetition.
In the embodiment of the present invention, a multi-point touch operation performed by a user on a shooting interface is received; when the touch parameter of each touch point corresponding to the multi-point touch operation satisfies a preset condition, a touch area is determined according to the position of each touch point, the touch parameter including at least one of a touch time and a touch position; and automatic exposure and automatic focusing processing are performed according to the touch area. In this way, focusing and exposure can be applied to the area the user actually wants, improving the user experience.
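The preset conditions of this embodiment (near-simultaneous touches, pairwise coordinate spread) can be sketched in Python. `MAX_TIME_DIFF`, `MIN_DX`, and `MIN_DY` stand in for the preset time difference and the first and second preset distances; their values, like all names below, are illustrative assumptions rather than the patented implementation.

```python
from dataclasses import dataclass

@dataclass
class TouchPoint:
    x: float  # abscissa, pixels
    y: float  # ordinate, pixels
    t: float  # touch time, seconds

# Illustrative thresholds; the embodiment leaves concrete values open.
MAX_TIME_DIFF = 0.2   # preset time difference
MIN_DX = 50.0         # first preset distance (abscissa)
MIN_DY = 50.0         # second preset distance (ordinate)

def determine_touch_area(points):
    """Return the touch area (bounding rectangle) of the points, or None
    if the touch parameters do not satisfy the preset conditions."""
    if len(points) < 2:
        return None
    # Touch-time condition: first and last (Nth) touch near-simultaneous.
    if abs(points[-1].t - points[0].t) >= MAX_TIME_DIFF:
        return None
    # Touch-position condition: every pair spread apart on both axes.
    for i, a in enumerate(points):
        for b in points[i + 1:]:
            if abs(a.x - b.x) <= MIN_DX or abs(a.y - b.y) <= MIN_DY:
                return None
    xs = [p.x for p in points]
    ys = [p.y for p in points]
    return (min(xs), min(ys), max(xs), max(ys))  # region for AE/AF
```

The returned rectangle would then be handed to the camera pipeline as the auto-exposure and auto-focus region.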
EXAMPLE five
Referring to fig. 5, a schematic diagram of a hardware structure of a mobile terminal according to a fifth embodiment of the present invention is shown.
the mobile terminal 500 includes, but is not limited to: a radio frequency unit 501, a network module 502, an audio output unit 503, an input unit 504, a sensor 505, a display unit 506, a user input unit 507, an interface unit 508, a memory 509, a processor 510, and a power supply 511. Those skilled in the art will appreciate that the mobile terminal architecture shown in fig. 5 is not intended to be limiting of mobile terminals, and that a mobile terminal may include more or fewer components than shown, or some components may be combined, or a different arrangement of components. In the embodiment of the present invention, the mobile terminal includes, but is not limited to, a mobile phone, a tablet computer, a notebook computer, a palm computer, a vehicle-mounted terminal, a wearable device, a pedometer, and the like.
The user input unit 507 is configured to receive a multi-touch operation performed on a shooting interface by a user.
A processor 510, configured to determine a touch area according to a position of each touch point when a touch parameter of each touch point corresponding to the multi-point touch operation satisfies a preset condition, where the touch parameter includes at least one of touch time and a touch position; and carrying out automatic exposure and automatic focusing processing according to the touch area.
In the embodiment of the present invention, a multi-point touch operation performed by a user on a shooting interface is received; when the touch parameter of each touch point corresponding to the multi-point touch operation satisfies a preset condition, a touch area is determined according to the position of each touch point, the touch parameter including at least one of a touch time and a touch position; and automatic exposure and automatic focusing processing are performed according to the touch area. In this way, focusing and exposure can be applied to the area the user actually wants, improving the user experience.
It should be understood that, in the embodiment of the present invention, the radio frequency unit 501 may be used for receiving and sending signals during a message transceiving or call process; specifically, it receives downlink data from a base station and forwards the data to the processor 510 for processing, and it transmits uplink data to the base station. In general, the radio frequency unit 501 includes, but is not limited to, an antenna, at least one amplifier, a transceiver, a coupler, a low noise amplifier, a duplexer, and the like. In addition, the radio frequency unit 501 can also communicate with a network and other devices through a wireless communication system.
The mobile terminal provides the user with wireless broadband internet access through the network module 502, such as helping the user send and receive e-mails, browse webpages, access streaming media, and the like.
The audio output unit 503 may convert audio data received by the radio frequency unit 501 or the network module 502, or stored in the memory 509, into an audio signal and output it as sound. Also, the audio output unit 503 may provide audio output related to a specific function performed by the mobile terminal 500 (e.g., a call signal reception sound, a message reception sound, etc.). The audio output unit 503 includes a speaker, a buzzer, a receiver, and the like.
The input unit 504 is used to receive an audio or video signal. The input unit 504 may include a graphics processing unit (GPU) 5041 and a microphone 5042. The graphics processor 5041 processes image data of a still picture or video obtained by an image capturing device (e.g., a camera) in a video capturing mode or an image capturing mode. The processed image frames may be displayed on the display unit 506. The image frames processed by the graphics processor 5041 may be stored in the memory 509 (or other storage medium) or transmitted via the radio frequency unit 501 or the network module 502. The microphone 5042 may receive sounds and process them into audio data. In the phone call mode, the processed audio data may be converted into a format transmittable to a mobile communication base station via the radio frequency unit 501 and output.
The mobile terminal 500 also includes at least one sensor 505, such as a light sensor, motion sensor, and other sensors. Specifically, the light sensor includes an ambient light sensor that adjusts the brightness of the display panel 5061 according to the brightness of ambient light, and a proximity sensor that turns off the display panel 5061 and/or a backlight when the mobile terminal 500 is moved to the ear. As one of the motion sensors, the accelerometer sensor can detect the magnitude of acceleration in each direction (generally three axes), detect the magnitude and direction of gravity when stationary, and can be used to identify the posture of the mobile terminal (such as horizontal and vertical screen switching, related games, magnetometer posture calibration), and vibration identification related functions (such as pedometer, tapping); the sensors 505 may also include fingerprint sensors, pressure sensors, iris sensors, molecular sensors, gyroscopes, barometers, hygrometers, thermometers, infrared sensors, etc., which are not described in detail herein.
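As a rough illustration of the posture-recognition use mentioned above, a 3-axis accelerometer sample can be classified by comparing the gravity component on each axis. The axis convention and thresholds below are assumptions for the sketch, not taken from the patent.

```python
# Classify device posture from a 3-axis accelerometer sample (m/s^2).
# Assumed convention: y runs along the long screen edge, z out of the screen.
def classify_orientation(ax, ay, az, g=9.81):
    if abs(az) > 0.8 * g:
        return "flat"        # screen facing up or down
    # Gravity dominates whichever in-plane axis points "down".
    return "portrait" if abs(ay) >= abs(ax) else "landscape"
```

A real implementation would also filter out motion-induced acceleration and hysteresis around the switching point; this sketch only shows the basic comparison.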
The display unit 506 is used to display information input by the user or information provided to the user. The display unit 506 may include a display panel 5061, and the display panel 5061 may be configured in the form of a Liquid Crystal Display (LCD), an Organic Light-Emitting Diode (OLED) display, or the like.
The user input unit 507 may be used to receive input numeric or character information and generate key signal inputs related to user settings and function control of the mobile terminal. Specifically, the user input unit 507 includes a touch panel 5071 and other input devices 5072. The touch panel 5071, also referred to as a touch screen, may collect touch operations by a user on or near it (e.g., operations by a user on or near the touch panel 5071 using a finger, a stylus, or any suitable object or attachment). The touch panel 5071 may include two parts: a touch detection device and a touch controller. The touch detection device detects the touch position of the user, detects the signal generated by the touch operation, and transmits the signal to the touch controller; the touch controller receives the touch information from the touch detection device, converts it into touch point coordinates, sends the coordinates to the processor 510, and receives and executes commands sent by the processor 510. In addition, the touch panel 5071 may be implemented in various types such as resistive, capacitive, infrared, and surface acoustic wave. In addition to the touch panel 5071, the user input unit 507 may include other input devices 5072. In particular, the other input devices 5072 may include, but are not limited to, a physical keyboard, function keys (e.g., volume control keys, switch keys, etc.), a trackball, a mouse, and a joystick, which are not described in detail herein.
Further, the touch panel 5071 may be overlaid on the display panel 5061. When the touch panel 5071 detects a touch operation on or near it, the operation is transmitted to the processor 510 to determine the type of the touch event, and the processor 510 then provides a corresponding visual output on the display panel 5061 according to the type of the touch event. Although in fig. 5 the touch panel 5071 and the display panel 5061 are two independent components implementing the input and output functions of the mobile terminal, in some embodiments the touch panel 5071 and the display panel 5061 may be integrated to implement the input and output functions of the mobile terminal; this is not limited herein.
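The detection device → touch controller → processor flow described above can be sketched as a toy pipeline; the sensor-grid model, pitch value, and class names are illustrative assumptions for the sketch only.

```python
# Toy model of the touch pipeline: the detection device reports which
# sensing electrode fired, the controller converts this into touch-point
# coordinates and hands them to a processor-side callback.
class TouchController:
    def __init__(self, pitch_mm, on_touch):
        self.pitch_mm = pitch_mm  # assumed electrode-grid pitch
        self.on_touch = on_touch  # stands in for sending to processor 510

    def receive(self, raw_signal):
        row, col = raw_signal     # electrode indices from the detector
        x = col * self.pitch_mm
        y = row * self.pitch_mm
        self.on_touch((x, y))     # forward touch-point coordinates

events = []
ctrl = TouchController(pitch_mm=0.5, on_touch=events.append)
ctrl.receive((10, 20))            # electrode at row 10, column 20
```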
The interface unit 508 is an interface through which an external device is connected to the mobile terminal 500. For example, the external device may include a wired or wireless headset port, an external power supply (or battery charger) port, a wired or wireless data port, a memory card port, a port for connecting a device having an identification module, an audio input/output (I/O) port, a video I/O port, an earphone port, and the like. The interface unit 508 may be used to receive input (e.g., data information, power, etc.) from external devices and transmit the received input to one or more elements within the mobile terminal 500 or may be used to transmit data between the mobile terminal 500 and external devices.
The memory 509 may be used to store software programs as well as various data. The memory 509 may mainly include a storage program area and a storage data area, wherein the storage program area may store an operating system, an application program required by at least one function (such as a sound playing function, an image playing function, etc.), and the like; the storage data area may store data (such as audio data, a phonebook, etc.) created according to the use of the mobile phone, and the like. Further, the memory 509 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device.
The processor 510 is a control center of the mobile terminal, connects various parts of the entire mobile terminal using various interfaces and lines, and performs various functions of the mobile terminal and processes data by operating or executing software programs and/or modules stored in the memory 509 and calling data stored in the memory 509, thereby performing overall monitoring of the mobile terminal. Processor 510 may include one or more processing units; preferably, the processor 510 may integrate an application processor, which mainly handles operating systems, user interfaces, application programs, etc., and a modem processor, which mainly handles wireless communications. It will be appreciated that the modem processor described above may not be integrated into processor 510.
The mobile terminal 500 may further include a power supply 511 (e.g., a battery) for supplying power to various components, and preferably, the power supply 511 may be logically connected to the processor 510 via a power management system, so that functions of managing charging, discharging, and power consumption are performed via the power management system.
In addition, the mobile terminal 500 includes some functional modules that are not shown, and thus, are not described in detail herein.
Preferably, an embodiment of the present invention further provides a mobile terminal, which includes a processor 510, a memory 509, and a computer program stored in the memory 509 and capable of running on the processor 510, where the computer program, when executed by the processor 510, implements each process of the above-mentioned image processing method embodiment, and can achieve the same technical effect, and in order to avoid repetition, details are not described here again.
The embodiment of the present invention further provides a computer-readable storage medium, where a computer program is stored on the computer-readable storage medium, and when the computer program is executed by a processor, the computer program implements each process of the embodiment of the image processing method, and can achieve the same technical effect, and in order to avoid repetition, details are not repeated here. The computer-readable storage medium may be a Read-Only Memory (ROM), a Random Access Memory (RAM), a magnetic disk or an optical disk.
It should be noted that, in this document, the terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements includes not only those elements but may also include other elements not expressly listed or inherent to such process, method, article, or apparatus. Without further limitation, an element defined by the phrase "comprising a ..." does not exclude the presence of other identical elements in the process, method, article, or apparatus that comprises the element.
Through the above description of the embodiments, those skilled in the art will clearly understand that the method of the above embodiments can be implemented by software plus a necessary general hardware platform, and certainly can also be implemented by hardware, but in many cases, the former is a better implementation manner. Based on such understanding, the technical solutions of the present invention may be embodied in the form of a software product, which is stored in a storage medium (such as ROM/RAM, magnetic disk, optical disk) and includes instructions for enabling a terminal (such as a mobile phone, a computer, a server, an air conditioner, or a network device) to execute the method according to the embodiments of the present invention.
While the present invention has been described with reference to the embodiments shown in the drawings, the present invention is not limited to the embodiments, which are illustrative and not restrictive, and it will be apparent to those skilled in the art that various changes and modifications can be made therein without departing from the spirit and scope of the invention as defined in the appended claims.

Claims (10)

1. An image processing method applied to a terminal device is characterized by comprising the following steps:
receiving multi-point touch operation of a user on a shooting interface;
determining a touch area according to the position of each touch point under the condition that the touch parameter of each touch point corresponding to the multi-point touch operation meets a preset condition, wherein the touch parameter comprises at least one of touch time and touch position;
carrying out automatic exposure and automatic focusing processing according to the touch area;
wherein, the step of determining a touch area according to the position of each touch point when the touch parameter of each touch point corresponding to the multi-point touch operation satisfies a preset condition includes:
when the number of the touch points of the multi-point touch operation is N, determining first touch time and second touch time of a first touch point and an Nth touch point, wherein N is a positive integer greater than 1; the Nth touch point is the last touch point in the multi-point touch operation;
determining touch coordinates of every two touch points under the condition that the time difference between the first touch time and the second touch time is smaller than a preset time difference;
determining a touch area according to each touch point if the coordinates of every two touch points satisfy the following conditions: the numerical difference in the abscissa direction is greater than a first preset distance, and the numerical difference in the ordinate direction is greater than a second preset distance.
2. The method of claim 1, wherein the number of touch points of the multi-touch operation is five, and wherein after the step of determining the first touch time and the second touch time of the first touch point and the nth touch point, the method further comprises:
determining a third touch time of a fourth touch point and a fourth touch time of the second touch point under the condition that the time difference between the first touch time and the second touch time is greater than or equal to a preset time difference;
determining that the touch points are four touch points when the time difference between the first touch time and the third touch time is greater than a preset time difference or the difference between the fourth touch time and the second touch time is greater than a preset time difference;
determining touch coordinates of a first touch point, a second touch point, a third touch point and a fourth touch point;
determining whether every two touch coordinates simultaneously satisfy: the numerical difference in the abscissa direction is greater than a first preset distance and the numerical difference in the ordinate direction is greater than a second preset distance;
if so, generating a first touch area according to the first touch point, the second touch point, the third touch point and the fourth touch point;
if not, determining touch coordinates of the second touch point, the third touch point, the fourth touch point and the fifth touch point;
generating, when every two touch coordinates simultaneously satisfy that the numerical difference in the abscissa direction is greater than the first preset distance and the numerical difference in the ordinate direction is greater than the second preset distance, a second touch area according to the second touch point, the third touch point, the fourth touch point and the fifth touch point;
the first touch point, the second touch point, the third touch point, the fourth touch point and the Nth touch point are obtained according to a touch sequence of a plurality of touch points in the multi-point touch operation.
3. The method according to claim 1, wherein after the step of determining the touch area according to the position of each touch point when the touch parameter of each touch point corresponding to the multi-touch operation satisfies a preset condition, the method further comprises:
determining whether the touch area is at the edge position of the shooting interface or not under the condition that the face image exists in the touch area;
the step of performing automatic exposure and automatic focusing according to the touch area comprises the following steps:
and under the condition that the touch area is not at the edge position of the shooting interface, carrying out automatic exposure and automatic focusing processing on the face image.
4. The method according to claim 1, wherein after the step of determining the touch area according to the position of each touch point when the touch parameter of each touch point corresponding to the multi-touch operation satisfies a preset condition, the method further comprises:
determining whether the touch area is at the edge position of the shooting interface or not under the condition that the face image exists in the touch area;
if the touch area is at the edge position of the shooting interface, judging the area ratio of the face image in the touch area;
if the area ratio of the face image in the touch area is larger than a preset threshold value, carrying out automatic exposure and automatic focusing processing on the face image;
and if the area ratio of the face image in the touch area is smaller than or equal to the preset threshold, performing automatic exposure and automatic focusing processing on image areas except the touch area in the shooting interface.
5. A terminal device, characterized in that the terminal device comprises:
the first receiving module is used for receiving multi-point touch operation of a user on a shooting interface;
a first determining module, configured to determine a touch area according to the position of each touch point when the touch parameter of each touch point corresponding to the multi-point touch operation meets a preset condition, where the touch parameter includes at least one of a touch time and a touch position;
the first processing module is used for carrying out automatic exposure and automatic focusing processing according to the touch area;
wherein the first determining module comprises:
the first determining submodule is used for determining first touch time and second touch time of a first touch point and an Nth touch point when the number of the touch points of the multi-point touch operation is N, wherein N is a positive integer larger than 1; the Nth touch point is the last touch point in the multi-point touch operation;
the second determining submodule is used for determining touch coordinates of every two touch points under the condition that the time difference between the first touch time and the second touch time is smaller than a preset time difference;
a third determining submodule, configured to determine a touch area according to each touch point if the coordinates of every two touch points satisfy the following conditions: the numerical difference in the abscissa direction is greater than a first preset distance, and the numerical difference in the ordinate direction is greater than a second preset distance.
6. The terminal device according to claim 5, wherein the terminal device further comprises:
a second determining module, configured to determine, after the first touch time and the second touch time of the first touch point and the Nth touch point are determined, a third touch time of a fourth touch point and a fourth touch time of the second touch point when the time difference between the first touch time and the second touch time is greater than or equal to a preset time difference;
the third determining module is used for determining that the touch points are four touch points under the condition that the time difference between the first touch time and the third touch time is greater than a preset time difference or the difference value between the fourth touch time and the second touch time is greater than a preset time difference;
the fourth determining module is used for determining touch coordinates of the first touch point, the second touch point, the third touch point and the fourth touch point;
a fifth determining module, configured to determine whether each two touch coordinates satisfy: the numerical difference in the abscissa direction is greater than a first preset distance and the numerical difference in the ordinate direction is greater than a second preset distance;
a sixth determining module, configured to generate, if so, a first touch area according to the first touch point, the second touch point, the third touch point and the fourth touch point;
a seventh determining module, configured to determine, if not, touch coordinates of the second touch point, the third touch point, the fourth touch point and the fifth touch point;
an eighth determining module, configured to generate, when every two touch coordinates simultaneously satisfy that the numerical difference in the abscissa direction is greater than the first preset distance and the numerical difference in the ordinate direction is greater than the second preset distance, a second touch area according to the second touch point, the third touch point, the fourth touch point and the fifth touch point;
the terminal device is further configured to obtain the first touch point, the second touch point, the third touch point, the fourth touch point and the nth touch point according to a touch sequence of a plurality of touch points in the multi-point touch operation.
7. The terminal device according to claim 5, wherein the terminal device further comprises:
a ninth determining module, configured to determine whether the touch area is at an edge position of the shooting interface when the face image exists in the touch area;
the first processing module is specifically configured to:
and under the condition that the touch area is not at the edge position of the shooting interface, carrying out automatic exposure and automatic focusing processing on the face image.
8. The terminal device according to claim 5, wherein the terminal device further comprises:
a tenth determining module, configured to determine, after determining a touch area according to a position of each touch point when the touch parameter of each touch point corresponding to the multi-point touch operation satisfies a preset condition by the first determining module, whether the touch area is at an edge position of the shooting interface when determining that a face image exists in the touch area;
the judging module is used for judging the area ratio of the face image in the touch area if the touch area is at the edge position of the shooting interface;
the second processing module is used for carrying out automatic exposure and automatic focusing processing on the face image if the area ratio of the face image in the touch area is greater than a preset threshold value;
and the third processing module is used for carrying out automatic exposure and automatic focusing processing on image areas except the touch area in the shooting interface if the area ratio of the face image in the touch area is smaller than or equal to the preset threshold.
9. A mobile terminal, characterized in that it comprises a processor, a memory and a computer program stored on the memory and executable on the processor, which computer program, when executed by the processor, implements the steps of the image processing method according to any one of claims 1 to 4.
10. A computer-readable storage medium, on which a computer program is stored which, when being executed by a processor, carries out the steps of the image processing method according to any one of claims 1 to 4.
CN201911056855.5A 2019-10-31 2019-10-31 Image processing method and terminal equipment Active CN110740265B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911056855.5A CN110740265B (en) 2019-10-31 2019-10-31 Image processing method and terminal equipment


Publications (2)

Publication Number Publication Date
CN110740265A CN110740265A (en) 2020-01-31
CN110740265B true CN110740265B (en) 2021-03-12

Family ID=69270615

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911056855.5A Active CN110740265B (en) 2019-10-31 2019-10-31 Image processing method and terminal equipment

Country Status (1)

Country Link
CN (1) CN110740265B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111314621B (en) * 2020-04-15 2022-05-27 维沃移动通信有限公司 Photographing method and electronic equipment
CN111917980B (en) * 2020-07-29 2021-12-28 Oppo(重庆)智能科技有限公司 Photographing control method and device, storage medium and electronic equipment

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105681657A (en) * 2016-01-15 2016-06-15 广东欧珀移动通信有限公司 Shooting focusing method and terminal device
CN105704375A (en) * 2016-01-29 2016-06-22 广东欧珀移动通信有限公司 Image processing method and terminal
CN106855782A (en) * 2016-12-16 2017-06-16 广东欧珀移动通信有限公司 A kind of method for preventing false touch, device and terminal
WO2019003571A1 (en) * 2017-06-28 2019-01-03 シャープ株式会社 Electronic device, method for controlling electronic device, and program
CN110032326A (en) * 2019-03-29 2019-07-19 网易(杭州)网络有限公司 Mobile terminal shows control method, device, equipment and the storage medium of picture

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8237807B2 (en) * 2008-07-24 2012-08-07 Apple Inc. Image capturing device with touch screen for adjusting camera settings
US9313397B2 (en) * 2014-05-30 2016-04-12 Apple Inc. Realtime capture exposure adjust gestures
CN104270560B (en) * 2014-07-31 2018-01-12 三星电子(中国)研发中心 A kind of multi-spot method and apparatus


Also Published As

Publication number Publication date
CN110740265A (en) 2020-01-31

Similar Documents

Publication Publication Date Title
CN108182043B (en) Information display method and mobile terminal
CN108459797B (en) Control method of folding screen and mobile terminal
CN109078319B (en) Game interface display method and terminal
CN110062105B (en) Interface display method and terminal equipment
CN110109593B (en) Screen capturing method and terminal equipment
CN109710349B (en) Screen capturing method and mobile terminal
CN109407948B (en) Interface display method and mobile terminal
CN108900695B (en) Display processing method, terminal equipment and computer readable storage medium
CN110752981B (en) Information control method and electronic equipment
CN109343788B (en) Operation control method of mobile terminal and mobile terminal
CN110096203B (en) Screenshot method and mobile terminal
CN109859718B (en) Screen brightness adjusting method and terminal equipment
CN110740265B (en) Image processing method and terminal equipment
CN110929273A (en) Permission setting method and electronic equipment
CN108093119B (en) Strange incoming call number marking method and mobile terminal
CN110213437B (en) Editing method and mobile terminal
CN109491572B (en) Screen capturing method of mobile terminal and mobile terminal
CN109327605B (en) Display control method and device and mobile terminal
CN111443968A (en) Screenshot method and electronic equipment
CN108234745B (en) Signal receiving method, mobile terminal and computer readable storage medium
CN110888572A (en) Message display method and terminal equipment
CN110769153B (en) Image processing method and electronic equipment
CN111459323B (en) Control method, electronic device and medium
CN110210197B (en) Screen sensitivity adjusting method and mobile terminal
CN110471068B (en) Proximity detection method and mobile terminal

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant