WO2024007654A1 - Camera focusing method and apparatus, electronic device, and computer-readable storage medium - Google Patents

Camera focusing method and apparatus, electronic device, and computer-readable storage medium

Info

Publication number
WO2024007654A1
Authority
WO
WIPO (PCT)
Prior art keywords
pixel ratio
calibration
calibrated
distance
camera
Prior art date
Application number
PCT/CN2023/087491
Other languages
English (en)
French (fr)
Inventor
刘义
聂兵兵
Original Assignee
惠州Tcl移动通信有限公司
Priority date
Filing date
Publication date
Application filed by 惠州Tcl移动通信有限公司
Publication of WO2024007654A1


Classifications

    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/60 Control of cameras or camera modules
    • H04N 23/63 Control of cameras or camera modules by using electronic viewfinders
    • H04N 23/67 Focus control based on electronic image sensor signals

Definitions

  • the present application relates to the field of camera technology, and specifically to a camera focusing method, device, electronic equipment and computer-readable storage medium.
  • the embodiments of the present application can solve the technical problem of poor focusing results when focusing based on the incident angle.
  • An embodiment of the present application provides a camera focusing method, including:
  • the first camera and the second camera are driven to move to an in-focus position.
  • a camera focusing device including:
  • An acquisition module configured to acquire a first preview image of the object through a first camera of the electronic device, and acquire a second preview image of the object through a second camera of the electronic device;
  • A determining module, configured to determine a first pixel ratio occupied by the subject in the first preview image based on the first preview image, and determine a second pixel ratio occupied by the subject in the second preview image based on the second preview image;
  • a mapping module configured to perform mapping calculations on the first pixel ratio and the second pixel ratio to obtain the target distance of the subject relative to the first camera and the second camera;
  • a driving module, configured to drive the first camera and the second camera to move to an in-focus position according to the target distance.
  • an embodiment of the present application also provides an electronic device, including a processor and a memory.
  • the memory stores a computer program.
  • the processor is used to run the computer program in the memory to implement the camera focusing method provided by the embodiment of the present application.
  • embodiments of the present application also provide a computer-readable storage medium.
  • the computer-readable storage medium stores a computer program.
  • the computer program is suitable for being loaded by a processor to execute any camera focusing method provided by the embodiments of the present application.
  • embodiments of the present application also provide a computer program product, including a computer program.
  • when the computer program is executed by the processor, the processor implements any camera focusing method provided by the embodiments of the present application.
  • A first preview image of the subject is obtained through the first camera of the electronic device, and a second preview image of the subject is obtained through the second camera of the electronic device. Then, a first pixel ratio occupied by the subject in the first preview image is determined based on the first preview image, and a second pixel ratio occupied by the subject in the second preview image is determined based on the second preview image. Next, mapping calculation is performed on the first pixel ratio and the second pixel ratio to obtain the target distance of the subject relative to the first camera and the second camera. Finally, the first camera and the second camera are driven to move to the in-focus position according to the target distance.
  • Because the first pixel ratio occupied by the subject in the first preview image and the second pixel ratio occupied by the subject in the second preview image are relatively easy to determine, performing the mapping calculation on the first pixel ratio and the second pixel ratio yields a comparatively accurate target distance of the subject relative to the first camera and the second camera. The first camera and the second camera can then be driven to the in-focus position according to the target distance, thereby achieving a better focusing result.
  • Figure 1 is a schematic flowchart of a camera focusing method provided by an embodiment of the present application
  • Figure 2 is a schematic diagram of the target distance provided by the embodiment of the present application.
  • Figure 3 is a schematic diagram of the field of view provided by the embodiment of the present application.
  • Figure 4 is a schematic diagram of reference positions and reference pixels provided by the embodiment of the present application.
  • Figure 5 is a schematic diagram of the focus area provided by the embodiment of the present application.
  • Figure 6 is a schematic structural diagram of a camera focusing device provided by an embodiment of the present application.
  • FIG. 7 is a schematic structural diagram of an electronic device provided by an embodiment of the present application.
  • Embodiments of the present application provide a camera focusing method, device, electronic equipment, and computer-readable storage medium.
  • the camera focusing device can be integrated in an electronic device, and the electronic device can be a server, a terminal or other equipment.
  • The server can be an independent physical server, a server cluster or distributed system composed of multiple physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communications, middleware services, domain name services, security services, content delivery network (CDN) services, big data, and artificial intelligence platforms.
  • the terminal can be a smartphone, tablet, laptop, desktop computer, smart speaker, smart watch, etc., but is not limited to this.
  • the terminal and the server can be connected directly or indirectly through wired or wireless communication methods, which is not limited in this application.
  • In order to facilitate the description of the camera focusing method of the present application, the electronic device is described below as a terminal; that is, the terminal is used as the execution subject in the detailed description.
  • FIG. 1 is a schematic flowchart of a camera focusing method provided by an embodiment of the present application.
  • the camera focusing method can include:
  • the terminal may obtain a first preview image of the object through its first camera, and obtain a second preview image through its second camera.
  • Alternatively, when another terminal detects that its shooting client has been started, that terminal obtains the first preview image of the subject through its first camera and the second preview image through its second camera, and then sends the first preview image and the second preview image to the terminal, so that the terminal obtains the first preview image and the second preview image.
  • the method of obtaining the first preview image and the second preview image can be selected according to the actual situation, and is not limited in this embodiment.
  • obtaining the first preview image of the subject through the first camera of the electronic device, and obtaining the second preview image of the subject through the second camera of the electronic device includes:
  • In response to a shooting start instruction, move the first camera of the electronic device to a preset initial position, and move the second camera of the electronic device to a preset initial position;
  • the first preview image of the object is acquired through the first camera located at the preset initial position
  • the second preview image of the object is acquired through the second camera located at the preset initial position.
  • The first camera and the second camera are moved to the preset initial positions so that, after the first preview image and the second preview image are acquired, the first pixel ratio can be determined from the object area corresponding to the subject in the first preview image together with the first preview image, and the second pixel ratio can be determined from the object area corresponding to the subject in the second preview image together with the second preview image. This reduces the situation in which the first pixel ratio and the second pixel ratio cannot be determined because the object areas corresponding to the subject in the first preview image and in the second preview image are too blurry.
  • The terminal may divide the number of pixels corresponding to the subject in the first preview image by the total number of pixels in the first preview image to obtain the first pixel ratio, and divide the number of pixels corresponding to the subject in the second preview image by the total number of pixels in the second preview image to obtain the second pixel ratio.
  • Alternatively, the terminal may divide the number of horizontal pixels corresponding to the subject in the first preview image by the total number of horizontal pixels in the first preview image to obtain the first pixel ratio, and divide the number of horizontal pixels corresponding to the subject in the second preview image by the total number of horizontal pixels in the second preview image to obtain the second pixel ratio.
  • Alternatively, the terminal may divide the number of vertical pixels corresponding to the subject in the first preview image by the total number of vertical pixels in the first preview image to obtain the first pixel ratio, and divide the number of vertical pixels corresponding to the subject in the second preview image by the total number of vertical pixels in the second preview image to obtain the second pixel ratio.
  • Similarly, the terminal may divide the number of diagonal pixels corresponding to the subject in the first preview image by the total number of diagonal pixels in the first preview image to obtain the first pixel ratio, and divide the number of diagonal pixels corresponding to the subject in the second preview image by the total number of diagonal pixels in the second preview image to obtain the second pixel ratio.
  • The first pixel ratio can be used to represent the first field-of-view proportion that the subject occupies in the first field of view of the first camera, and the second pixel ratio can be used to represent the second field-of-view proportion that the subject occupies in the second field of view of the second camera.
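  • As an illustrative sketch (not taken from the application itself), the pixel-ratio variants described above could be computed from a bounding box of the subject's object area; the Box fields and image-size arguments below are assumed names.
```python
import math
from dataclasses import dataclass

@dataclass
class Box:
    # Hypothetical bounding box of the subject's object area, in pixel coordinates.
    left: int
    top: int
    right: int
    bottom: int

def pixel_ratios(box: Box, img_w: int, img_h: int) -> dict:
    """Compute the pixel ratio of the subject for the variants described above."""
    w = box.right - box.left          # subject width in pixels
    h = box.bottom - box.top          # subject height in pixels
    return {
        "area":       (w * h) / (img_w * img_h),                   # total-pixel variant
        "horizontal": w / img_w,                                    # horizontal-pixel variant
        "vertical":   h / img_h,                                    # vertical-pixel variant
        "diagonal":   math.hypot(w, h) / math.hypot(img_w, img_h),  # diagonal-pixel variant
    }

# Example: a subject bounding box inside a 4000x3000 preview image.
print(pixel_ratios(Box(1500, 1000, 2500, 2000), 4000, 3000))
```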
  • the terminal may calculate the target distance based on the first pixel ratio and the second pixel ratio.
  • Target distance refers to the straight-line distance between the subject and the first camera.
  • the target distance can be shown in Figure 2.
  • The value corresponding to m in the figure is the target distance. Since the first camera and the second camera are located on the same plane, the straight-line distance between the subject and the second camera is the same as the straight-line distance between the subject and the first camera; that is, the target distance also refers to the straight-line distance between the subject and the second camera.
  • mapping calculation is performed on the first pixel ratio and the second pixel ratio to obtain the target distance of the object relative to the first camera and the second camera, including:
  • the distance corresponding to the distance parameter is used as the target distance of the object relative to the first camera and the second camera.
  • If the terminal obtains the first pixel ratio by dividing the number of horizontal pixels corresponding to the subject in the first preview image by the total number of horizontal pixels in the first preview image, and obtains the second pixel ratio by dividing the number of horizontal pixels corresponding to the subject in the second preview image by the total number of horizontal pixels in the second preview image, then the first field of view angle is the horizontal field of view angle of the first camera and the second field of view angle is the horizontal field of view angle of the second camera.
  • a preset mapping relationship between the first pixel ratio parameter, the second pixel ratio parameter and the distance parameter is determined based on the horizontal field of view angle of the first camera, the horizontal field of view angle of the second camera and the first distance, Then, the first pixel ratio and the second pixel ratio are substituted into the preset mapping relationship to perform mapping calculation to obtain the distance corresponding to the distance parameter.
  • Similarly, if the terminal obtains the first pixel ratio by dividing the number of vertical pixels corresponding to the subject in the first preview image by the total number of vertical pixels in the first preview image, and obtains the second pixel ratio by dividing the number of vertical pixels corresponding to the subject in the second preview image by the total number of vertical pixels in the second preview image, then the first field of view angle is the vertical field of view angle of the first camera and the second field of view angle is the vertical field of view angle of the second camera.
  • a preset mapping relationship between the first pixel ratio parameter, the second pixel ratio parameter and the distance parameter is determined based on the vertical field of view angle of the first camera, the vertical field of view angle of the second camera and the first distance, Then, the first pixel ratio and the second pixel ratio are substituted into the preset mapping relationship to perform mapping calculation to obtain the distance corresponding to the distance parameter.
  • Likewise, if the terminal obtains the first pixel ratio by dividing the number of diagonal pixels corresponding to the subject in the first preview image by the total number of diagonal pixels in the first preview image, and obtains the second pixel ratio by dividing the number of diagonal pixels corresponding to the subject in the second preview image by the total number of diagonal pixels in the second preview image, then the first field of view angle is the diagonal field of view angle of the first camera and the second field of view angle is the diagonal field of view angle of the second camera.
  • In this case, a preset mapping relationship between the first pixel ratio parameter, the second pixel ratio parameter and the distance parameter is determined based on the diagonal field of view angle of the first camera, the diagonal field of view angle of the second camera and the first distance, and then the first pixel ratio and the second pixel ratio are substituted into the preset mapping relationship to perform the mapping calculation and obtain the distance corresponding to the distance parameter.
  • the horizontal field of view angle, the vertical field of view angle and the diagonal field of view angle can be as shown in Figure 3, for example.
  • the preset mapping relationship between the first pixel ratio parameter, the second pixel ratio parameter and the distance parameter is determined, including:
  • A first preset mapping relationship between the distance parameter, the edge parameter and the first pixel ratio parameter is determined, where the edge parameter represents the distance between the subject and the edge of the second camera;
  • a preset mapping relationship between the first pixel ratio parameter, the second pixel ratio parameter and the distance parameter is determined according to the first preset mapping relationship and the second preset mapping relationship.
  • Angle mapping can be trigonometric function mapping, and trigonometric function mapping can be tangent mapping, cosine mapping, sine mapping, etc.
  • the first angle value may be the tangent value of the first field of view angle
  • the second angle value may be the tangent value of the second field of view angle
  • the first angle value may be the cosine value of the first field of view angle
  • the second angle value may be the cosine value of the second field of view angle
  • the first angle value may be the sine value of the first field of view angle
  • the second angle value may be the sine value of the second field of view angle.
  • the second preset mapping relationship can be expressed by relational expression (2):
  • the obtained preset mapping relationship can be expressed by relational formula (3):
  • In these relational expressions, n represents the edge parameter, m represents the distance parameter, and b represents the second pixel ratio parameter; the first field of view angle, the second field of view angle and the first pixel ratio parameter are likewise denoted by their own symbols.
  • Relational expression (1) and relational expression (2) are derived based on Figure 2, in which BG and n both represent the edge parameter, and CD and W both represent the first distance; in the following, BG is used to represent the edge parameter and CD is used to represent the first distance.
  • G represents the location of the object, where G can represent the center of the object, the left edge of the object, or the right edge of the object, which is not limited in this embodiment.
  • Relational expression (7) is also relational expression (1).
  • the second pixel ratio parameter is:
  • Relational expression (8) is also relational expression (2).
  • The preset mapping relationship may also be determined in advance based on the first preset mapping relationship and the second preset mapping relationship; then, after the terminal obtains the first pixel ratio and the second pixel ratio, it can directly substitute the first pixel ratio and the second pixel ratio into the preset mapping relationship for the mapping calculation, thereby obtaining the target distance.
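  • The relational expressions (1) to (3) themselves are not reproduced above, so the sketch below only works out one plausible geometry consistent with the description (both cameras on the same plane, separated laterally by the first distance, with the horizontal pixel ratio of the subject point G measured from the left edge of each preview image); the formula, variable names and example numbers are assumptions, not the application's relational expressions.
```python
import math

def estimate_distance(a: float, b: float, fov1_deg: float, fov2_deg: float, w: float) -> float:
    """Illustrative mapping from two horizontal pixel ratios to a distance m.

    Assumed geometry: camera 1 sits at x = 0 and camera 2 at x = w, both facing
    the same direction. At distance m, camera i's view spans 2 * m * tan(fov_i / 2)
    horizontally, and the subject point G appears at pixel ratio a (camera 1) or
    b (camera 2) measured from the left image edge. Eliminating G's lateral
    offset between the two views leaves one equation in m.
    """
    t1 = math.tan(math.radians(fov1_deg) / 2.0)  # half-FOV tangent, camera 1
    t2 = math.tan(math.radians(fov2_deg) / 2.0)  # half-FOV tangent, camera 2
    denom = (2.0 * a - 1.0) * t1 - (2.0 * b - 1.0) * t2
    if abs(denom) < 1e-9:
        raise ValueError("pixel ratios do not constrain the distance in this geometry")
    return w / denom

# Example with assumed numbers: 80 and 60 degree horizontal FOVs, 1.5 cm baseline.
print(estimate_distance(a=0.62, b=0.48, fov1_deg=80.0, fov2_deg=60.0, w=1.5))
```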
  • The first field of view angle and the second field of view angle can each be a horizontal field of view angle, a vertical field of view angle, or a diagonal field of view angle; that is, the first field of view angle and the second field of view angle may be of different field of view types. Therefore, in other embodiments, determining the preset mapping relationship between the first pixel ratio parameter, the second pixel ratio parameter and the distance parameter according to the first field of view angle, the second field of view angle and the first distance includes:
  • the first pixel ratio and the second pixel ratio are substituted into a preset mapping relationship corresponding to the field of view type that matches the target field of view type to perform mapping calculation to obtain the distance corresponding to the distance parameter.
  • For example, when the horizontal field of view type is used, the angle values in relational expression (3) are the tangent values of the horizontal field of view angles of the first camera and the second camera; when the vertical field of view type is used, they are the tangent values of the vertical field of view angles of the first camera and the second camera.
  • The target field of view type of the first pixel ratio is the same as the target field of view type of the second pixel ratio; that is, when the first pixel ratio is a horizontal field of view pixel ratio, the second pixel ratio is also a horizontal field of view pixel ratio, and when the first pixel ratio is a vertical field of view pixel ratio, the second pixel ratio is also a vertical field of view pixel ratio.
  • In some embodiments, the target distance can also be calculated using the preset mapping relationships of multiple field of view types simultaneously. In this case, the first pixel ratio includes first pixel ratios of multiple field of view types, and the second pixel ratio includes second pixel ratios of multiple field of view types;
  • the first pixel ratio and the second pixel ratio are substituted into the preset mapping relationship corresponding to the field of view type that matches the target field of view type to perform mapping calculations to obtain the distance corresponding to the distance parameter, including:
  • The first pixel ratio and the second pixel ratio of each target field of view type are substituted into the preset mapping relationship corresponding to the field of view type that matches that target field of view type for mapping calculation, so that a first target distance is obtained for each field of view type; the distance corresponding to the distance parameter is then determined according to each first target distance.
  • For example, when the number of horizontal pixels corresponding to the subject in the first preview image is divided by the total number of horizontal pixels in the first preview image, the obtained first pixel ratio is the first pixel ratio corresponding to the horizontal field of view type of the first camera; when the number of horizontal pixels corresponding to the subject in the second preview image is divided by the total number of horizontal pixels in the second preview image, the obtained second pixel ratio is the second pixel ratio corresponding to the horizontal field of view type of the second camera. Similarly, when the number of vertical pixels corresponding to the subject in the first preview image is divided by the total number of vertical pixels in the first preview image, the obtained first pixel ratio is the first pixel ratio corresponding to the vertical field of view type of the first camera; when the number of vertical pixels corresponding to the subject in the second preview image is divided by the total number of vertical pixels in the second preview image, the obtained second pixel ratio is the second pixel ratio corresponding to the vertical field of view type of the second camera.
  • After obtaining the first pixel ratios and the second pixel ratios of the various field of view types, the terminal substitutes the first pixel ratio and the second pixel ratio of each target field of view type into the preset mapping relationship corresponding to the field of view type that matches that target field of view type, and performs the mapping calculation to obtain the corresponding first target distance.
  • When the target field of view type is the horizontal field of view type, the first pixel ratio corresponding to the horizontal field of view type of the first camera and the second pixel ratio corresponding to the horizontal field of view type of the second camera are substituted into the preset mapping relationship corresponding to the horizontal field of view type, and the mapping calculation is performed to obtain the first target distance.
  • When the target field of view type is the vertical field of view type, the first pixel ratio corresponding to the vertical field of view type of the first camera and the second pixel ratio corresponding to the vertical field of view type of the second camera are substituted into the preset mapping relationship corresponding to the vertical field of view type, and the mapping calculation is performed to obtain the first target distance.
  • When the target field of view type is the diagonal field of view type, the first pixel ratio corresponding to the diagonal field of view type of the first camera and the second pixel ratio corresponding to the diagonal field of view type of the second camera are substituted into the preset mapping relationship corresponding to the diagonal field of view type, and the mapping calculation is performed to obtain the first target distance.
  • the distance corresponding to the distance parameter is determined according to each first target distance, including:
  • Determining the distance corresponding to the distance parameter according to each first target distance may also include: multiplying each first target distance by a corresponding weight, and taking the average value of the multiplication results as the distance corresponding to the distance parameter.
  • In this way, the first target distances are first calculated based on the first pixel ratios and the second pixel ratios of multiple field of view types, and the target distance is then calculated based on these first target distances, which makes the obtained target distance more accurate.
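  • A minimal sketch of this combination step follows; the weights, which the text does not specify, are assumed to be equal by default, and the function name is illustrative.
```python
def combine_first_target_distances(distances_by_type: dict[str, float],
                                   weights_by_type: dict[str, float] | None = None) -> float:
    """Multiply each first target distance by a weight and average the products."""
    if weights_by_type is None:
        weights_by_type = {t: 1.0 for t in distances_by_type}  # assumed equal weights
    products = [d * weights_by_type[t] for t, d in distances_by_type.items()]
    return sum(products) / len(products)

# Example: first target distances (in cm) from three field-of-view types.
print(combine_first_target_distances({"horizontal": 52.0, "vertical": 50.5, "diagonal": 51.2}))
```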
  • mapping calculation on the first pixel ratio and the second pixel ratio to obtain the target distance of the object relative to the first camera and the second camera includes:
  • the calibrated distance corresponding to the first calibrated pixel ratio that matches the first pixel ratio and the second calibrated pixel ratio that matches the second pixel ratio is used as the target distance of the object relative to the first camera and the second camera.
  • the first calibrated pixel ratio that matches the first pixel ratio may refer to the first calibrated pixel ratio that has the smallest difference with the first pixel ratio.
  • The second calibrated pixel ratio that matches the second pixel ratio may refer to the second calibrated pixel ratio that has the smallest difference from the second pixel ratio.
  • the first calibrated pixel ratio is the pixel ratio obtained based on the first calibration image of the test object located at the position of the calibration distance
  • the second calibrated pixel ratio is the pixel ratio obtained based on the second calibration image of the test object located at the position of the calibration distance.
  • The first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance are stored in association in a preset mapping list. After the first pixel ratio and the second pixel ratio are obtained, the preset mapping list can be searched according to the first pixel ratio and the second pixel ratio to find the calibration distance corresponding to the first calibrated pixel ratio that matches the first pixel ratio and the second calibrated pixel ratio that matches the second pixel ratio.
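  • As an illustrative sketch of such a lookup (the list structure and the combined nearest-match rule are assumptions consistent with the matching described above):
```python
def lookup_calibrated_distance(a: float, b: float,
                               mapping_list: list[tuple[float, float, float]]) -> float:
    """Return the calibration distance whose stored pixel-ratio pair is closest to (a, b).

    mapping_list entries are (first_calibrated_ratio, second_calibrated_ratio,
    calibration_distance), stored in association as described above.
    """
    best = min(mapping_list, key=lambda entry: abs(entry[0] - a) + abs(entry[1] - b))
    return best[2]

# Example preset mapping list built at calibration time (ratios, distance in cm).
preset = [(0.80, 0.70, 8.0), (0.55, 0.45, 15.0), (0.20, 0.15, 100.0)]
print(lookup_calibrated_distance(0.52, 0.47, preset))  # -> 15.0
```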
  • the first pixel ratio and the second pixel ratio are substituted into the fitting curve for fitting to obtain the fitting distance.
  • the fitting curve is the curve fitted between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance;
  • the fitting distance is taken as the target distance between the object and the first camera and the second camera.
  • That is, the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance are fitted in advance to obtain a fitting curve; then, after obtaining the first pixel ratio and the second pixel ratio, the terminal substitutes the first pixel ratio and the second pixel ratio into the fitting curve to obtain the target distance.
  • the object area corresponding to the subject in the first preview image may have a deflection angle relative to the reference position on the first preview image. For example, as shown in FIG. 4 , using the left edge of the first preview image as a reference position, the corresponding object area of the subject in the first preview image has a deflection angle relative to the left edge of the first preview image.
  • the fitting curve includes fitting curves with different calibrated deflection angles
  • the first pixel ratio and the second pixel ratio are substituted into the first target fitting curve for fitting, and the fitting distance is obtained.
  • a calibrated deflection angle corresponds to a fitting curve, so that the fitting distance can be obtained more accurately, and thus the target distance can be obtained more accurately.
  • The calibrated deflection angle that matches the deflection angle may refer to the calibrated deflection angle that has the smallest difference from the deflection angle among the multiple calibrated deflection angles; after the fitting curve corresponding to that calibrated deflection angle is selected, it is used as the first target fitting curve.
  • It should be understood that the terminal can also obtain the deflection angle of the object area corresponding to the subject in the second preview image relative to the reference position on the second preview image, and then select the fitting curve corresponding to the calibrated deflection angle that matches that deflection angle.
  • obtaining the object area corresponding to the subject in the first preview image and the deflection angle relative to the reference position on the first preview image may include:
  • if the distances from the two reference pixels to the reference position are equal, the deflection angle of the object area corresponding to the subject in the first preview image relative to the reference position on the first preview image is zero;
  • otherwise, the deflection angle of the object area corresponding to the subject in the first preview image relative to the reference position on the first preview image is determined based on the distances between the two reference pixels and the reference position.
  • any two reference pixels on the same edge can be pixel 1 and pixel 2 respectively.
  • the distances from pixel 1 and pixel 2 to the reference position are different.
  • the reference pixel and reference position can be set according to the actual situation, and are not limited in this embodiment.
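  • One plausible way to turn the two reference-pixel distances into a deflection angle (purely an illustration, since the application does not give the exact formula) is to take the arctangent of the distance difference over the pixels' separation along the shared edge:
```python
import math

def deflection_angle_deg(d1: float, d2: float, edge_separation: float) -> float:
    """Estimate the deflection angle of an object-area edge relative to a reference position.

    d1, d2: distances from reference pixel 1 and reference pixel 2 to the reference
    position (e.g. the left image edge); edge_separation: distance between the two
    reference pixels along the edge they share. Equal distances give zero deflection.
    """
    if d1 == d2:
        return 0.0
    return math.degrees(math.atan2(abs(d2 - d1), edge_separation))

# Example: the reference pixels are 300 px apart along the edge and lie
# 120 px and 160 px from the left image edge.
print(deflection_angle_deg(120.0, 160.0, 300.0))  # about 7.6 degrees
```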
  • In some embodiments, the fitting curve includes fitting curves for different calibrated focus areas.
  • the first pixel ratio and the second pixel ratio are substituted into the second target fitting curve for fitting to obtain the fitting distance.
  • the focus area in the first preview image may include an object area corresponding to the subject in the first preview image.
  • The calibrated focus area that matches the focus area may refer to the calibrated focus area that has the largest overlapping area with the focus area. It should be understood that the terminal can also determine the focus area in the second preview image, and then select the second target fitting curve based on the focus area in the second preview image.
  • a calibrated focus area corresponds to a fitting curve, so that the fitting distance can be obtained more accurately, and thus the target distance can be obtained more accurately.
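  • A small sketch of this largest-overlap matching rule, assuming the focus areas are axis-aligned rectangles (the rectangle representation and names are assumptions):
```python
from typing import NamedTuple

class Rect(NamedTuple):
    left: float
    top: float
    right: float
    bottom: float

def overlap_area(a: Rect, b: Rect) -> float:
    """Area of the intersection of two axis-aligned rectangles (0 if disjoint)."""
    w = min(a.right, b.right) - max(a.left, b.left)
    h = min(a.bottom, b.bottom) - max(a.top, b.top)
    return max(w, 0.0) * max(h, 0.0)

def match_calibrated_focus_area(focus: Rect, calibrated: dict[str, Rect]) -> str:
    """Pick the calibrated focus area that overlaps the current focus area the most."""
    return max(calibrated, key=lambda name: overlap_area(focus, calibrated[name]))

# Example: a focus area compared against two calibrated areas from an 8x6 grid.
areas = {"a": Rect(0, 0, 500, 500), "b": Rect(400, 0, 900, 500)}
print(match_calibrated_focus_area(Rect(350, 100, 600, 400), areas))  # -> "b"
```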
  • the fitting curve includes fitting curves of different calibrated viewing angle types; the first pixel ratio and the second pixel ratio are substituted into the fitting curve for fitting, and the fitting distance is obtained, including:
  • the first pixel ratio and the second pixel ratio are substituted into the third target fitting curve for fitting, and the fitting distance is obtained.
  • One field of view type corresponds to one fitting curve.
  • That is, the terminal selects the third target fitting curve from the fitting curves according to the target field of view type of the first pixel ratio or the second pixel ratio, and then substitutes the first pixel ratio and the second pixel ratio into the third target fitting curve to obtain the fitting distance, thereby improving the accuracy of the fitting distance.
  • The fitting curve can be a curve pre-stored in the electronic device before the electronic device leaves the factory; that is, before the electronic device leaves the factory, the first camera and the second camera of the electronic device are calibrated to obtain the fitting curve.
  • Fitting is performed according to the first calibration pixel ratio, the second calibration pixel ratio and each calibration distance corresponding to each calibration distance, and a fitting curve between the first calibration pixel ratio, the second calibration pixel ratio and the calibration distance is obtained.
  • Specifically, the terminal can first obtain the test distances, select a calibration distance from the test distances, acquire the first calibration image of the test object located at that calibration distance through the first camera and the second calibration image of the test object at that calibration distance through the second camera, and then return to the step of selecting a calibration distance from the test distances, stopping only when all test distances have been processed, thereby obtaining the first calibration image and the second calibration image corresponding to each calibration distance.
  • For example, the terminal can first select the calibration distance of 8 cm from the test distances, acquire the first calibration image of the test object at 8 cm through the first camera and the second calibration image of the test object at 8 cm through the second camera; then select the calibration distance of 15 cm, acquire the first calibration image of the test object at 15 cm through the first camera and the second calibration image of the test object at 15 cm through the second camera; and so on, until the first calibration image and the second calibration image of the test object at 5 m are obtained.
  • After the terminal obtains the first calibration image and the second calibration image corresponding to each calibration distance, it determines the first calibrated pixel ratio and the second calibrated pixel ratio corresponding to each calibration distance based on those images. Finally, the terminal performs fitting based on the first calibrated pixel ratio and the second calibrated pixel ratio corresponding to each calibration distance, together with each calibration distance, to obtain the fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance.
  • the method of determining the first calibrated pixel ratio and the second calibrated pixel ratio may refer to the method of determining the first pixel ratio and the second pixel ratio, which will not be described again in this embodiment.
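  • A minimal calibration-and-fit sketch under stated assumptions: the fitted form 1/m = c0 + c1*a + c2*b (linear in the two calibrated pixel ratios) is only an example, since the application does not fix the functional form of the fitting curve, and the numbers below are made up.
```python
import numpy as np

# Calibration data collected as described above: for each calibration distance,
# the first and second calibrated pixel ratios measured from the calibration images
# (all values are illustrative, not from the application).
calib_distances_cm = np.array([8.0, 15.0, 30.0, 100.0, 500.0])
first_ratios = np.array([0.90, 0.62, 0.35, 0.12, 0.03])
second_ratios = np.array([0.80, 0.55, 0.30, 0.10, 0.025])

# Assumed model: inverse distance is approximately linear in the two pixel ratios.
X = np.column_stack([np.ones_like(first_ratios), first_ratios, second_ratios])
coef, *_ = np.linalg.lstsq(X, 1.0 / calib_distances_cm, rcond=None)

def fitted_distance(a: float, b: float) -> float:
    """Evaluate the fitted curve for a measured pixel-ratio pair (a, b)."""
    inv_m = coef[0] + coef[1] * a + coef[2] * b
    return 1.0 / inv_m

# Evaluate the fit near the 15 cm calibration point.
print(fitted_distance(0.62, 0.55))
```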
  • Determining, based on the first calibration image, the first calibrated pixel ratio occupied by the test object in the field of view of the first camera, and determining, based on the second calibration image, the second calibrated pixel ratio occupied by the test object in the field of view of the second camera, includes:
  • According to each first calibrated focus area, determine the first calibrated pixel ratio occupied by the test object in each first calibrated focus area, and according to each second calibrated focus area, determine the second calibrated pixel ratio occupied by the test object in each second calibrated focus area;
  • Fitting is performed based on the first calibrated pixel ratio, the second calibrated pixel ratio and each calibration distance corresponding to each calibration distance, and a fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance is obtained, including :
  • Fitting is performed according to the first calibrated pixel ratio corresponding to each first calibrated focus area at each calibration distance, the second calibrated pixel ratio corresponding to each second calibrated focus area at each calibration distance, and each calibration distance, to obtain, for each pair of first calibrated focus area and second calibrated focus area, the fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance.
  • the calibration distances are 5cm, 30cm and 1m.
  • the first calibration image and the second calibration image can be divided into 8×6 areas, as shown in Figure 5.
  • Based on the first calibration image at 5 cm, determine the first calibrated pixel ratio a1 of the first calibrated focus area a at 5 cm, and based on the second calibration image at 5 cm, determine the second calibrated pixel ratio b1 of the second calibrated focus area b at 5 cm. Based on the first calibration image at 30 cm, determine the first calibrated pixel ratio a2 of the first calibrated focus area a at 30 cm, and based on the second calibration image at 30 cm, determine the second calibrated pixel ratio b2 of the second calibrated focus area b at 30 cm. Based on the first calibration image at 1 m, determine the first calibrated pixel ratio a3 of the first calibrated focus area a at 1 m, and based on the second calibration image at 1 m, determine the second calibrated pixel ratio b3 of the second calibrated focus area b at 1 m.
  • Then, fitting is performed on the first calibrated pixel ratio a1, the second calibrated pixel ratio b1 and 5 cm, the first calibrated pixel ratio a2, the second calibrated pixel ratio b2 and 30 cm, and the first calibrated pixel ratio a3, the second calibrated pixel ratio b3 and 1 m, to obtain, for the first calibrated focus area a and the second calibrated focus area b, the fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance.
  • the test object can be a card with a repeating pattern.
  • the test object can be a card with a white grid on the back.
  • the test object may fill the first calibration image and the second calibration image.
  • the first calibration image includes the first calibration image of the test object at different calibrated deflection angles for each calibration distance
  • the second calibration image includes the second calibration image of the test object at different calibrated deflection angles for each calibration distance.
  • According to the first calibration image at each calibrated deflection angle, the first calibrated pixel ratio of the test object in the first calibration image is determined for that calibrated deflection angle, and according to the second calibration image at each calibrated deflection angle, the second calibrated pixel ratio of the test object in the second calibration image is determined for that calibrated deflection angle;
  • Fitting is performed based on the first calibrated pixel ratio, the second calibrated pixel ratio and each calibration distance corresponding to each calibration distance, and a fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance is obtained, including :
  • Fitting is performed according to the first calibrated pixel ratio corresponding to each calibrated deflection angle at each calibration distance, the second calibrated pixel ratio corresponding to each calibrated deflection angle at each calibration distance, and each calibration distance, to obtain, for each calibrated deflection angle, the fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance.
  • Specifically, the terminal can first obtain the test distances, select a calibration distance from the test distances, acquire the first calibration image of the test object at that calibration distance through the first camera and the second calibration image of the test object at that calibration distance through the second camera, then rotate the test object along at least one of the X-axis direction, the Y-axis direction and the Z-axis direction to each calibrated rotation angle and continue to acquire the first calibration image through the first camera and the second calibration image through the second camera at that calibration distance, until the first calibration image and the second calibration image corresponding to all calibrated rotation angles at that calibration distance are obtained. The terminal then returns to the step of selecting a calibration distance from the test distances, stopping only when all test distances have been processed, thereby obtaining the first calibration image and the second calibration image corresponding to each calibration distance and each calibrated rotation angle.
  • For example, the calibration distances are 5 cm and 30 cm, and the calibrated rotation angles are 0°, 30° and 60°.
  • the first calibration image c1 of the test object with a calibrated rotation angle of 0° at 5 cm is acquired through the first camera
  • the second calibration image d1 of the test object with a calibrated rotation angle of 0° at 5 cm is acquired through the second camera.
  • the first calibration image c11 of the test object with a calibrated rotation angle of 30° at 5cm is acquired through the first camera
  • the second calibration image d11 of the test object with a calibrated rotation angle of 30° at 5cm is acquired through the second camera.
  • the first calibration image c111 of the test object with a calibrated rotation angle of 60° at 5cm is acquired through the first camera
  • the second calibration image d111 of the test object with a calibrated rotation angle of 60° at 5cm is acquired through the second camera.
  • The first calibration image c2 of the test object with a calibrated rotation angle of 0° at 30 cm is acquired through the first camera, and the second calibration image d2 of the test object with a calibrated rotation angle of 0° at 30 cm is acquired through the second camera.
  • the first calibration image c21 of the test object with a calibrated rotation angle of 30° at 30cm is acquired through the first camera
  • the second calibration image d21 of the test object with a calibrated rotation angle of 30° at 30cm is acquired through the second camera.
  • the first calibration image c211 of the test object with a calibrated rotation angle of 60° at 30 cm is acquired through the first camera
  • the second calibration image d211 of the test object with a calibrated rotation angle of 60° at 30 cm is acquired through the second camera.
  • Based on the first calibration image c1 and the first calibration image c2, the terminal determines the first calibrated pixel ratio v1 of the test object at 5 cm and the first calibrated pixel ratio v2 at 30 cm for the calibrated rotation angle of 0°, and based on the second calibration image d1 and the second calibration image d2, determines the second calibrated pixel ratio y1 of the test object at 5 cm and the second calibrated pixel ratio y2 at 30 cm for the calibrated rotation angle of 0°.
  • Based on the first calibration image c11 and the first calibration image c21, the terminal determines the first calibrated pixel ratio v11 of the test object at 5 cm and the first calibrated pixel ratio v21 at 30 cm for the calibrated rotation angle of 30°, and based on the second calibration image d11 and the second calibration image d21, determines the second calibrated pixel ratio y11 of the test object at 5 cm and the second calibrated pixel ratio y21 at 30 cm for the calibrated rotation angle of 30°.
  • Based on the first calibration image c111 and the first calibration image c211, the terminal determines the first calibrated pixel ratio v111 of the test object at 5 cm and the first calibrated pixel ratio v211 at 30 cm for the calibrated rotation angle of 60°, and based on the second calibration image d111 and the second calibration image d211, determines the second calibrated pixel ratio y111 of the test object at 5 cm and the second calibrated pixel ratio y211 at 30 cm for the calibrated rotation angle of 60°.
  • Fitting is then performed based on the first calibrated pixel ratio v11, the second calibrated pixel ratio y11 and 5 cm, together with the first calibrated pixel ratio v21, the second calibrated pixel ratio y21 and 30 cm, to obtain the fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance corresponding to the calibrated rotation angle of 30°; likewise, fitting is performed based on the first calibrated pixel ratio v111, the second calibrated pixel ratio y111 and 5 cm, together with the first calibrated pixel ratio v211, the second calibrated pixel ratio y211 and 30 cm, to obtain the fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance corresponding to the calibrated rotation angle of 60°.
  • In some embodiments, determining, according to the first calibration image, the first calibrated pixel ratio occupied by the test object in the first calibration image, and determining, according to the second calibration image, the second calibrated pixel ratio occupied by the test object in the second calibration image, includes:
  • determining, according to the first calibration image, the first calibrated pixel ratio of the test object in the first calibration image for different calibrated field of view types, and determining, according to the second calibration image, the second calibrated pixel ratio of the test object in the second calibration image for different calibrated field of view types;
  • Fitting is performed based on the first calibrated pixel ratio, the second calibrated pixel ratio and each calibration distance corresponding to each calibration distance, and a fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance is obtained, including :
  • Fitting is performed according to the first calibrated pixel ratio and the second calibrated pixel ratio corresponding to each calibrated field of view type at each calibration distance, together with each calibration distance, to obtain, for each calibrated field of view type, the fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance.
  • the calibration distance is 5cm and 30cm
  • the calibrated field of view type includes horizontal field of view type and vertical field of view type.
  • For example, based on the first calibration image at 30 cm, the first calibrated pixel ratio 5 of the test object for the horizontal field of view type and the first calibrated pixel ratio 6 of the test object for the vertical field of view type are calculated, and based on the second calibration image at 30 cm, the second calibrated pixel ratio 7 of the test object for the horizontal field of view type and the second calibrated pixel ratio 8 of the test object for the vertical field of view type are calculated.
  • Fitting the first calibrated pixel ratios and the second calibrated pixel ratios of the horizontal field of view type at each calibration distance, together with each calibration distance, yields the fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance corresponding to the horizontal field of view type; fitting those of the vertical field of view type yields the fitting curve corresponding to the vertical field of view type.
  • After the terminal obtains the target distance, it can look up the driving current corresponding to the target distance, and then drive the first camera and the second camera to move to the in-focus position based on the driving current.
  • The in-focus position refers to the position of the first camera and the second camera at which a clear image can be obtained.
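  • A sketch of this final step under assumed data: a lookup table from target distance to voice-coil driving current with linear interpolation, plus a placeholder actuator callback; the table values and interfaces are illustrative, not taken from the application.
```python
import bisect

# Assumed calibration table: target distance (cm) -> driving current (mA).
DISTANCE_CM = [8.0, 15.0, 30.0, 100.0, 500.0]
CURRENT_MA = [85.0, 70.0, 55.0, 40.0, 30.0]

def driving_current_for(distance_cm: float) -> float:
    """Linearly interpolate the driving current for a target distance."""
    d = min(max(distance_cm, DISTANCE_CM[0]), DISTANCE_CM[-1])  # clamp to table range
    i = bisect.bisect_left(DISTANCE_CM, d)
    if i == 0:
        return CURRENT_MA[0]
    d0, d1 = DISTANCE_CM[i - 1], DISTANCE_CM[i]
    c0, c1 = CURRENT_MA[i - 1], CURRENT_MA[i]
    return c0 + (c1 - c0) * (d - d0) / (d1 - d0)

def focus_both_cameras(distance_cm: float, drive) -> None:
    """Drive both camera modules toward the in-focus position for the target distance.

    `drive` is a placeholder callback (camera_id, current_mA) supplied by the platform.
    """
    current = driving_current_for(distance_cm)
    drive("camera1", current)
    drive("camera2", current)

focus_both_cameras(52.0, lambda cam, ma: print(f"{cam}: {ma:.1f} mA"))
```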
  • For phase detection auto focus, there are high requirements on the camera sensor, and some terminals cannot focus through phase detection.
  • For TOF (Time of Flight) based focusing, additional costs are required.
  • In the embodiments of the present application, by contrast, focusing can be achieved through the first camera and the second camera, which is easy to implement and has low cost.
  • The first preview image of the subject is first obtained through the first camera of the electronic device, and the second preview image of the subject is obtained through the second camera of the electronic device. Then, a first pixel ratio occupied by the subject in the first preview image is determined based on the first preview image, and a second pixel ratio occupied by the subject in the second preview image is determined based on the second preview image. Next, mapping calculation is performed on the first pixel ratio and the second pixel ratio to obtain the target distance of the subject relative to the first camera and the second camera. Finally, the first camera and the second camera are driven to move to the in-focus position according to the target distance.
  • Because the first pixel ratio occupied by the subject in the first preview image and the second pixel ratio occupied by the subject in the second preview image are relatively easy to determine, performing the mapping calculation on the first pixel ratio and the second pixel ratio yields a comparatively accurate target distance of the subject relative to the first camera and the second camera, and the first camera and the second camera can then be driven to the in-focus position according to the target distance, thereby achieving a better focusing result.
  • the embodiment of the present application also provides a device based on the above-mentioned camera focusing method.
  • the meanings of the nouns are the same as those in the above camera focusing method.
  • the camera focusing device may include:
  • the acquisition module 601 is configured to acquire the first preview image of the object through the first camera of the electronic device, and acquire the second preview image of the object through the second camera of the electronic device;
  • The determining module 602 is configured to determine a first pixel ratio occupied by the subject in the first preview image according to the first preview image, and determine a second pixel ratio occupied by the subject in the second preview image according to the second preview image;
  • Mapping module 603 is used to perform mapping calculations on the first pixel ratio and the second pixel ratio to obtain the target distance of the subject relative to the first camera and the second camera;
  • the driving module 604 is used to drive the first camera and the second camera to move to the in-focus position according to the target distance.
  • mapping module 603 is specifically used to perform:
  • the distance corresponding to the distance parameter is used as the target distance of the object relative to the first camera and the second camera.
  • mapping module 603 is specifically used to perform:
  • A first preset mapping relationship between the distance parameter, the edge parameter and the first pixel ratio parameter is determined, where the edge parameter represents the distance between the subject and the edge of the second camera;
  • a preset mapping relationship between the first pixel ratio parameter, the second pixel ratio parameter and the distance parameter is determined according to the first preset mapping relationship and the second preset mapping relationship.
  • mapping module 603 is specifically used to perform:
  • the first pixel ratio and the second pixel ratio are substituted into a preset mapping relationship corresponding to the field of view type that matches the target field of view type to perform mapping calculation to obtain the distance corresponding to the distance parameter.
  • the first pixel ratio includes first pixel ratios of multiple viewing angle types
  • the second pixel ratio includes second pixel ratios of multiple viewing angle types.
  • mapping module 603 is specifically used to perform:
  • The first pixel ratio and the second pixel ratio of each target field of view type are substituted into the preset mapping relationship corresponding to the field of view type that matches that target field of view type for mapping calculation, so that a first target distance is obtained for each field of view type; the distance corresponding to the distance parameter is then determined according to each first target distance.
  • mapping module 603 is specifically used to perform:
  • mapping module 603 is specifically used to perform:
  • the calibrated distance corresponding to the first calibrated pixel ratio that matches the first pixel ratio and the second calibrated pixel ratio that matches the second pixel ratio is used as the target distance of the object relative to the first camera and the second camera.
  • mapping module 603 is specifically used to perform:
  • the fitting curve is the curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibrated distance;
  • the fitting distance is taken as the target distance of the object relative to the first camera and the second camera.
  • the fitting curve includes fitting curves for different calibrated deflection angles.
  • mapping module 603 is specifically used to perform:
  • the first pixel ratio and the second pixel ratio are substituted into the first target fitting curve for fitting, and the fitting distance is obtained.
  • the fitting curve includes fitting curves for different calibrated focus areas.
  • mapping module 603 is specifically used to perform:
  • the first pixel ratio and the second pixel ratio are substituted into the second target fitting curve for fitting to obtain the fitting distance.
  • the fitting curve includes fitting curves of different calibrated field angle types.
  • mapping module 603 is specifically used to perform:
  • the first pixel ratio and the second pixel ratio are substituted into the third target fitting curve for fitting, and the fitting distance is obtained.
  • the camera focusing device also includes:
  • Calibration module used to perform:
  • Fitting is performed according to the first calibration pixel ratio, the second calibration pixel ratio and each calibration distance corresponding to each calibration distance, and a fitting curve between the first calibration pixel ratio, the second calibration pixel ratio and the calibration distance is obtained.
  • the calibration module is specifically used to perform:
  • According to each first calibrated focus area, determine the first calibrated pixel ratio occupied by the test object in each first calibrated focus area, and according to each second calibrated focus area, determine the second calibrated pixel ratio occupied by the test object in each second calibrated focus area;
  • Fitting is performed according to the first calibrated pixel ratio corresponding to each first calibrated focus area at each calibration distance, the second calibrated pixel ratio corresponding to each second calibrated focus area at each calibration distance, and each calibration distance, to obtain, for each pair of first calibrated focus area and second calibrated focus area, the fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance.
  • the first calibration image includes a first calibration image of the test object at different calibrated deflection angles for each calibration distance
  • the second calibration image includes a second calibration image of the test object at different calibrated deflection angles for each calibration distance;
  • the calibration module is specifically used to perform:
  • According to the first calibration image at each calibrated deflection angle, determine the first calibrated pixel ratio occupied by the test object in the first calibration image for that calibrated deflection angle, and according to the second calibration image at each calibrated deflection angle, determine the second calibrated pixel ratio occupied by the test object in the second calibration image for that calibrated deflection angle;
  • Fitting is performed according to the first calibrated pixel ratio corresponding to each calibrated deflection angle at each calibration distance, the second calibrated pixel ratio corresponding to each calibrated deflection angle at each calibration distance, and each calibration distance, to obtain, for each calibrated deflection angle, the fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance.
  • Optionally, the calibration module is specifically configured to perform:
  • determining, according to the first calibration image, the first calibrated pixel ratio of the test object in the first calibration image for different calibrated field-of-view angle types, and determining, according to the second calibration image, the second calibrated pixel ratio of the test object in the second calibration image for different calibrated field-of-view angle types;
  • performing fitting according to the first calibrated pixel ratio and the second calibrated pixel ratio corresponding to each calibrated field-of-view angle type at each calibration distance, and each calibration distance, to obtain, for each calibrated field-of-view angle type, the fitting curve between the first calibrated pixel ratio, the second calibrated pixel ratio and the calibration distance (a sketch of the per-type pixel ratios follows below).
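The per-field-of-view-angle-type pixel ratios can be computed from the bounding box of the object region, following the horizontal / vertical / diagonal ratio definitions given in the method embodiments. The sketch below assumes the bounding box of the object region is already known; the function name is invented.

```python
# A minimal sketch of the three pixel-ratio types: object extent divided by
# image extent along the horizontal, vertical and diagonal directions.

import math

def pixel_ratios_by_fov_type(bbox, image_width, image_height):
    """bbox = (x0, y0, x1, y1) of the object region in pixel coordinates."""
    x0, y0, x1, y1 = bbox
    obj_w, obj_h = x1 - x0, y1 - y0
    return {
        "horizontal": obj_w / image_width,
        "vertical": obj_h / image_height,
        "diagonal": math.hypot(obj_w, obj_h) / math.hypot(image_width, image_height),
    }
```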
  • Optionally, the acquisition module 601 is specifically configured to perform:
  • obtaining a shooting start instruction;
  • moving, according to the shooting start instruction, the first camera of the electronic device to a preset initial position, and moving the second camera of the electronic device to the preset initial position;
  • acquiring the first preview image of the object through the first camera located at the preset initial position, and acquiring the second preview image of the object through the second camera located at the preset initial position (an acquisition-flow sketch follows below).
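A minimal sketch of the acquisition flow follows: on a shooting start instruction, both cameras are first driven to a preset initial position and only then asked for preview frames. The `Camera` interface (`move_to`, `capture`) and the preset position value are hypothetical stand-ins for the platform's camera and actuator drivers.

```python
# A minimal acquisition-flow sketch under the stated interface assumptions.

PRESET_INITIAL_POSITION = 0  # lens position code; the value is an assumption

def acquire_previews(first_camera, second_camera):
    """Move both cameras to the preset initial position, then capture previews."""
    first_camera.move_to(PRESET_INITIAL_POSITION)
    second_camera.move_to(PRESET_INITIAL_POSITION)
    first_preview = first_camera.capture()
    second_preview = second_camera.capture()
    return first_preview, second_preview
```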
  • During specific implementation, each of the above modules may be implemented as an independent entity, or combined arbitrarily and implemented as one or several entities.
  • For the specific implementations and corresponding beneficial effects of the above modules, reference may be made to the foregoing method embodiments; details are not repeated here.
  • An embodiment of the present application also provides an electronic device.
  • The electronic device may be a server, a terminal, or the like. FIG. 7 shows a schematic structural diagram of the electronic device involved in the embodiments of the present application. Specifically:
  • the electronic device may include components such as a processor 701 having one or more processing cores, a memory 702 having one or more computer-readable storage media, a power supply 703, and an input unit 704.
  • Those skilled in the art will understand that the structure shown in FIG. 7 does not constitute a limitation on the electronic device, which may include more or fewer components than shown, combine certain components, or adopt a different arrangement of components. Specifically:
  • The processor 701 is the control center of the electronic device. It connects the various parts of the entire electronic device through various interfaces and lines, and performs the various functions of the electronic device and processes data by running or executing the computer programs and/or modules stored in the memory 702 and invoking the data stored in the memory 702.
  • Optionally, the processor 701 may include one or more processing cores; preferably, the processor 701 may integrate an application processor and a modem processor, where the application processor mainly handles the operating system, user interface, application programs and the like, and the modem processor mainly handles wireless communication. It can be understood that the modem processor may also not be integrated into the processor 701.
  • The memory 702 may be used to store computer programs and modules.
  • The processor 701 executes various functional applications and data processing by running the computer programs and modules stored in the memory 702.
  • The memory 702 may mainly include a program storage area and a data storage area, where the program storage area may store the operating system, the computer program required by at least one function (such as a sound playback function, an image playback function, and the like), and the data storage area may store data created according to the use of the electronic device, and the like.
  • In addition, the memory 702 may include high-speed random access memory, and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another volatile solid-state storage device. Accordingly, the memory 702 may also include a memory controller to provide the processor 701 with access to the memory 702.
  • The electronic device further includes a power supply 703 that supplies power to the various components.
  • Preferably, the power supply 703 may be logically connected to the processor 701 through a power management system, so that functions such as charging, discharging, and power consumption management are implemented through the power management system.
  • The power supply 703 may also include one or more DC or AC power supplies, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and other such components.
  • the electronic device may also include an input unit 704 that may be used to receive input numeric or character information and generate keyboard, mouse, joystick, optical, or trackball signal input related to user settings and function control.
  • the electronic device may also include a display unit and the like, which will not be described again here.
  • Specifically, in this embodiment, the processor 701 in the electronic device loads the executable files corresponding to the processes of one or more computer programs into the memory 702 according to the following instructions, and the processor 701 runs the computer programs stored in the memory 702, thereby implementing various functions, for example:
  • acquiring a first preview image of the object through a first camera of the electronic device, and acquiring a second preview image of the object through a second camera of the electronic device; determining, according to the first preview image, the first pixel ratio of the object in the first preview image, and determining, according to the second preview image, the second pixel ratio of the object in the second preview image; performing mapping calculation on the first pixel ratio and the second pixel ratio to obtain the target distance of the object relative to the first camera and the second camera; and driving, according to the target distance, the first camera and the second camera to move to a quasi-focus position.
  • Those of ordinary skill in the art will understand that all or part of the steps in the various methods of the above embodiments can be completed by a computer program, or completed by a computer program controlling related hardware; the computer program can be stored in a computer-readable storage medium and loaded and executed by a processor.
  • To this end, embodiments of the present application provide a computer-readable storage medium in which a computer program is stored, and the computer program can be loaded by a processor to perform the steps in any camera focusing method provided by the embodiments of the present application.
  • For example, the computer program can perform the following steps:
  • acquiring a first preview image of the object through a first camera of the electronic device, and acquiring a second preview image of the object through a second camera of the electronic device; determining, according to the first preview image, the first pixel ratio of the object in the first preview image, and determining, according to the second preview image, the second pixel ratio of the object in the second preview image; performing mapping calculation on the first pixel ratio and the second pixel ratio to obtain the target distance of the object relative to the first camera and the second camera; and driving, according to the target distance, the first camera and the second camera to move to a quasi-focus position (see the end-to-end sketch below).
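Tying the pieces together, the sketch below walks through the program steps just listed: compute the two pixel ratios from the preview images, map them to a target distance, look the distance up in a distance-to-drive-code table, and move both cameras to the quasi-focus position. The drive-code values, the `Camera` interface and the pixel-ratio helpers are assumptions made only for illustration; `acquire_previews` and `predict_distance` reuse the earlier sketches.

```python
# A minimal end-to-end focusing sketch under the stated assumptions.
# Assumes acquire_previews() and predict_distance() from the sketches above
# are in scope.

# hypothetical mapping from object distance (metres) to actuator drive code
DRIVE_TABLE = [(0.08, 900), (0.15, 700), (0.30, 500), (1.0, 300), (5.0, 120)]

def drive_code_for_distance(distance_m: float) -> int:
    """Pick the drive code of the calibrated distance closest to distance_m."""
    return min(DRIVE_TABLE, key=lambda entry: abs(entry[0] - distance_m))[1]

def focus(first_camera, second_camera, curve_coeffs,
          first_pixel_ratio, second_pixel_ratio):
    """first_pixel_ratio / second_pixel_ratio are assumed callables that return
    the fraction of preview pixels occupied by the object."""
    first_preview, second_preview = acquire_previews(first_camera, second_camera)
    a = first_pixel_ratio(first_preview)
    b = second_pixel_ratio(second_preview)
    target_distance = predict_distance(curve_coeffs, a, b)
    code = drive_code_for_distance(target_distance)
    first_camera.move_to(code)       # drive both cameras to the quasi-focus position
    second_camera.move_to(code)
    return target_distance
```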
  • the computer-readable storage medium may include: read-only memory (ROM, Read Only Memory), random access memory (RAM, Random Access Memory), magnetic disk or optical disk, etc.
  • Since the computer program stored in the computer-readable storage medium can perform the steps in any camera focusing method provided by the embodiments of the present application, it can achieve the beneficial effects that can be achieved by any camera focusing method provided by the embodiments of the present application.
  • These beneficial effects are detailed in the previous embodiments and will not be described again here.
  • According to one aspect of the present application, a computer program product or computer program is provided; the computer program product or computer program includes computer instructions, and the computer instructions are stored in a computer-readable storage medium.
  • the processor of the computer device reads the computer instructions from the computer-readable storage medium, and the processor executes the computer instructions, so that the computer device performs the above camera focusing method.

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Signal Processing (AREA)
  • Studio Devices (AREA)
  • Automatic Focus Adjustment (AREA)

Abstract

The present application discloses a camera focusing method and apparatus, an electronic device and a computer-readable storage medium. In the present application, a first pixel ratio of a subject in a first preview image is determined according to the first preview image, and a second pixel ratio of the subject in a second preview image is determined according to the second preview image; mapping calculation is performed on the first pixel ratio and the second pixel ratio to obtain a target distance of the subject relative to a first camera and a second camera; and the first camera and the second camera are driven to move to a quasi-focus position according to the target distance. The present application can achieve a better focusing result.

Description

摄像头对焦方法、装置、电子设备和计算机可读存储介质
本申请要求申请日为2022年07月06日、申请号为202210798330.4、发明名称为“摄像头对焦方法、装置、电子设备和计算机可读存储介质”的中国专利申请的优先权,其全部内容通过引用结合在本申请中。
技术领域
本申请涉及摄像头技术领域,具体涉及一种摄像头对焦方法、装置、电子设备和计算机可读存储介质。
背景技术
随着社会经济的发展,用户越来越喜欢拍照。在拍照之前,镜头需要进行对焦,才能获取到清晰的图像。
目前,在利用两个镜头进行对焦时,需要先确定被拍摄物体相对于镜头的入射角,然后根据入射角进行对焦。然而,该入射角比较难计算,容易出错,导致最终对焦的结果较差。
技术问题
本申请实施例可以解决根据入射角进行对焦时对焦的结果较差的技术问题。
技术解决方案
本申请实施例一种摄像头对焦方法,包括:
通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过上述电子设备的第二摄像头获取上述被摄物体的第二预览图像;
根据上述第一预览图像,确定上述被摄物体在上述第一预览图像中所占的第一像素比例,以及根据上述第二预览图像,确定上述被摄物体在上述第二预览图像中所占的第二像素比例;
对上述第一像素比例和上述第二像素比例进行映射计算,得到上述被摄物体相对于上述第一摄像头和上述第二摄像头的目标距离;
根据上述目标距离,驱动上述第一摄像头和上述第二摄像头移动至准焦位置。
相应地,本申请实施例提供一种摄像头对焦装置,包括:
获取模块,用于通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过上述电子设备的第二摄像头获取上述被摄物体的第二预览图像;
确定模块,用于根据上述第一预览图像,确定上述被摄物体在上述第一预览图像中所占的第一像素比例,以及根据上述第二预览图像,确定上述被摄物体在上述第二预览图像中所占的第二像素比例;
映射模块,用于对上述第一像素比例和上述第二像素比例进行映射计算,得到上述被摄物体相对于上述第一摄像头和上述第二摄像头的目标距离;
驱动模块,用于根据上述目标距离,驱动上述第一摄像头和上述第二摄像头移动至准焦位置。
此外,本申请实施例还提供一种电子设备,包括处理器和存储器,上述存储器存储有计算机程序,上述处理器用于运行上述存储器内的计算机程序实现本申请实施例提供的摄像头对焦方法。
此外,本申请实施例还提供一种计算机可读存储介质,上述计算机可读存储介质存储有计算机程序,上述计算机程序适于处理器进行加载,以执行本申请实施例所提供的任一种摄像头对焦方法。
此外,本申请实施例还提供一种计算机程序产品,包括计算机程序,所述计算机程序被 处理器执行时实现本申请实施例所提供的任一种摄像头对焦方法。
有益效果
在本申请实施例中,先通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过电子设备的第二摄像头获取被摄物体的第二预览图像。然后根据第一预览图像,确定被摄物体在第一预览图像中所占的第一像素比例,以及根据第二预览图像,确定被摄物体在第二预览图像中所占的第二像素比例。接着对第一像素比例和第二像素比例进行映射计算,得到被摄物体相对于第一摄像头和第二摄像头的目标距离。最后根据目标距离,驱动第一摄像头和第二摄像头移动至准焦位置。
即在本申请实施例中,由于被摄物体在第一预览图像中所占的第一像素比例和被摄物体在第二预览图像中所占的第二像素比例比较容易确定,因此,对第一像素比例和第二像素比例进行映射计算,得到被摄物体相对于第一摄像头和第二摄像头的目标距离更加准确,则根据目标距离可以将第一摄像头和第二摄像头驱动至准焦位置,从而达到更好的对焦结果。
附图说明
为了更清楚地说明本申请实施例中的技术方案,下面将对实施例描述中所需要使用的附图作简单地介绍,显而易见地,下面描述中的附图仅仅是本申请的一些实施例,对于本领域技术人员来讲,在不付出创造性劳动的前提下,还可以根据这些附图获得其他的附图。
图1是本申请实施例提供的摄像头对焦方法的流程示意图;
图2是本申请实施例提供的目标距离的示意图;
图3是本申请实施例提供的视场角的示意图;
图4是本申请实施例提供的参考位置和参考像素的示意图;
图5是本申请实施例提供的对焦区域的示意图;
图6是本申请实施例提供的摄像头对焦装置的结构示意图;
图7是本申请实施例提供的电子设备的结构示意图。
本发明的实施方式
下面将结合本申请实施例中的附图,对本申请实施例中的技术方案进行清楚、完整地描述,显然,所描述的实施例仅仅是本申请一部分实施例,而不是全部的实施例。基于本申请中的实施例,本领域技术人员在没有作出创造性劳动前提下所获得的所有其他实施例,都属于本申请保护的范围。
本申请实施例提供一种摄像头对焦方法、装置、电子设备和计算机可读存储介质。其中,该摄像头对焦装置可以集成在电子设备中,该电子设备可以是服务器,也可以是终端等设备。
其中,服务器可以是独立的物理服务器,也可以是多个物理服务器构成的服务器集群或者分布式系统,还可以是提供云服务、云数据库、云计算、云函数、云存储、网络服务、云通信、中间件服务、域名服务、安全服务、网络加速服务(Content Delivery Network,CDN)、以及大数据和人工智能平台等基础云计算服务的云服务器。
终端可以是智能手机、平板电脑、笔记本电脑、台式计算机、智能音箱、智能手表等,但并不局限于此。终端以及服务器可以通过有线或无线通信方式进行直接或间接地连接,本申请在此不做限制。
另外,本申请实施例中的“多个”指两个或两个以上。本申请实施例中的“第一”和“第二”等用于区分描述,而不能理解为暗示相对重要性。
以下分别进行详细说明。需要说明的是,以下实施例的描述顺序不作为对实施例优选顺序的限定。
在本实施例中,为了方便对本申请的摄像头对焦方法进行说明,以下将以电子设备为终端进行详细说明,即以终端作为执行主体进行详细说明。
请参阅图1,图1是本申请一实施例提供的摄像头对焦方法的流程示意图。该摄像头对焦方法可以包括:
S101、通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过电子设备的第二摄像头获取被摄物体的第二预览图像。
终端可以在检测到拍摄客户端被启动时,通过本身的第一摄像头获取被摄物体的第一预览图像,以及通过本身的第二摄像头获取第二预览图像。
或者,也可以是当其他终端检测到拍摄客户端被启动时,通过其他终端的第一摄像头获取被摄物体的第一预览图像,以及通过其他终端的第二摄像头获取第二预览图像,其他终端再将第一预览图像和第二预览图像发送至终端,终端从而获取到第一预览图像和第二预览图像。
对于第一预览图像和第二预览图像的获取方式,可以根据实际情况进行选择,本实施例在此不做限定。
如果被摄物体在第一预览图像中对应的物体区域和在第二预览图像中对应的物体区域太模糊,则较难根据第一预览图像确定第一像素比例以及根据第二预览图像确定第二像素比例。因此,在一些实施例中,通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过电子设备的第二摄像头获取被摄物体的第二预览图像,包括:
获取拍摄启动指令;
根据拍摄启动指令,将电子设备的第一摄像头移动至预设初始位置,以及将电子设备的第二摄像头移动至预设初始位置;
通过位于预设初始位置的第一摄像头获取被摄物体的第一预览图像,以及通过位于预设初始位置的第二摄像头获取被摄物体的第二预览图像。
在本实施例中,在获取第一预览图像和第二预览图像之前,先将第一摄像头和第二摄像头移动至预设初始位置,从而使得在获取到第一预览图像和第二预览图像之后,可以根据被摄物体在第一预览图像中对应的物体区域和第一预览图像,确定第一像素比例,根据被摄物体在第二预览图像中对应的物体区域和第二预览图像,确定第二像素比例,减少出现由于被摄物体在第一预览图像中对应的物体区域和在第二预览图像中对应的物体区域太模糊导致不能确定第一像素比例和第二像素比例的现象。
S102、根据第一预览图像,确定被摄物体在第一预览图像中所占的第一像素比例,以及根据第二预览图像,确定被摄物体在第二预览图像中所占的第二像素比例。
终端可以将被摄物体在第一预览图像中对应的像素的个数除以第一预览图像的总像素的个数,从而得到第一像素比例,将被摄物体在第二预览图像中对应的像素的个数除以第二预览图像的总像素,从而得到第二像素比例。
或者,终端也可以将被摄物体在第一预览图像中对应的水平像素的个数除以第一预览图像的水平像素的总个数,得到第一像素比例,将被摄物体在第二预览图像中对应的水平像素的个数除以第二预览图像的水平像素的总个数,得到第二像素比例。
又或者,终端也可以将被摄物体在第一预览图像中对应的垂直像素的个数除以第一预览图像的垂直像素的总个数,得到第一像素比例,将被摄物体在第二预览图像中对应的垂直像素的个数除以第二预览图像的垂直像素的总个数,得到第二像素比例。
又或者,终端也可以将被摄物体在第一预览图像中对应的对角像素的个数除以第一预览图像的对角像素的总个数,得到第一像素比例,将被摄物体在第二预览图像中对应的对角像素的个数除以第二预览图像的对角像素的总个数,得到第二像素比例。
应理解,第一像素比例可以用于表征被摄物体在第一摄像头的第一视场角中所占的第一 视场角比例,第二像素比例可以用于表征被摄物体在第二摄像头的第二视场角中所占的第二视场角比例。
S103、对第一像素比例和第二像素比例进行映射计算,得到被摄物体相对于第一摄像头和第二摄像头的目标距离。
在得到第一像素比例和第二像素比例之后,终端可以根据第一像素比例和第二像素比例计算得到目标距离。
目标距离指被摄物体与第一摄像头之间的直线距离。比如,目标距离可以如图2所示,图中m对应的取值即为目标距离。由于第一摄像头和第二摄像头位于同一个平面,因此,被摄物体与第二摄像头之间的直线距离和被摄物体与第一摄像头之间的直线距离相同,也即是,目标距离也指被摄物体与第二摄像头之间的直线距离。
在一些实施例中,对第一像素比例和第二像素比例进行映射计算,得到被摄物体相对于第一摄像头和第二摄像头的目标距离,包括:
获取第一摄像头的第一视场角、第二摄像头的第二视场角、以及第一摄像头和第二摄像头之间的第一距离;
根据第一视场角、第二视场角以及第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系,第一像素比例表示第一像素比例参数的取值,第二像素比例表示第二像素比例参数的取值;
将第一像素比例和第二像素比例代入预设映射关系中进行映射计算,得到距离参数对应的距离;
将距离参数对应的距离作为被摄物体相对于第一摄像头和第二摄像头的目标距离。
如果终端通过将被摄物体在第一预览图像中对应的水平像素的个数除以第一预览图像的水平像素的总个数,得到第一像素比例,将被摄物体在第二预览图像中对应的水平像素的个数除以第二预览图像的水平像素的总个数,得到第二像素比例,则第一视场角为第一摄像头的水平视场角,第二视场角为第二摄像头的水平视场角。
也即是,根据第一摄像头的水平视场角、第二摄像头的水平视场角以及第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系,然后将第一像素比例和第二像素比例代入预设映射关系中进行映射计算,得到距离参数对应的距离。
如果终端通过将被摄物体在第一预览图像中对应的垂直像素的个数除以第一预览图像的垂直像素的总个数,得到第一像素比例,将被摄物体在第二预览图像中对应的垂直像素的个数除以第二预览图像的垂直像素的总个数,得到第二像素比例,则第一视场角为第一摄像头的垂直视场角,第二视场角为第二摄像头的垂直视场角。
也即是,根据第一摄像头的垂直视场角、第二摄像头的垂直视场角以及第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系,然后将第一像素比例和第二像素比例代入预设映射关系中进行映射计算,得到距离参数对应的距离。
如果终端通过将被摄物体在第一预览图像中对应的对角像素的个数除以第一预览图像的对角像素的总个数,得到第一像素比例,将被摄物体在第二预览图像中对应的对角像素的个数除以第二预览图像的对角像素的总个数,得到第二像素比例,则第一视场角为第一摄像头的对角视场角,第二视场角为第二摄像头的对角视场角。
也即是,根据第一摄像头的对角视场角、第二摄像头的对角视场角以及第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系,然后将第一像素比例和第二像素比例代入预设映射关系中进行映射计算,得到距离参数对应的距离。
水平视场角、垂直视场角以及对角视场角比如可以如图3所示。
其中,根据第一视场角、第二视场角以及第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系,包括:
对第一视场角进行角度映射,得到第一角度值,以及对第二视场角进行角度映射,得到第二角度值;
根据第一角度值、第二角度值以及第一距离,确定距离参数、边缘参数以及第一像素比例参数之间的第一预设映射关系,边缘参数表征物体与第二摄像头的边缘之间的距离;
根据第二角度值,确定边缘参数、距离参数以及第二像素比例参数之间的第二预设映射关系;
根据第一预设映射关系和第二预设映射关系,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系。
角度映射可以为三角函数映射,三角函数映射可以为正切映射、余弦映射以及正弦映射等。
当角度映射为正切映射时,第一角度值可以为第一视场角的正切值,第二角度值可以为第二视场角的正切值。当角度映射为余弦映射时,第一角度值可以为第一视场角的余弦值,第二角度值可以为第二视场角的余弦值。当角度映射为正弦映射时,第一角度值可以为第一视场角的正弦值,第二角度值可以为第二视场角的正弦值。
当角度映射为正切映射时,第一预设映射关系可以采用关系式(1)进行表示:
第二预设映射关系可以采用关系式(2)进行表示:
根据第一预设映射关系和第二预设映射关系,得到的预设映射关系可以采用关系式(3)表示:
其中,n表示边缘参数,m表示距离参数,表示第一视场角,表示第二视场角,表示第一距离,a表示第一像素比例参数,b表示第二像素比例参数。
下面根据图2,对关系式(1)和关系式(2)进行详细说明,图中BG和n均表示边缘参数,CD和W均表示第一距离,下面采用BG表示边缘参数,采用CD表示第一距离进行说明。G表示被摄物体所在的位置,其中,G可以表示被摄物体的中心,也可以表示被摄物体的左边缘,也可以表示被摄物体的右边缘,本实施例在此不做限定。
先求解第一视场角的正切值:
第二视场角的正切值:
将关系式(4)减去关系式(5),可以得到:
对关系式(6)进行变形,可以得到:
则第一像素比例参数为:
关系式(7)也即为关系式(1)。
第二像素比例参数为:
关系式(8)也即为关系式(2)。
需要说明的是,在获取第一预览图像和第二预览图像之前,也可以先根据第一预设映射关系和第二预设映射关系确定预设映射关系,然后当终端得到第一像素比例和第二像素比例之后,可以直接将第一像素比例和第二像素比例代入预设映射关系中进行映射计算,从而得到目标距离。
由于第一视场角和第二视场角可以为水平视场角,也可以为垂直视场角,也可以为对角视场角,也即是,第一视场角和第二视场角可以包括不同视场角类型的视场角,因此,在另一些实施例中,根据第一视场角、第二视场以及第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系,包括:
根据第一视场角、第二视场以及第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数针对各种视场角类型的预设映射关系;
将第一像素比例和第二像素比例代入预设映射关系中进行映射计算,得到距离参数对应的距离,包括:
确定第一像素比例或第二像素比例的目标视场角类型;
将第一像素比例和第二像素比例代入与目标视场角类型匹配的视场角类型对应的预设映射关系中进行映射计算,得到距离参数对应的距离。
应理解,各种视场角类型对应的预设映射关系的表现形式可以是相同的,均可以采用关系式(3)进行表示,只是关系式(3)中的参数在各种视场角类型中对应的取值不相同。
比如,当第一视场角和第二视场角为水平视场角时,关系式(3)中的为第一摄像头的水平视场角的正切值,关系式(3)中的为第二摄像头的水平视场角的正切值。当第一视场角和第二视场角为垂直视场角时,关系式(3)中的为第一摄像头的垂直视场角的正切值,关系式(3)中的为第二摄像头的垂直视场角的正切值。
第一像素比例的目标视场角类型和第二像素比例的目标视场角类型相同,比如,当第一像素比例为水平视场角的像素比例时,第二像素比例也为水平视场角的像素比例,当第一像 素比例为垂直视场角的像素比例时,第二像素比例也为垂直视场角的像素比例。
所以,可以通过确定第一像素比例的目标视场角类型或第二像素比例的目标视场角类型,然后将第一像素比例和第二像素比例代入与目标视场角类型相同的视场角类型对应的预设映射关系中进行映射计算,从而得到目标距离。
为了更加准确地得到目标距离,也可以更加多种视场角类型的预设映射关系同时计算目标距离,此时,第一像素比例包括多种视场角类型的第一像素比例,第二像素比例包括多种视场角类型的第二像素比例;
将第一像素比例和第二像素比例代入与目标视场角类型匹配的视场角类型对应的预设映射关系中进行映射计算,得到距离参数对应的距离,包括:
将目标视场角类型的第一像素比例和第二像素比例代入与目标视场角类型匹配的视场角类型对应的预设映射关系中进行映射计算,得到距离参数针对各种视场角类型对应的第一目标距离;
根据各个第一目标距离,确定距离参数对应的距离。
将被摄物体在第一预览图像中对应的水平像素的个数除以第一预览图像的水平像素的总个数,得到的第一像素比例为第一摄像头的水平视场角类型(水平视场角类型对应的视场角为水平视场角)对应的第一像素比例,将被摄物体在第二预览图像中对应的水平像素的个数除以第二预览图像的水平像素的总个数,得到的第二像素比例为第二摄像头的水平视场角类型对应的第二像素比例。
将被摄物体在第一预览图像中对应的垂直像素的个数除以第一预览图像的垂直像素的总个数,得到的第一像素比例为第一摄像头的垂直视场角类型对应的第一像素比例,将被摄物体在第二预览图像中对应的垂直像素的个数除以第二预览图像的垂直像素的总个数,得到的第二像素比例为第二摄像头的垂直视场角类型对应的第二像素比例。
将被摄物体在第一预览图像中对应的对角像素的个数除以第一预览图像的对角像素的总个数,得到的第一像素比例为第一摄像头的对角视场角类型对应的第一像素比例,将被摄物体在第二预览图像中对应的对角像素的个数除以第二预览图像的对角像素的总个数,得到的第二像素比例为第二摄像头的对角视场角类型对应的第二像素比例。
终端在得到各种视场角类型的第一像素比例和第二像素比例之后,将目标视场角类型的第一像素比例和第二像素比例代入与目标视场角类型匹配的视场角类型对应的预设映射关系中进行映射计算,得到第一目标距离。
比如,当第一像素比例为第一摄像头的水平视场角类型对应的第一像素比例,第二像素比例为第二摄像头的水平视场角类型对应的第二像素比例时,目标视场角类型为水平视场角类型,则将第一摄像头的水平视场角类型对应的第一像素比例和第二摄像头的水平视场角类型对应的第二像素比例代入水平视场角类型对应的预设映射关系中进行映射计算,得到第一目标距离。
当第一像素比例为第一摄像头的垂直视场角类型对应的第一像素比例,第二像素比例为第二摄像头的垂直视场角类型对应的第二像素比例时,目标视场角类型为垂直视场角类型,则将第一摄像头的垂直视场角类型对应的第一像素比例和第二摄像头的垂直视场角类型对应的第二像素比例代入垂直视场角类型对应的预设映射关系中进行映射计算,得到第一目标距离。
当第一像素比例为第一摄像头的对角视场角类型对应的第一像素比例,第二像素比例为第二摄像头的对角视场角类型对应的第二像素比例时,目标视场角类型为对角视场角类型,则将第一摄像头的对角视场角类型对应的第一像素比例和第二摄像头的对角视场角类型对应的第二像素比例代入对角视场角类型对应的预设映射关系中进行映射计算,得到第一目标距离。
得到三个第一目标距离之后,再根据三个第一目标距离,计算目标距离。
其中,根据各个第一目标距离,确定距离参数对应的距离,包括:
确定各个第一目标距离的平均值;
将平均值作为距离参数对应的距离。
或者,根据各个第一目标距离,确定距离参数对应的距离,也可以包括:
获取各个第一目标距离对应的权重系数;
将第一目标距离与目标距离对应的权重系数相乘,得到相乘结果;
将各个相乘结果的平均值,作为距离参数对应的距离。
在本实施例中,先根据多种视场角类型的第一像素比例和第二像素比例,计算第一目标距离,然后再根据第一目标距离计算目标距离,从而使得得到的目标距离更加准确。
在另一些实施例中,对第一像素比例和第二像素比例进行映射计算,得到被摄物体相对于第一摄像头和第二摄像头的目标距离,包括:
获取预设映射列表,预设映射列表关联存储了第一标定像素比例、第二标定像素比例以及标定距离;
将第一像素比例和第二像素比例与预设映射表中的第一标定像素比例和第二标定像素比例进行匹配;
将与第一像素比例匹配的第一标定像素比例,且与第二像素比例匹配的第二标定像素比例对应的标定距离,作为被摄物体相对于第一摄像头和第二摄像头的目标距离。
与第一像素比例匹配的第一标定像素比例,可以指与第一像素比例之间的差值最小的第一标定像素比例,与第二像素比例匹配的第二标定像素比例,可以指与第二像素比例之间的差值最小的第二标定像素比例。
第一标定像素比例为根据位于标定距离所在位置上的测试物体的第一标定图像得到的像素比例,第二标定像素比例为根据位于标定距离所在位置上的测试物体的第二标定图像得到的像素比例。
在本实施例中,由于预设映射列表中关联存储了第一标定像素比例、第二标定像素比例和标定距离,因此,在获取到第一像素比例和第二像素比例之后,可以根据第一像素比例和第二像素比例,从预设映射列表中查找到与第一像素比例匹配的第一标定像素比例,且与第二像素比例匹配的第二标定像素比例对应的标定距离。
在另一些实施例中,将第一像素比例和第二像素比例代入拟合曲线中进行拟合,得到拟合距离,拟合曲线为第一标定像素比例、第二标定像素比例以及标定距离之间的曲线;
将拟合距离作为被摄物体相对于第一摄像头和第二摄像头之间的目标距离。
在本实施例将,预先对第一标定像素比例、第二标定像素比例以及标定距离进行拟合,得到拟合曲线,然后终端在得到第一像素比例和第二像素比例之后,将第一像素比例和第二像素比例代入拟合曲线中,即可得到目标距离。
由于被摄物体在第一预览图像中对应的物体区域可能相对于第一预览图像上参考位置存在偏转角度。比如,如图4所示,将第一预览图像的左边缘作为参考位置,被摄物体在第一预览图像中对应的物体区域相对于第一预览图像的左边缘存在偏转角度。
因此,为了更加准确地得到目标距离,在另一些实施例中,拟合曲线包括不同的标定偏转角度的拟合曲线;
将第一像素比例和第二像素比例代入拟合曲线中进行拟合,得到拟合距离,包括:
获取被摄物体在第一预览图像中对应的物体区域,相对于第一预览图像上参考位置的偏转角度;
从多个标定偏转角度的拟合曲线中,筛选出与偏转角度匹配的标定偏转角度对应的拟合曲线,得到第一目标拟合曲线;
将第一像素比例和第二像素比例代入第一目标拟合曲线中进行拟合,得到拟合距离。
在本实施例中,一个标定偏转角度对应一条拟合曲线,从而使得可以更加准确地得到拟合距离,进而更加准确地得到目标距离。
与偏转角度匹配的标定偏转角,可以指多个标定偏转角度中,与偏转角度之间的差值最小的标定偏转角度。筛选出与偏转角度匹配的标定偏转角度对应的拟合曲线之后,将与偏转角度匹配的标定偏转角度对应的拟合曲线作为第一目标拟合曲线。
需要说明的是,终端也可以获取被摄物体在第二预览图像中对应的物体区域,相对于第二预览图像上参考位置的偏转角度,然后再筛选出与相对于第二预览图像上参考位置的偏转角度匹配的标定偏转角度。
其中,获取被摄物体在第一预览图像中对应的物体区域,相对于第一预览图像上参考位置的偏转角度,可以包括:
确定被摄物体在第一预览图像中对应的物体区域的同一边缘上两个参考像素;
确定两个参考像素到参考位置的距离;
若两个参考像素到参考位置的距离相同,则被摄物体在第一预览图像中对应的物体区域,相对于第一预览图像上参考位置的偏转角度为零;
若两个参考像素到参考位置的距离不相同,则根据两个参考像素到参考位置的距离,确定被摄物体在第一预览图像中对应的物体区域,相对于第一预览图像上参考位置的偏转角度。
比如,如图4所示,同一边缘上任意两个参考像素可以分别为像素1和像素2,像素1和像素2到参考位置的距离不相同。对于参考像素和参考位置,可以根据实际情况进行设置,本实施例在此不做限定。
在另一些实施例中,拟合曲线包括不同的标定对焦区域的拟合曲线;
将第一像素比例和第二像素比例代入拟合曲线中进行拟合,得到拟合距离,包括:
确定第一预览图像中的对焦区域;
从多个标定对焦区域的拟合曲线中,筛选出与对焦区域匹配的标定对焦区域的拟合曲线,得到第二目标拟合曲线;
将第一像素比例和第二像素比例代入第二目标拟合曲线中进行拟合,得到拟合距离。
第一预览图像中的对焦区域,可以包括被摄物体在第一预览图像中对应的物体区域。与对焦区域匹配的标定对焦区域可以指,与对焦区域重合面积最大的标定对焦区域。应理解,终端也可以确定第二预览图像中的对焦区域,然后根据第二预览图像中的对焦区域,筛选出第二目标拟合曲线。
在本实施例中,一个标定对焦区域对应一条拟合曲线,从而使得可以更加准确地得到拟合距离,进而更加准确地得到目标距离。
在另一些实施例中,拟合曲线包括不同的标定视场角类型的拟合曲线;将第一像素比例和第二像素比例代入拟合曲线中进行拟合,得到拟合距离,包括:
确定第一像素比例或第二像素比例的目标视场角类型;
从多个标定视场角类型的拟合曲线中筛选出与目标视场角类型匹配的标定视场角类型对应的拟合曲线,得到第三目标拟合曲线;
将第一像素比例和第二像素比例代入第三目标拟合曲线中进行拟合,得到拟合距离。
由于第一像素比例和第二像素比例的视场角类型不同时,拟合曲线也存在区别,因此,在本实施例中,为了更加准确地得到目标距离,一种视场角类型对应一条拟合曲线。然后获取到第一像素比例和第二像素比例之后,终端根据第一像素比例或第二像素比例的目标视场角类型从拟合曲线中筛选出第三目标拟合曲线,再将第一像素比例和第二像素比例代入第三拟合曲线中进行拟合,得到拟合距离,以便提高拟合距离的准确性。
应理解,拟合曲线可以为在电子设备出厂之前,预先存储在电子设备中的曲线,也即是, 在电子设备出厂之前,先对电子设备的第一摄像头和第二摄像头进行标定,从而得到拟合曲线。
所以,在另一些实施例中,在将第一像素比例和第二像素比例代入拟合曲线中进行拟合,得到拟合距离之前,还包括:
获取多个标定距离;
通过第一摄像头获取位于每个标定距离所在位置上的测试物体的第一标定图像,以及通过第二摄像头获取位于每个标定距离所在位置上的测试物体的第二标定图像;
根据第一标定图像,确定测试物体在第一标定图像中所占的第一标定像素比例,以及根据第二标定图像,确定测试物体在第二标定图像中所占的第二标定像素比例;
根据每个标定距离对应的第一标定像素比例、第二标定像素比例以及每个标定距离进行拟合,得到第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线。
终端可以先获取测试距离,然后从测试距离中筛选出标定距离,并通过第一摄像头获取位于标定距离所在位置上的测试物体的第一标定图像,通过第二摄像头获取标定距离所在位置上的测试物体的第二标定图像,然后终端返回执行从测试距离中筛选出标定距离的步骤,直至测试距离被筛选完毕停止执行,从而得到各个标定距离对应的第一标定图像和第二标定图像。
比如,测试距离包括8cm、15cm、30cm、1m以及5m,则终端可以先从测试距离中筛选出标定距离8cm,然后通过第一摄像头获取8cm处的测试物体的第一标定图像,通过第二摄像头获取8cm处的测试物体的第二标定图像。接着,终端可以从测试距离中筛选出标定距离15cm,然后通过第一摄像头获取15cm处的测试物体的第一标定图像,通过第二摄像头获取15cm处的测试物体的第二标定图像,直至获取到5m处的测试物体的第一标定图像和5m处的测试物体的第二标定图像。
终端得到每个标定距离对应的第一标定图像和第二标定图像之后,根据每个标定距离对应的第一标定图像和第二标定图像,确定每个标定距离对应的第一标定像素比例和第二标定像素比例,最后,终端再根据每个标定距离对应的第一标定像素比例、第二标定像素比例以及每个标定距离进行拟合,得到第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线。
确定第一标定像素比例和第二标定像素比例的方法,可以参照确定第一像素比例和第二像素比例的方法,本实施例在此不再赘述。
在另一些实施例中,根据第一标定图像,确定测试物体在第一摄像头的视场角中所占的第一标定像素比例,以及根据第二标定图像,确定测试物体在第二摄像头的视场角中所占的第二标定像素比例,包括:
对第一标定图像进行划分,得到第一标定图像对应的各个第一标定对焦区域,以及对第二标定图像进行划分,得到第二标定图像对应的各个第二标定对焦区域;
根据每个第一标定对焦区域,确定测试物体在每个第一标定对焦区域中所占的第一标定像素比例,以及根据每个第二标定对焦区域,确定测试物体在每个第二标定对焦区域中所占的第二标定像素比例;
根据每个标定距离对应的第一标定像素比例、第二标定像素比例以及每个标定距离进行拟合,得到第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线,包括:
根据每个第一标定对焦区域在各个标定距离上对应的第一标定像素比例,每个第二标定对焦区域在各个标定距离上对应的第二标定像素比例以及各个标定距离进行拟合,得到针对每个第一标定对焦区域和第二标定对焦区域,第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线。
比如,标定距离为5cm、30cm和1m。可以将第一标定图像和第二标定图像划分为8*6 个区域,如图5所示。根据5cm处的第一标定图像,确定第一标定对焦区域a在5cm处的第一标定像素比例a1,根据5cm处的第二标定图像,确定第二标定对焦区域b在5cm处的第二标定像素比例b1。
根据30cm处的第一标定图像,确定第一标定对焦区域a在30cm处的第一标定像素比例a2,根据30cm处的第二标定图像,确定第二标定对焦区域b在30cm处的第二标定像素比例b2。根据1m处的第一标定图像,确定第一标定对焦区域a在1m处的第一标定像素比例a3,根据1m处的第二标定图像,确定第二标定对焦区域b在1m处的第二标定像素比例b3。
则针对第一标定对焦区域a和第二标定对焦区域b,根据第一标定像素比例a1、第二标定像素比例b1、5cm、第一标定像素比例a2、第二标定像素比例b2、30cm、第一标定像素比例a3、第二标定像素比例b3以及1m进行拟合,从而得到针对第一标定对焦区域a和第二标定对焦区域b,第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线,也即是,得到第一标定对焦区域a和第二标定对焦区域b对应的拟合曲线。
测试物体可以为有重复图案的图卡,比如,测试物体可以为背白格子的图卡。在获取第一标定图像和第二标定图像的过程中,测试物体可以布满第一标定图像和第二标定图像。
在另一些实施例中,第一标定图像包括测试物体针对每个标定距离的不同的标定偏转角度的第一标定图像,第二标定图像包括测试物体针对每个标定距离的不同的标定偏转角度的第二标定图像;
根据第一标定图像,确定测试物体在第一标定图像中所占的第一标定像素比例,以及根据第二标定图像,确定测试物体在第二标定图像中所占的第二标定像素比例,包括:
根据标定偏转角度的第一标定图像,确定测试物体针对标定偏转角度在第一标定图像中所占的第一标定像素比例,以及根据标定偏转角度的第二标定图像,确定测试物体针对标定偏转角度在第二标定图像中所占的第二标定像素比例;
根据每个标定距离对应的第一标定像素比例、第二标定像素比例以及每个标定距离进行拟合,得到第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线,包括:
根据每个标定偏转角度在各个标定距离上对应的第一标定像素比例,每个标定偏转角度在各个标定距离上对应的第二标定像素比例以及各个标定距离进行拟合,得到针对每个标定偏转角度,第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线。
终端可以先获取测试距离,然后从测试距离中筛选出标定距离,并通过第一摄像头获取位于标定距离所在位置上的测试物体的第一标定图像,通过第二摄像头获取标定距离所在位置上的测试物体的第二标定图像,然后终端将测试物体沿X轴方向、Y轴方向以及Z轴方向中的至少一个方向旋转标定旋转角度,继续通过第一摄像头获取位于标定距离所在位置上的测试物体的第一标定图像,通过第二摄像头获取标定距离所在位置上的测试物体的第二标定图像,直至获取到标定距离的所有标定旋转角度对应的第一标定图像和第二标定图像,终端再返回执行从测试距离中筛选出标定距离的步骤,直至测试距离被筛选完毕停止执行,从而得到各个标定距离针对各个标定旋转角度对应的第一标定图像和第二标定图像。
比如,标定距离为5cm和30cm,标定旋转角度为0°、30°和50°。通过第一摄像头获取5cm处的标定旋转角度为0°的测试物体的第一标定图像c1,通过第二摄像头获取5cm处的标定旋转角度为0°的测试物体的第二标定图像d1。通过第一摄像头获取5cm处的标定旋转角度为30°的测试物体的第一标定图像c11,通过第二摄像头获取5cm处的标定旋转角度为30°的测试物体的第二标定图像d11。通过第一摄像头获取5cm处的标定旋转角度为60°的测试物体的第一标定图像c111,通过第二摄像头获取5cm处的标定旋转角度为60°的测试物体的第二标定图像d111。
然后再通过第一摄像头获取30cm处的标定旋转角度为0°的测试物体的第一标定图像 c2,通过第二摄像头获取30cm处的标定旋转角度为0°的测试物体的第二标定图像d2。通过第一摄像头获取30cm处的标定旋转角度为30°的测试物体的第一标定图像c21,通过第二摄像头获取30cm处的标定旋转角度为30°的测试物体的第二标定图像d21。通过第一摄像头获取30cm处的标定旋转角度为60°的测试物体的第一标定图像c211,通过第二摄像头获取30cm处的标定旋转角度为60°的测试物体的第二标定图像d211。
接着,终端根据第一标定图像c1和第一标定图像c2,确定测试物体针对标定旋转角度0°在5cm的第一标定像素比例v1和在30cm的第一标定像素比例v2,根据第二标定图像d1和第二标定图像d2,确定测试物体针对标定旋转角度0°在5cm的第二标定像素比例y1和在30cm的第二标定像素比例y2。
终端根据第一标定图像c11和第一标定图像c21,确定测试物体针对标定旋转角度30°在5cm的第一标定像素比例v11和在30cm的第一标定像素比例v21,根据第二标定图像d1和第二标定图像d2,确定测试物体针对标定旋转角度30°在5cm的第二标定像素比例y11和在30cm的第二标定像素比例y21。
终端根据第一标定图像c111和第一标定图像c211,确定测试物体针对标定旋转角度60°在5cm的第一标定像素比例v111和在30cm的第一标定像素比例v211,根据第二标定图像d1和第二标定图像d2,确定测试物体针对标定旋转角度60°在5cm的第二标定像素比例y111和在30cm的第二标定像素比例y211。
则针对标定旋转角度0°,根据第一标定像素比例v1、第二标定像素比例y1、5cm、第一标定像素比例v2、第二标定像素比例y2以及30cm进行拟合,得到标定旋转角度0°对应的拟合曲线。
针对标定旋转角度30°,根据第一标定像素比例v11、第二标定像素比例y11、5cm、第一标定像素比例v21、第二标定像素比例y21以及30cm进行拟合,得到针对标定偏转角度30°,第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线,也即是,得到标定旋转角度30°对应的拟合曲线。
针对标定旋转角度60°,根据第一标定像素比例v111、第二标定像素比例y111、5cm、第一标定像素比例v211、第二标定像素比例y211以及30cm进行拟合,得到针对标定偏转角度60°,第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线,也即是,得到标定旋转角度60°对应的拟合曲线。
在另一些实施例中,根据第一标定图像,确定测试物体在第一标定图像中所占的第一标定像素比例,以及根据第二标定图像,确定测试物体在第二标定图像中所占的第二标定像素比例,包括:
根据第一标定图像,确定测试物体针对不同标定视场角类型,在第一标定图像中所占的第一标定像素比例,以及根据第二标定图像,确定测试物体针对不同标定视场角类型,在第二标定图像中所占的第二标定像素比例;
根据每个标定距离对应的第一标定像素比例、第二标定像素比例以及每个标定距离进行拟合,得到第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线,包括:
根据标定视场角类型在各个标定距离对应的第一标定像素比例、第二标定像素比例以及各个标定距离进行拟合,得到针对每种标定视场角类型,第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线。
比如,标定距离为5cm和30cm,标定视场角类型包括水平视场角类型和垂直视场角类型。根据5cm处的第一标定图像,计算测试物体针对水平视场角类型的第一标定像素比例1和测试物体针对垂直视场角类型的第一标定像素比例2,根据5cm处的第二标定图像,测试物体针对水平视场角类型的第二标定像素比例3和测试物体针对垂直视场角类型的第二标定像素比例4。
根据30cm处的第一标定图像,计算测试物体针对水平视场角类型的第一标定像素比例5和测试物体针对垂直视场角类型的第一标定像素比例6,根据30cm处的第二标定图像,测试物体针对水平视场角类型的第二标定像素比例7和测试物体针对垂直视场角类型的第二标定像素比例8。
根据水平视场角类型对应的第一标定像素比例1、第二标定像素比例3、5cm、第一标定像素比例5、第二标定像素比例7以及30cm进行拟合,得到针对水平视场角类型,第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线,也即是,得到水平视场角类型对应的拟合曲线。
根据垂直视场角类型对应的第一标定像素比例2、第二标定像素比例4、5cm、第一标定像素比例6、第二标定像素比例8以及30cm进行拟合,得到针对垂直视场角类型,第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线,也即是,得到垂直视场角类型对应的拟合曲线。
S104、根据目标距离,驱动第一摄像头和第二摄像头移动至准焦位置。
终端得到目标距离之后,可以查找目标距离对应的驱动电流,然后根据驱动电流,驱动第一摄像头和第二摄像头移动至准焦位置。准焦位置指可以获取到清晰图像时第一摄像头和第二摄像头的位置。
应理解,终端在查找到目标距离对应的驱动电流之后,如果根据驱动电流,发现第一摄像头和第二摄像头已经位于准焦位置,则无需再根据确定电流驱动第一摄像头和第二摄像头进行移动。
在通过相位对焦(phase detection auto focus,PDAF)时,对摄像头的传感器存在较高的要求,一些终端不能通过相对检测进行对焦。当通过飞行时差测距(Time of Flight,TOF)对焦时,需要增加成本,而在本实施例中,通过第一摄像头和第二摄像头即可实现对焦,便于实现,且成本较低。
由以上可知,在本申请实施例中,先通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过电子设备的第二摄像头获取被摄物体的第二预览图像。然后根据第一预览图像,确定被摄物体在第一预览图像中所占的第一像素比例,以及根据第二预览图像,确定被摄物体在第二预览图像中所占的第二像素比例。接着对第一像素比例和第二像素比例进行映射计算,得到被摄物体相对于第一摄像头和第二摄像头的目标距离。最后根据目标距离,驱动第一摄像头和第二摄像头移动至准焦位置。
即在本申请实施例中,由于被摄物体在第一预览图像中所占的第一像素比例和被摄物体在第二预览图像中所占的第二像素比例比较容易确定,因此,对第一像素比例和第二像素比例进行映射计算,得到被摄物体相对于第一摄像头和第二摄像头的目标距离更加准确,则根据目标距离可以将第一摄像头和第二摄像头驱动至准焦位置,从而达到更好的对焦结果。
为便于更好的实施本申请实施例提供的摄像头对焦方法,本申请实施例还提供一种基于上述摄像头对焦方法的装置。其中名词的含义与上述摄像头对焦方法中相同,具体实现细节可以参考方法实施例中的说明。
例如,如图6所示,该摄像头对焦装置可以包括:
获取模块601,用于通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过电子设备的第二摄像头获取被摄物体的第二预览图像;
确定模块602,用于根据第一预览图像,确定被摄物体在第一预览图像中所占的第一像素比例,以及根据第二预览图像,确定被摄物体在第二预览图像中所占的第二像素比例;
映射模块603,用于对第一像素比例和第二像素比例进行映射计算,得到被摄物体相对于第一摄像头和第二摄像头的目标距离;
驱动模块604,用于根据目标距离,驱动第一摄像头和第二摄像头移动至准焦位置。
可选地,映射模块603具体用于执行:
获取第一摄像头的第一视场角、第二摄像头的第二视场角、以及第一摄像头和第二摄像头之间的第一距离;
根据第一视场角、第二视场角以及第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系,第一像素比例表示第一像素比例参数的取值,第二像素比例表示第二像素比例参数的取值;
将第一像素比例和第二像素比例代入预设映射关系中进行映射计算,得到距离参数对应的距离;
将距离参数对应的距离作为被摄物体相对于第一摄像头和第二摄像头的目标距离。
可选地,映射模块603具体用于执行:
对第一视场角进行角度映射,得到第一角度值,以及对第二视场角进行角度映射,得到第二角度值;
根据第一角度值、第二角度值以及第一距离,确定距离参数、边缘参数以及第一像素比例参数之间的第一预设映射关系,边缘参数表征物体与第二摄像头的边缘之间的距离;
根据第二角度值,确定边缘参数、距离参数以及第二像素比例参数之间的第二预设映射关系;
根据第一预设映射关系和第二预设映射关系,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系。
可选地,映射模块603具体用于执行:
根据第一视场角、第二视场以及第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数针对各种视场角类型的预设映射关系;
确定第一像素比例或第二像素比例的目标视场角类型;
将第一像素比例和第二像素比例代入与目标视场角类型匹配的视场角类型对应的预设映射关系中进行映射计算,得到距离参数对应的距离。
可选地,第一像素比例包括多种视场角类型的第一像素比例,第二像素比例包括多种视场角类型的第二像素比例。
相应地,映射模块603具体用于执行:
将目标视场角类型的第一像素比例和第二像素比例代入与目标视场角类型匹配的视场角类型对应的预设映射关系中进行映射计算,得到距离参数针对各种视场角类型对应的第一目标距离;
根据各个第一目标距离,确定距离参数对应的距离。
可选地,映射模块603具体用于执行:
确定各个第一目标距离的平均值;
将平均值作为距离参数对应的距离。
可选地,映射模块603具体用于执行:
获取预设映射列表,预设映射列表关联存储了第一标定像素比例、第二标定像素比例以及标定距离;
将第一像素比例和第二像素比例与预设映射表中的第一标定像素比例和第二标定像素比例进行匹配;
将与第一像素比例匹配的第一标定像素比例,且与第二像素比例匹配的第二标定像素比例对应的标定距离,作为被摄物体相对于第一摄像头和第二摄像头的目标距离。
可选地,映射模块603具体用于执行:
将第一像素比例和第二像素比例代入拟合曲线中进行拟合,得到拟合距离,拟合曲线为第一标定像素比例、第二标定像素比例以及标定距离之间的曲线;
将拟合距离作为被摄物体相对于第一摄像头和第二摄像头的目标距离。
可选地,拟合曲线包括不同的标定偏转角度的拟合曲线。
相应地,映射模块603具体用于执行:
获取被摄物体在所述第一预览图像中对应的物体区域,相对于第一预览图像上参考位置的偏转角度;
从多个标定偏转角度的拟合曲线中,筛选出与偏转角度匹配的标定偏转角度对应的拟合曲线,得到第一目标拟合曲线;
将第一像素比例和第二像素比例代入第一目标拟合曲线中进行拟合,得到拟合距离。
可选地,拟合曲线包括不同的标定对焦区域的拟合曲线。
相应地,映射模块603具体用于执行:
确定第一预览图像中的对焦区域;
从多个标定对焦区域的拟合曲线中,筛选出与对焦区域匹配的标定对焦区域的拟合曲线,得到第二目标拟合曲线;
将第一像素比例和第二像素比例代入第二目标拟合曲线中进行拟合,得到拟合距离。
可选地,拟合曲线包括不同的标定视场角类型的拟合曲线。
相应地,映射模块603具体用于执行:
确定第一像素比例或第二像素比例的目标视场角类型;
从多个标定视场角类型的拟合曲线中筛选出与目标视场角类型匹配的标定视场角类型对应的拟合曲线,得到第三目标拟合曲线;
将第一像素比例和第二像素比例代入第三目标拟合曲线中进行拟合,得到拟合距离。
可选地,该摄像头对焦装置还包括:
标定模块,用于执行:
获取多个标定距离;
通过第一摄像头获取位于每个标定距离所在位置上的测试物体的第一标定图像,以及通过第二摄像头获取位于每个标定距离所在位置上的测试物体的第二标定图像;
根据第一标定图像,确定测试物体在第一标定图像中所占的第一标定像素比例,以及根据第二标定图像,确定测试物体在第二标定图像中所占的第二标定像素比例;
根据每个标定距离对应的第一标定像素比例、第二标定像素比例以及每个标定距离进行拟合,得到第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线。
可选地,标定模块具体用于执行:
对第一标定图像进行划分,得到第一标定图像对应的各个第一标定对焦区域,以及对第二标定图像进行划分,得到第二标定图像对应的各个第二标定对焦区域;
根据每个第一标定对焦区域,确定测试物体在每个第一标定对焦区域中所占的第一标定像素比例,以及根据每个第二标定对焦区域,确定测试物体在每个第二标定对焦区域中所占的第二标定像素比例;
根据每个第一标定对焦区域在各个标定距离上对应的第一标定像素比例,每个第二标定对焦区域在各个标定距离上对应的第二标定像素比例以及各个标定距离进行拟合,得到针对每个第一标定对焦区域和第二标定对焦区域,第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线。
可选地,第一标定图像包括测试物体针对每个标定距离的不同的标定偏转角度的第一标定图像,第二标定图像包括测试物体针对每个标定距离的不同的标定偏转角度的第二标定图像
相应地,标定模块具体用于执行:
根据标定偏转角度的第一标定图像,确定测试物体针对标定偏转角度在第一标定图像中 所占的第一标定像素比例,以及根据标定偏转角度的第二标定图像,确定测试物体针对标定偏转角度在第二标定图像中所占的第二标定像素比例;
根据每个标定偏转角度在各个标定距离上对应的第一标定像素比例,每个标定偏转角度在各个标定距离上对应的第二标定像素比例以及各个标定距离进行拟合,得到针对每个标定偏转角度,第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线。
可选地,标定模块具体用于执行:
根据第一标定图像,确定测试物体针对不同标定视场角类型,在第一标定图像中所占的第一标定像素比例,以及根据第二标定图像,确定测试物体针对不同标定视场角类型,在第二标定图像中所占的第二标定像素比例;
根据每个标定距离对应的第一标定像素比例、第二标定像素比例以及每个标定距离进行拟合,得到第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线,包括:
根据每种标定视场角类型在各个标定距离对应的第一标定像素比例、第二标定像素比例以及各个标定距离进行拟合,得到针对每种标定视场角类型,第一标定像素比例、第二标定像素比例以及标定距离之间的拟合曲线。
可选地,获取模块601具体用于执行:
获取拍摄启动指令;
根据拍摄启动指令,将电子设备的第一摄像头移动至预设初始位置,以及将电子设备的第二摄像头移动至预设初始位置;
通过位于预设初始位置的第一摄像头获取被摄物体的第一预览图像,以及通过位于预设初始位置的第二摄像头获取被摄物体的第二预览图像。
具体实施时,以上各个模块可以作为独立的实体来实现,也可以进行任意组合,作为同一或若干个实体来实现,以上各个模块的具体实施方式以及对应的有益效果可参见前面的方法实施例,在此不再赘述。
本申请实施例还提供一种电子设备,该电子设备可以是服务器或终端等,如图7所示,其示出了本申请实施例所涉及的电子设备的结构示意图,具体来讲:
该电子设备可以包括一个或者一个以上处理核心的处理器701、一个或一个以上计算机可读存储介质的存储器702、电源703和输入单元704等部件。本领域技术人员可以理解,图7中示出的电子设备结构并不构成对电子设备的限定,可以包括比图示更多或更少的部件,或者组合某些部件,或者不同的部件布置。其中:
处理器701是该电子设备的控制中心,利用各种接口和线路连接整个电子设备的各个部分,通过运行或执行存储在存储器702内的计算机程序和/或模块,以及调用存储在存储器702内的数据,执行电子设备的各种功能和处理数据。可选的,处理器701可包括一个或多个处理核心;优选的,处理器701可集成应用处理器和调制解调处理器,其中,应用处理器主要处理操作系统、用户界面和应用程序等,调制解调处理器主要处理无线通信。可以理解的是,上述调制解调处理器也可以不集成到处理器701中。
存储器702可用于存储计算机程序以及模块,处理器701通过运行存储在存储器702的计算机程序以及模块,从而执行各种功能应用以及数据处理。存储器702可主要包括存储程序区和存储数据区,其中,存储程序区可存储操作系统、至少一个功能所需的计算机程序(比如声音播放功能、图像播放功能等)等;存储数据区可存储根据电子设备的使用所创建的数据等。此外,存储器702可以包括高速随机存取存储器,还可以包括非易失性存储器,例如至少一个磁盘存储器件、闪存器件、或其他易失性固态存储器件。相应地,存储器702还可以包括存储器控制器,以提供处理器701对存储器702的访问。
电子设备还包括给各个部件供电的电源703,优选的,电源703可以通过电源管理系统与处理器701逻辑相连,从而通过电源管理系统实现管理充电、放电、以及功耗管理等功能。 电源703还可以包括一个或一个以上的直流或交流电源、再充电系统、电源故障检测电路、电源转换器或者逆变器、电源状态指示器等任意组件。
该电子设备还可包括输入单元704,该输入单元704可用于接收输入的数字或字符信息,以及产生与用户设置以及功能控制有关的键盘、鼠标、操作杆、光学或者轨迹球信号输入。
尽管未示出,电子设备还可以包括显示单元等,在此不再赘述。具体在本实施例中,电子设备中的处理器701会按照如下的指令,将一个或一个以上的计算机程序的进程对应的可执行文件加载到存储器702中,并由处理器701来运行存储在存储器702中的计算机程序,从而实现各种功能,比如:
通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过电子设备的第二摄像头获取被摄物体的第二预览图像;
根据第一预览图像,确定被摄物体在第一预览图像中所占的第一像素比例,以及根据第二预览图像,确定被摄物体在第二预览图像中所占的第二像素比例;
对第一像素比例和第二像素比例进行映射计算,得到被摄物体相对于第一摄像头和第二摄像头的目标距离;
根据目标距离,驱动第一摄像头和第二摄像头移动至准焦位置。
以上各个操作的具体实施方式以及对应的有益效果可参见上文对摄像头对焦方法的详细描述,在此不作赘述。
本领域普通技术人员可以理解,上述实施例的各种方法中的全部或部分步骤可以通过计算机程序来完成,或通过计算机程序控制相关的硬件来完成,该计算机程序可以存储于一计算机可读存储介质中,并由处理器进行加载和执行。
为此,本申请实施例提供一种计算机可读存储介质,其中存储有计算机程序,该计算机程序能够被处理器进行加载,以执行本申请实施例所提供的任一种摄像头对焦方法中的步骤。例如,该计算机程序可以执行如下步骤:
通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过电子设备的第二摄像头获取被摄物体的第二预览图像;
根据第一预览图像,确定被摄物体在第一预览图像中所占的第一像素比例,以及根据第二预览图像,确定被摄物体在第二预览图像中所占的第二像素比例;
对第一像素比例和第二像素比例进行映射计算,得到被摄物体相对于第一摄像头和第二摄像头的目标距离;
根据目标距离,驱动第一摄像头和第二摄像头移动至准焦位置。
以上各个操作的具体实施方式以及对应的有益效果可参见前面的实施例,在此不再赘述。
其中,该计算机可读存储介质可以包括:只读存储器(ROM,Read Only Memory)、随机存取记忆体(RAM,Random Access Memory)、磁盘或光盘等。
由于该计算机可读存储介质中所存储的计算机程序,可以执行本申请实施例所提供的任一种摄像头对焦方法中的步骤,因此,可以实现本申请实施例所提供的任一种摄像头对焦方法所能实现的有益效果,详见前面的实施例,在此不再赘述。
其中,根据本申请的一个方面,提供了一种计算机程序产品或计算机程序,该计算机程序产品或计算机程序包括计算机指令,该计算机指令存储在计算机可读存储介质中。计算机设备的处理器从计算机可读存储介质读取该计算机指令,处理器执行该计算机指令,使得该计算机设备执行上述摄像头对焦方法。
以上对本申请实施例所提供的一种摄像头对焦方法、装置、电子设备和计算机可读存储介质进行了详细介绍,本文中应用了具体个例对本申请的原理及实施方式进行了阐述,以上实施例的说明只是用于帮助理解本申请的方法及其核心思想;同时,对于本领域的技术人员,依据本申请的思想,在具体实施方式及应用范围上均会有改变之处,综上所述,本说明书内 容不应理解为对本申请的限制。

Claims (20)

  1. 一种摄像头对焦方法,其中,包括:
    通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过所述电子设备的第二摄像头获取所述被摄物体的第二预览图像;
    根据所述第一预览图像,确定所述被摄物体在所述第一预览图像中所占的第一像素比例,以及根据所述第二预览图像,确定所述被摄物体在所述第二预览图像中所占的第二像素比例;
    对所述第一像素比例和所述第二像素比例进行映射计算,得到所述被摄物体相对于所述第一摄像头和所述第二摄像头的目标距离;
    根据所述目标距离,驱动所述第一摄像头和所述第二摄像头移动至准焦位置。
  2. 根据权利要求1所述的摄像头对焦方法,其中,所述对所述第一像素比例和所述第二像素比例进行映射计算,得到所述被摄物体相对于所述第一摄像头和所述第二摄像头的目标距离,包括:
    获取所述第一摄像头的第一视场角、所述第二摄像头的第二视场角、以及所述第一摄像头和所述第二摄像头之间的第一距离;
    根据所述第一视场角、所述第二视场角以及所述第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系,所述第一像素比例表示所述第一像素比例参数的取值,所述第二像素比例表示所述第二像素比例参数的取值;
    将所述第一像素比例和所述第二像素比例代入所述预设映射关系中进行映射计算,得到所述距离参数对应的距离;
    将所述距离参数对应的距离作为所述被摄物体相对于所述第一摄像头和第二摄像头的目标距离。
  3. 根据权利要求2所述的摄像头对焦方法,其中,所述根据所述第一视场角、所述第二视场以及所述第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系,包括:
    对所述第一视场角进行角度映射,得到第一角度值,以及对所述第二视场角进行所述角度映射,得到第二角度值;
    根据所述第一角度值、所述第二角度值以及所述第一距离,确定所述距离参数、边缘参数以及所述第一像素比例参数之间的第一预设映射关系,所述边缘参数表征物体与所述第二摄像头的边缘之间的距离;
    根据所述第二角度值,确定所述边缘参数、所述距离参数以及所述第二像素比例参数之间的第二预设映射关系;
    根据所述第一预设映射关系和所述第二预设映射关系,确定所述第一像素比例参数、所述第二像素比例参数以及所述距离参数之间的预设映射关系。
  4. 根据权利要求2所述的摄像头对焦方法,其中,所述根据所述第一视场角、所述第二视场以及所述第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数之间的预设映射关系,包括:
    根据所述第一视场角、所述第二视场以及所述第一距离,确定第一像素比例参数、第二像素比例参数以及距离参数针对各种视场角类型的预设映射关系;
    所述将所述第一像素比例和所述第二像素比例代入所述预设映射关系中进行映射计算,得到所述距离参数对应的距离,包括:
    确定所述第一像素比例或所述第二像素比例的目标视场角类型;
    将所述第一像素比例和所述第二像素比例代入与所述目标视场角类型匹配的视场角类型对应的预设映射关系中进行映射计算,得到所述距离参数对应的距离。
  5. 根据权利要求4所述的摄像头对焦方法,其中,所述第一像素比例包括多种视场角类型的第一像素比例,所述第二像素比例包括多种视场角类型的第二像素比例;
    所述将所述第一像素比例和所述第二像素比例代入与所述目标视场角类型匹配的视场角类型对应的预设映射关系中进行映射计算,得到所述距离参数对应的距离,包括:
    将所述目标视场角类型的第一像素比例和第二像素比例代入与所述目标视场角类型匹配的视场角类型对应的预设映射关系中进行映射计算,得到所述距离参数针对各种视场角类型对应的第一目标距离;
    根据各个所述第一目标距离,确定所述距离参数对应的距离。
  6. 根据权利要求5所述的摄像头对焦方法,其中,所述根据各个所述第一目标距离,确定所述距离参数对应的距离,包括:
    确定各个所述第一目标距离的平均值;
    将所述平均值作为所述距离参数对应的距离。
  7. 根据权利要求1所述的摄像头对焦方法,其中,所述对所述第一像素比例和所述第二像素比例进行映射计算,得到所述被摄物体相对于所述第一摄像头和所述第二摄像头的目标距离,包括:
    获取预设映射列表,所述预设映射列表关联存储了第一标定像素比例、第二标定像素比例以及标定距离;
    将所述第一像素比例和所述第二像素比例与所述预设映射表中的第一标定像素比例和第二标定像素比例进行匹配;
    将与所述第一像素比例匹配的第一标定像素比例,且与所述第二像素比例匹配的第二标定像素比例对应的标定距离,作为所述被摄物体相对于所述第一摄像头和所述第二摄像头的目标距离。
  8. 根据权利要求1所述的摄像头对焦方法,其中,所述对所述第一像素比例和所述第二像素比例进行映射计算,得到所述被摄物体相对于所述第一摄像头和所述第二摄像头的目标距离,包括:
    将所述第一像素比例和所述第二像素比例代入拟合曲线中进行拟合,得到拟合距离,所述拟合曲线为第一标定像素比例、第二标定像素比例以及标定距离之间的曲线;
    将所述拟合距离作为所述被摄物体相对于所述第一摄像头和所述第二摄像头的目标距离。
  9. 根据权利要求8所述的摄像头对焦方法,其中,所述拟合曲线包括不同的标定偏转角度的拟合曲线;
    所述将所述第一像素比例和所述第二像素比例代入拟合曲线中进行拟合,得到拟合距离,包括:
    获取所述被摄物体在所述第一预览图像中对应的物体区域,相对于所述第一预览图像上参考位置的偏转角度;
    从多个所述标定偏转角度的拟合曲线中,筛选出与所述偏转角度匹配的标定偏转角度对应的拟合曲线,得到第一目标拟合曲线;
    将所述第一像素比例和所述第二像素比例代入所述第一目标拟合曲线中进行拟合,得到拟合距离。
  10. 根据权利要求9所述的摄像头对焦方法,其中,所述获取所述被摄物体在所述第一预览图像中对应的物体区域,相对于所述第一预览图像上参考位置的偏转角度,包括:
    确定所述被摄物体在所述第一预览图像中对应的物体区域的同一边缘上两个参考像素;
    确定两个所述参考像素到所述参考位置的距离;
    若两个所述参考像素到所述参考位置的距离相同,则所述被摄物体在所述第一预览图像 中对应的物体区域,相对于所述第一预览图像上参考位置的偏转角度为零;
    若两个所述参考像素到所述参考位置的距离不相同,则根据两个所述参考像素到所述参考位置的距离,确定所述被摄物体在所述第一预览图像中对应的物体区域,相对于所述第一预览图像上参考位置的偏转角度。
  11. 根据权利要求8所述的摄像头对焦方法,其中,所述拟合曲线包括不同的标定对焦区域的拟合曲线;
    所述将所述第一像素比例和所述第二像素比例代入拟合曲线中进行拟合,得到拟合距离,包括:
    确定所述第一预览图像中的对焦区域;
    从多个所述标定对焦区域的拟合曲线中,筛选出与所述对焦区域匹配的标定对焦区域的拟合曲线,得到第二目标拟合曲线;
    将所述第一像素比例和所述第二像素比例代入所述第二目标拟合曲线中进行拟合,得到拟合距离。
  12. 根据权利要求8所述的摄像头对焦方法,其中,所述拟合曲线包括不同的标定视场角类型的拟合曲线;
    将所述第一像素比例和所述第二像素比例代入拟合曲线中进行拟合,得到拟合距离,包括:
    确定所述第一像素比例或所述第二像素比例的目标视场角类型;
    从多个所述标定视场角类型的拟合曲线中筛选出与所述目标视场角类型匹配的标定视场角类型对应的拟合曲线,得到第三目标拟合曲线;
    将所述第一像素比例和所述第二像素比例代入所述第三目标拟合曲线中进行拟合,得到拟合距离。
  13. 根据权利要求8-12任一项所述的摄像头对焦方法,其中,在所述将所述第一像素比例和所述第二像素比例代入拟合曲线中进行拟合,得到拟合距离之前,还包括:
    获取多个所述标定距离;
    通过所述第一摄像头获取位于每个所述标定距离所在位置上的测试物体的第一标定图像,以及通过所述第二摄像头获取位于每个所述标定距离所在位置上的测试物体的第二标定图像;
    根据所述第一标定图像,确定所述测试物体在所述第一标定图像中所占的第一标定像素比例,以及根据所述第二标定图像,确定所述测试物体在所述第二标定图像中所占的第二标定像素比例;
    根据每个标定距离对应的所述第一标定像素比例、所述第二标定像素比例以及每个所述标定距离进行拟合,得到所述第一标定像素比例、所述第二标定像素比例以及所述标定距离之间的拟合曲线。
  14. 根据权利要求13所述的摄像头对焦方法,其中,所述根据所述第一标定图像,确定所述测试物体在所述第一摄像头的视场角中所占的第一标定像素比例,以及根据所述第二标定图像,确定所述测试物体在所述第二摄像头的视场角中所占的第二标定像素比例,包括:
    对所述第一标定图像进行划分,得到所述第一标定图像对应的各个第一标定对焦区域,以及对所述第二标定图像进行划分,得到所述第二标定图像对应的各个第二标定对焦区域;
    根据每个所述第一标定对焦区域,确定所述测试物体在每个所述第一标定对焦区域中所占的第一标定像素比例,以及根据每个所述第二标定对焦区域,确定所述测试物体在每个所述第二标定对焦区域中所占的第二标定像素比例;
    所述根据每个标定距离对应的所述第一标定像素比例、第二标定像素比例以及每个所述标定距离进行拟合,得到所述第一标定像素比例、所述第二标定像素比例以及所述标定距离 之间的拟合曲线,包括:
    根据每个所述第一标定对焦区域在各个所述标定距离上对应的第一标定像素比例,每个所述第二标定对焦区域在各个所述标定距离上对应的第二标定像素比例以及各个所述标定距离进行拟合,得到针对每个所述第一标定对焦区域和所述第二标定对焦区域,所述第一标定像素比例、所述第二标定像素比例以及所述标定距离之间的拟合曲线。
  15. 根据权利要求13所述的摄像头对焦方法,其中,所述第一标定图像包括所述测试物体针对每个所述标定距离的不同的标定偏转角度的第一标定图像,所述第二标定图像包括所述测试物体针对每个所述标定距离的不同的标定偏转角度的第二标定图像;
    所述根据所述第一标定图像,确定所述测试物体在所述第一标定图像中所占的第一标定像素比例,以及根据所述第二标定图像,确定所述测试物体在所述第二标定图像中所占的第二标定像素比例,包括:
    根据所述标定偏转角度的第一标定图像,确定所述测试物体针对所述标定偏转角度在所述第一标定图像中所占的第一标定像素比例,以及根据所述标定偏转角度的第二标定图像,确定所述测试物体针对所述标定偏转角度在所述第二标定图像中所占的第二标定像素比例;
    所述根据每个标定距离对应的所述第一标定像素比例、所述第二标定像素比例以及每个所述标定距离进行拟合,得到所述第一标定像素比例、所述第二标定像素比例以及所述标定距离之间的拟合曲线,包括:
    根据每个所述标定偏转角度在各个所述标定距离上对应的第一标定像素比例,每个所述标定偏转角度在各个所述标定距离上对应的第二标定像素比例以及各个所述标定距离进行拟合,得到针对每个所述标定偏转角度,所述第一标定像素比例、所述第二标定像素比例以及所述标定距离之间的拟合曲线。
  16. 根据权利要求13所述的摄像头对焦方法,其中,所述根据所述第一标定图像,确定所述测试物体在所述第一标定图像中所占的第一标定像素比例,以及根据所述第二标定图像,确定所述测试物体在所述第二标定图像中所占的第二标定像素比例,包括:
    根据所述第一标定图像,确定所述测试物体针对不同标定视场角类型,在所述第一标定图像中所占的第一标定像素比例,以及根据所述第二标定图像,确定所述测试物体针对不同标定视场角类型,在所述第二标定图像中所占的第二标定像素比例;
    所述根据每个标定距离对应的所述第一标定像素比例、所述第二标定像素比例以及每个所述标定距离进行拟合,得到所述第一标定像素比例、所述第二标定像素比例以及所述标定距离之间的拟合曲线,包括:
    根据每种标定视场角类型在各个标定距离对应的所述第一标定像素比例、第二标定像素比例以及各个所述标定距离进行拟合,得到针对每种所述标定视场角类型,所述第一标定像素比例、所述第二标定像素比例以及所述标定距离之间的拟合曲线。
  17. 根据权利要求1所述的摄像头对焦方法,其中,所述通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过所述电子设备的第二摄像头获取所述被摄物体的第二预览图像,包括:
    获取拍摄启动指令;
    根据所述拍摄启动指令,将电子设备的第一摄像头移动至预设初始位置,以及将所述电子设备的第二摄像头移动至所述预设初始位置;
    通过位于所述预设初始位置的第一摄像头获取被摄物体的第一预览图像,以及通过位于所述预设初始位置的第二摄像头获取所述被摄物体的第二预览图像。
  18. 一种摄像头对焦装置,其中,包括:
    获取模块,用于通过电子设备的第一摄像头获取被摄物体的第一预览图像,以及通过所述电子设备的第二摄像头获取所述被摄物体的第二预览图像;
    确定模块,用于根据所述第一预览图像,确定所述被摄物体在所述第一预览图像中所占的第一像素比例,以及根据所述第二预览图像,确定所述被摄物体在所述第二预览图像中所占的第二像素比例;
    映射模块,用于对所述第一像素比例和所述第二像素比例进行映射计算,得到所述被摄物体相对于所述第一摄像头和所述第二摄像头的目标距离;
    驱动模块,用于根据所述目标距离,驱动所述第一摄像头和所述第二摄像头移动至准焦位置。
  19. 一种电子设备,其中,包括处理器和存储器,所述存储器存储有计算机程序,所述处理器用于运行所述存储器内的计算机程序,以执行权利要求1至17任一项所述的摄像头对焦方法。
  20. 一种计算机可读存储介质,其中,所述计算机可读存储介质存储有计算机程序,所述计算机程序适于处理器进行加载,以执行权利要求1至17任一项所述的摄像头对焦方法。
PCT/CN2023/087491 2022-07-06 2023-04-11 摄像头对焦方法、装置、电子设备和计算机可读存储介质 WO2024007654A1 (zh)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
CN202210798330.4 2022-07-06
CN202210798330.4A CN117156269A (zh) 2022-07-06 2022-07-06 摄像头对焦方法、装置、电子设备和计算机可读存储介质

Publications (1)

Publication Number Publication Date
WO2024007654A1 true WO2024007654A1 (zh) 2024-01-11

Family

ID=88901388

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2023/087491 WO2024007654A1 (zh) 2022-07-06 2023-04-11 摄像头对焦方法、装置、电子设备和计算机可读存储介质

Country Status (2)

Country Link
CN (1) CN117156269A (zh)
WO (1) WO2024007654A1 (zh)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105578026A (zh) * 2015-07-10 2016-05-11 宇龙计算机通信科技(深圳)有限公司 一种拍摄方法及用户终端
CN107071243A (zh) * 2017-03-09 2017-08-18 成都西纬科技有限公司 相机对焦校准系统及对焦校准方法
CN108200335A (zh) * 2017-12-28 2018-06-22 深圳市金立通信设备有限公司 基于双摄像头的拍照方法、终端及计算机可读存储介质
KR20180103657A (ko) * 2017-03-10 2018-09-19 삼성전자주식회사 스테레오 카메라 모듈의 캘리브레이션 방법 및 장치, 및 컴퓨터 판독 가능한 저장매체
CN108931191A (zh) * 2018-06-01 2018-12-04 普联技术有限公司 测距方法、尺寸测量方法及终端


Also Published As

Publication number Publication date
CN117156269A (zh) 2023-12-01

Similar Documents

Publication Publication Date Title
WO2021196548A1 (zh) 距离确定方法、装置及系统
US20160292900A1 (en) Image group processing and visualization
CN109544643B (zh) 一种摄像机图像校正方法及装置
CN111325798B (zh) 相机模型纠正方法、装置、ar实现设备及可读存储介质
CN105516597A (zh) 一种全景拍摄处理方法及装置
WO2019045721A1 (en) IMAGE PROCESSING DEVICES USING INTEGRATED DYNAMIC CAMERA MODELS TO SUPPORT RAPID DETERMINATION OF CAMERA INTRINSIC ELEMENTS AND METHODS OF OPERATION THEREOF
WO2019232793A1 (zh) 双摄像头标定方法、电子设备、计算机可读存储介质
WO2022001648A1 (zh) 图像处理方法、装置、设备及介质
CN111340737A (zh) 图像矫正方法、装置和电子系统
CN113793387A (zh) 单目散斑结构光系统的标定方法、装置及终端
CN114640833A (zh) 投影画面调整方法、装置、电子设备和存储介质
WO2023010565A1 (zh) 单目散斑结构光系统的标定方法、装置及终端
WO2024007654A1 (zh) 摄像头对焦方法、装置、电子设备和计算机可读存储介质
CN111818260B (zh) 一种自动聚焦方法和装置及电子设备
CN110689565B (zh) 一种深度图确定的方法、装置及电子设备
CN108780572A (zh) 图像校正的方法及装置
WO2022121686A1 (zh) 投影融合方法、投影融合系统及计算机可读存储介质
CN113989376B (zh) 室内深度信息的获取方法、装置和可读存储介质
CN111669572A (zh) 摄像头模组的检测方法、装置、介质及电子设备
CN114063864A (zh) 图像显示方法、装置、电子设备及计算机可读存储介质
WO2022041119A1 (zh) 三维点云处理方法及装置
WO2023226548A1 (zh) 镜头对焦方法、装置、电子设备和计算机可读存储介质
Wang et al. A calibration method on 3D measurement based on structured-light with single camera
CN117412025A (zh) 亮度测试方法、装置、电子设备及计算机存储介质
CN115471403B (zh) 图像处理方法、装置及存储介质

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23834437

Country of ref document: EP

Kind code of ref document: A1